Emotions for Human-Computer Interaction (HCI): recognizing, generating, interacting

Active

Studying how emotional understanding and expression can support more natural, adaptive, and socially intelligent human-computer interaction. The emphasis is on both recognizing human affect and generating appropriate emotional behavior in agents.

Subdirections:

  1. Emotional reasoning: Study emerging behavioral patterns (manipulation, bluffing, etc.) to develop empathic agents, e.g. via learnable digital twins aligned through interaction data (a minimal sketch follows this list).

  2. Emotion generation: Design models capable of generating context-appropriate emotional expressions, whether textual, visual, or multimodal.

  3. Emotion recognition and understanding: Leverage multimodal signals to infer user affect, intentions, and social traits.
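
To make subdirection 1 concrete, here is a minimal, self-contained Python sketch of fitting a digital twin to logged interaction data via behavioral cloning. The data shapes, action space, and training setup are illustrative assumptions for exposition, not the project's actual alignment method.

    # Illustrative sketch: a "digital twin" policy fit to logged user
    # choices by behavioral cloning (assumed approach, for exposition).
    import torch
    import torch.nn as nn

    # Hypothetical logged data: interaction states and the user's choice
    # among three options (e.g. cooperate / bluff / fold in a game setting).
    obs = torch.randn(256, 16)
    actions = torch.randint(0, 3, (256,))

    twin = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 3))
    opt = torch.optim.Adam(twin.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(50):  # fit the twin to imitate the logged choices
        opt.zero_grad()
        loss = loss_fn(twin(obs), actions)
        loss.backward()
        opt.step()

    # The trained twin can stand in for the user, e.g. to probe how an
    # empathic agent's strategy shifts the twin's predicted behavior.
    print(twin(torch.randn(1, 16)).softmax(dim=-1))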

Main Tasks:

  1. Develop logic for an empathic agent, starting with a learnable digital twin.

  2. Build or evaluate models for multimodal emotion recognition and engagement prediction (see the sketch after this list).

  3. Investigate the generation of affective responses in dialogue systems or avatars.

  4. Analyze interview data for emotion, performance, and personality cues.
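
For task 2, the following PyTorch sketch shows one common baseline, late fusion of per-modality features. The feature dimensions, label set, and architecture are illustrative assumptions rather than the project's actual models.

    # Late-fusion multimodal emotion classifier (illustrative sketch).
    # Assumes precomputed per-utterance feature vectors per modality.
    import torch
    import torch.nn as nn

    EMOTIONS = ["neutral", "happy", "sad", "angry"]  # hypothetical labels

    class LateFusionClassifier(nn.Module):
        def __init__(self, audio_dim=128, text_dim=768, hidden=256):
            super().__init__()
            # One small encoder per modality; representations are fused
            # by concatenation before the classification head.
            self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
            self.text_enc = nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
            self.head = nn.Linear(2 * hidden, len(EMOTIONS))

        def forward(self, audio_feats, text_feats):
            fused = torch.cat([self.audio_enc(audio_feats),
                               self.text_enc(text_feats)], dim=-1)
            return self.head(fused)  # unnormalized emotion logits

    # Random stand-ins for pooled prosodic and sentence-embedding features.
    model = LateFusionClassifier()
    logits = model(torch.randn(4, 128), torch.randn(4, 768))
    print(logits.softmax(dim=-1).shape)  # torch.Size([4, 4])

The same interface extends to engagement prediction by swapping the emotion labels for engagement levels.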

Participants

  • Ilya Makarov

    Team lead

  • Mikhail Mozikov

    ML Researcher

Related Publications

HL-EAI: A Multimodal Framework Enabling Emotional Reciprocity in Human-AI Strategic Decision-Making

Proceedings of the 33rd ACM International Conference on Multimedia (2025)

Mikhail Mozikov, Ilya Makarov
