Emotions for Human-Computer Interaction (HCI): recognizing, generating, interacting
Studying how emotional understanding and expression can support more natural, adaptive, and socially intelligent human-computer interaction. The emphasis is on both recognizing human affect and generating appropriate emotional behavior in agents.
Subdirections:
Emotional reasoning: Study emergent behavioral patterns (e.g., manipulation, bluffing) to develop empathic agents, for example via learnable digital twins aligned through interaction data.
Emotion generation: Design models capable of generating context-appropriate emotional expressions—textual, visual, or multimodal.
Emotion recognition and understanding: Leverage multimodal signals to infer user affect, intentions, and social traits (a minimal recognition sketch follows this list).
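As an illustration of the recognition subdirection, below is a minimal sketch of a late-fusion multimodal emotion classifier in PyTorch. It assumes pre-extracted per-utterance audio and face embeddings; all dimensions, layer sizes, and the emotion label set are illustrative assumptions, not the group's actual models.

```python
import torch
import torch.nn as nn

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # hypothetical label set

class LateFusionEmotionClassifier(nn.Module):
    def __init__(self, audio_dim=128, face_dim=512, hidden=256, n_classes=len(EMOTIONS)):
        super().__init__()
        # Separate encoders project each modality into a shared space.
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        self.face_enc = nn.Sequential(nn.Linear(face_dim, hidden), nn.ReLU())
        # Fusion by concatenation, followed by a classification head.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, audio_emb, face_emb):
        fused = torch.cat([self.audio_enc(audio_emb), self.face_enc(face_emb)], dim=-1)
        return self.head(fused)  # unnormalized class logits

if __name__ == "__main__":
    model = LateFusionEmotionClassifier()
    audio = torch.randn(4, 128)   # batch of 4 audio embeddings
    face = torch.randn(4, 512)    # batch of 4 face embeddings
    logits = model(audio, face)
    print(logits.shape)           # torch.Size([4, 4])
```

The same fusion pattern extends to other targets, such as engagement or personality traits, by swapping the output head.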
Main Tasks:
Develop logic for an empathic agent, starting with a learnable digital twin.
Build or evaluate models for multimodal emotion recognition and engagement prediction.
Investigate generation of affective responses in dialogue systems or embodied avatars (a minimal response-selection sketch follows this list).
Analyze interview data for emotion, performance, and personality cues.
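For the affective response task, here is a minimal template-based sketch of emotion-conditioned response selection for a dialogue agent. The emotion labels, templates, and the select_response helper are illustrative assumptions, not the group's pipeline; in practice the recognized emotion would condition a generative model rather than a fixed template table.

```python
from typing import Dict

RESPONSE_TEMPLATES: Dict[str, str] = {
    "sad": "I'm sorry to hear that. Do you want to talk about what happened?",
    "angry": "That sounds frustrating. Let's see what we can do about it.",
    "happy": "That's great news! Tell me more.",
    "neutral": "I see. Could you tell me a bit more?",
}

def select_response(user_emotion: str) -> str:
    """Pick an emotion-appropriate reply, falling back to a neutral style."""
    return RESPONSE_TEMPLATES.get(user_emotion, RESPONSE_TEMPLATES["neutral"])

if __name__ == "__main__":
    # The recognized emotion would normally come from a recognition model
    # such as the fusion classifier sketched above.
    print(select_response("sad"))
```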
Participants
- Ilya Makarov (Team Lead)
- Mikhail Mozikov (ML Researcher)
Related Publications
- LLM-Based Explicit Models of Opponents for Multi-Agent Games
- Semi-supervised, Fine-grained, Descriptive Emotion Understanding with Emotion-enhanced Personality Recognition
- Assessing Personality Traits and Interview Performance from Asynchronous Video Interviews
- Personality-aware Depression Detection (embeddings only)
- The 3rd MI-GA Challenge
- Facial Expression Recognition with Adaptive Frame Rate based on Multiple Testing Correction
- EmotiEffNets for Facial Processing in Video-Based Valence-Arousal Prediction, Expression Classification and Action Unit Detection