We map the full spectrum of emotion with AI
Our research aligns foundation models with human well-being
40+
publications
3000+
citations
1 million+
participants
AI models of emotional expression
Traditional theories posited six discrete emotions, but we’ve discovered that emotional behavior is better explained by a high-dimensional, continuous space.
Research highlights
A high-dimensional map of expressive behavior
Semantic space theory
We’ve introduced datasets and machine learning methods to explore the dimensions of meaning underlying the feelings we report. We find nuanced patterns of expression in the face, voice, and beyond, replacing low-dimensional theories with a high-dimensional, data-driven understanding of human emotional expression. These discoveries provide a new foundation for emotion science and empathic AI.
28 nuanced facial expressions
The face
We’ve collected millions of natural facial expressions worldwide. Our findings reveal that facial expressions are more complex and nuanced than previously assumed. Our facial expression models replace time-consuming approaches to behavioral measurement with automated analysis.
Interpreting the meaning of laughs, cries, and sighs
The voice
The voice carries even richer nonverbal cues than the face. We pioneered the decoding of subtle qualities in speech prosody: intonation, timbre, and rhythm. We go beyond language by understanding how something is said, not just what is said. We also study “vocal bursts,” including sighs, gasps, grunts, laughs, shrieks, oohs, and ahhs. This work unlocks a parallel channel of information in the voice that AI had previously overlooked.
Publications
Discover the research foundational to our products