Emotional intelligence for any application
Measure expression along with language. Optimize for happiness.





Trusted By
Measure emotional expression with unmatched precision
One API, four modalities, hundreds of dimensions of emotional expression.
Voice models
Speech Prosody

Image & video models
Our models are built on 10+ years of research, millions of proprietary data points, and over 40 publications in leading journals.
AI models of emotional expression
Traditional theories posited six discrete emotions, but we’ve discovered that emotional behavior is better explained by a high-dimensional, continuous space.
Research highlights
A high-dimension map of expressive behavior
Semantic space theory
We’ve introduced datasets and machine learning methods to explore the dimensions of meaning underlying the feelings we report. We find nuanced patterns of expression in the face, voice, and beyond, replacing low-dimensional theories with a high-dimensional, data-driven understanding of human emotional expression. These discoveries provide a new foundation for emotion science and empathic AI.
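The contrast between a discrete six-emotion taxonomy and a high-dimensional, continuous space can be sketched numerically. The dimension names and scores below are invented for illustration and are not outputs of any Hume model:

```python
# Illustrative only: these dimension names and scores are made up for the
# sketch, not produced by a Hume model.
expression = {
    "amusement": 0.62,
    "awe": 0.11,
    "embarrassment": 0.34,
    "joy": 0.47,
    "sadness": 0.05,
    # ...a real high-dimensional model would score dozens more dimensions
}

# A discrete-category view collapses the profile to one winning label...
discrete_label = max(expression, key=expression.get)

# ...whereas the continuous view preserves the full graded profile,
# including co-occurring expressions above a chosen threshold.
blended = {k: v for k, v in expression.items() if v >= 0.3}

print(discrete_label)   # "amusement"
print(sorted(blended))  # ['amusement', 'embarrassment', 'joy']
```

The point of the sketch: real expressions are rarely one thing at a time, and a continuous representation keeps the blend that a single label throws away.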

28 nuanced facial expressions
The face
We’ve collected millions of natural facial expressions worldwide. Our findings reveal that facial expressions are more complex and nuanced than previously assumed. Our facial expression models replace time-consuming approaches to behavioral measurement with automated analysis.

Interpreting the value of laughs, cries, and sighs
The voice
The voice is even richer in nonverbal cues than the face. We pioneered decoding subtle qualities in speech prosody: intonation, timbre, and rhythm. We go beyond language by understanding how something is said, not just what is said. We also study “vocal bursts,” including sighs, gasps, grunts, laughs, shrieks, oohs, and ahhs. This work unlocks a parallel channel of information in the voice that was previously overlooked by AI.

Publications
Discover the research foundational to our products

Developer Platform
Create your Hume account, get your API keys, monitor your usage, and explore our products in the interactive platform.
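A typical integration submits media to a batch measurement job and picks which models to run. The sketch below only assembles a request body; the endpoint URL, field names, and model keys are assumptions for illustration — consult the developer platform documentation for the actual interface:

```python
# Hypothetical sketch: the endpoint, payload shape, and model names below are
# assumptions, not a verified description of Hume's API.
import json

API_URL = "https://api.hume.ai/v0/batch/jobs"  # assumed endpoint


def build_measurement_job(media_urls, models=("prosody", "face")):
    """Assemble a batch-measurement request body for a list of media URLs.

    An empty config dict per model stands in for "use model defaults".
    """
    return {
        "urls": list(media_urls),
        "models": {name: {} for name in models},
    }


job = build_measurement_job(["https://example.com/interview.mp4"])
print(json.dumps(job, indent=2))  # body you would POST with your API key
```

In practice you would POST this body to the jobs endpoint with your API key in a header, then poll for or be notified of the completed predictions.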

Are emotional expressions universal?
Do people around the world express themselves in the same way? Does a smile mean the same thing worldwide? And how about a chuckle, a sigh, or a grimace? These questions about the cross-cultural universality of expressions are among the more important and long-standing in behavioral sciences like psychology and anthropology—and central to the study of emotion.
What is emotion science?
How can artificial intelligence achieve the level of emotional intelligence required to understand what makes us happy? As AI becomes increasingly integrated into our daily lives, the need for AI to understand emotional behaviors and what they signal about our intentions and preferences has never been more critical.
From cutting-edge research to proven applications
Supported use cases
Health & Wellness

Social Networks

Education & Coaching

AI Research

Creative Tools

Financial Analysis

Empathic AI companions
How Dot uses Hume’s API for emotional intelligence

Media analytics
How Hume's API helps boost audience growth

Interactive EdTech
How Hume's API enables empathic language toys


Preventative healthcare
How Hume's EVI bridges the gap between appointments
