Predict mood better than with language alone

Go beyond transcription with multimodal AI for human understanding in audio and video

Trusted by

Nature Human Behaviour · Science Advances · Nature · Trends in Cognitive Sciences · PNAS

Why Hume

Train and deploy your own expression models to measure any outcome involving human communication

Facial Expression Model

Face + Body

Differentiate 37 kinds of facial movement that are recognized as conveying distinct meanings, and the many ways they are blended together

Speech Prosody Model

Voice

Discover over 25 patterns of tune, rhythm, and timbre that imbue everyday speech with complex, blended meanings

FACS 2.0 Model

Face + Body

An improved, automated Facial Action Coding System (FACS): measure 26 facial action units (AUs) and 29 other features with even less bias than traditional FACS

Vocal Expression Model

Voice

Differentiate 28 kinds of vocal expression recognized as conveying distinct meanings, and the many ways they are blended together

Latest releases

Valence & Arousal Model

Multimodal

Predict perceived valence and arousal in facial expression, vocal bursts, speech, or language

Facial Expression Model

Face + Body

Differentiate 37 kinds of facial movement that are recognized as conveying distinct meanings, and the many ways they are blended together

Vocal Call Types Model

Voice

Explore vocal utterances by inferring probabilities of 67 descriptors, like 'laugh', 'sigh', 'shriek', 'oh', 'ahh', 'mhm', and more
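The card above describes model output as probabilities over 67 vocal descriptors. As a minimal sketch of how such a result might be consumed (the scores below are made-up illustration data, not real model output), ranking the most probable descriptors could look like this:

```python
# Hypothetical sketch: a vocal-call-type result as a mapping from descriptor
# names (e.g. 'laugh', 'sigh') to inferred probabilities. These values are
# invented for illustration only.
scores = {"laugh": 0.62, "sigh": 0.21, "mhm": 0.09, "shriek": 0.03}

def top_descriptors(scores, k=3):
    """Return the k most probable descriptors, highest probability first."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(top_descriptors(scores))  # -> ['laugh', 'sigh', 'mhm']
```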


How it works

Using Hume’s science-backed expression embedding models and state-of-the-art language models

The Science

No one can read minds, so we start by using statistics to ask what expressions mean to the people making them

Our data-driven science represents real emotional behavior with 3x more precision than traditional approaches

We use scientific control and globally diverse data to remove biases that are entrenched in most AI models


Reinventing the science of human expression


Principles & Testimonial

Why developers and product managers love building with Hume

End-to-end infrastructure to make all of your videos searchable

“Hume AI continually amazes me as to what I can find in all parts of video – be it the video itself, the conversation, or text on screen – not only specific results but an understanding of the context of the query. With each new search, I discover new results and possibilities of the technology for our customers.”

Pedro Almeida, CEO

Built for product-defining developers

Build the next generation of multimodal AI-powered products

Continuously updated models based on psychologically valid, peer-reviewed, published research


Our Blog

Latest blog posts

Our latest product updates, developer news, platform tutorials, and how-to guides.

  • Product Updates

    Hume Powers Projects at UC Berkeley LLM Hackathon

    UC Berkeley hosted the world's largest AI hackathon on June 17-18, with over 1200 students developing diverse applications using large language models and open-source APIs. Hume's APIs were used alongside LLMs in 57 of the 240 projects and three of the 12 finalists, ranging from healthcare devices to education tools, indicating its wide applicability and versatility.

  • Product Updates

    Hume AI Raises $12.7M in Series A Funding

    We're pleased to announce that we've raised a $12.7M Series A led by Union Square Ventures! We're excited to be partnering with leading investors who understand the importance of technology that can align itself with human well-being using the same cues humans do: our expressions.

  • Science

Hume AI Publication in Nature Human Behaviour: Deep Learning & Vocal Bursts in Different Cultures

Our first in a series of publications is in Nature Human Behaviour! "Deep learning reveals what vocal bursts express in different cultures" addresses a key question at the intersection of AI and psychology: What does the voice convey without words? And do chuckles, gasps, sighs, etc. have the same meaning across cultures?

  • Product Updates

    Tutorial: Hands-on with Hume AI’s API

    Working with the Hume AI Platform is easy! Built atop rigorous scientific studies of human expressive behavior, it's the only toolkit you need to measure nonverbal cues in audio, video, and images. Get started with our API, and integrate our cutting-edge models into your applications.

  • Podcast

Episode 25: Myths About Emotion Science | The Feelings Lab

    Do expressions “reveal our emotions”? What did Darwin think? Is “emotion AI” really what it sounds like? In this episode, Dacher Keltner helps us unpack popular myths and misconceptions about emotion science, including how recent headlines have departed from peer-reviewed science.

  • Science

    Can AI Teach Itself to Improve Our Well-Being?

    The future of technology hinges on the measurement of well-being, the importance of which can’t be overstated. As emotion scientists work with AI practitioners to translate this knowledge into ML models, we ask: Can AI teach itself to make the world a better, happier place?

  • Product Updates

    Welcome to the Hume AI Blog

Introducing the Hume AI blog, a place to learn about the science of human expression, emerging expressive communication technologies, and our latest product updates, developer news, platform tutorials, and how-to guides.

  • Podcast

Episode 24: Trust & Safety in Online Conversations | The Feelings Lab

    Amid rising concerns over bots, BrandBastion CEO Jenny Wolfram joins us to discuss how AI is also empowering authentic voices, from classifying harmful content to delivering genuine feedback to organizations.

  • Podcast

Episode 24: Compassion & Customer Service | The Feelings Lab

    In a world of products, customer service is essential to our well-being. In this episode, Cogito CEO Josh Feast joins us to discuss how companies are studying nonverbal behavior to build more empathy into customer service.
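The tutorial post above describes getting started with the API to measure nonverbal cues in audio, video, and images. As a hypothetical sketch of what assembling such a request might look like (the endpoint URL, field names, and model names below are assumptions for illustration, not Hume's documented API), the payload could be built like this:

```python
import json

# Hypothetical sketch: the endpoint, field names, and model names are
# placeholders invented for illustration, not Hume's documented API.
API_URL = "https://api.example.com/v0/batch/jobs"  # placeholder endpoint

def build_job_request(media_urls, models=("face", "prosody")):
    """Assemble a JSON payload requesting expression measurements
    for a list of hosted media files."""
    return {
        "urls": list(media_urls),
        # One entry per requested model, each with a default (empty) config.
        "models": {name: {} for name in models},
    }

payload = build_job_request(["https://example.com/interview.mp4"])
print(json.dumps(payload, indent=2))
```

In practice this payload would be POSTed to the API with an authentication key and the returned job polled for results; those details are omitted here since they depend on the actual API.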

