Episode 22 Listener Questions + Emotion Science News | The Feelings Lab
Published on May 31, 2022
Join Hume AI CEO Dr. Alan Cowen and podcast host Matt Forte as they venture through the best pod-listener questions we've received so far this season: a veritable emotion science "mailbag." Can people who understand the emotions of others better interpret emotions conveyed through music? How should we responsibly address the ethics around emotion AI data collection and usage? Is there a healthy level of emotional expressivity conducive to emotional well-being? Are video calls bad for brainstorming? Do lobsters or hermit crabs have feelings? Tune in to hear the answer to these questions and more.
We begin with Dr. Alan Cowen, Hume AI CEO, and Matt Forte discussing recent scientific findings regarding video calls and how they change the way we think and communicate.
Hume AI CEO Dr. Alan Cowen and Matt Forte discuss how scientists grapple with emotional experience in animals, going back to Darwin's observations of "purposeless behaviors" among animals, which he attributed to emotional expression.
Dr. Alan Cowen, CEO of Hume AI, describes The Hume Initiative, a not-for-profit developing concrete guidelines for the use of empathic AI.
Hume AI CEO Dr. Alan Cowen and Matt Forte discuss how expressing emotion may be critical to our mental health and well-being and to healthy relationships.
Does the use of English-language terms to describe emotions contribute to a Western bias in emotion science and empathic AI? Dr. Alan Cowen explains the importance of cross-cultural data.