Episode 2 Embarrassment | The Feelings Lab
Published on Oct 4, 2021
In this week's episode of The Feelings Lab, we're discussing embarrassment! Returning hosts Dr. Alan Cowen, Dr. Dacher Keltner, and Matt Forte will be joined by guest host Dr. Jessica Tracy (Director of the Emotion and Self Lab at the University of British Columbia) and special guest, comedian Ali Kolbert (as seen on The Tonight Show).
Begin by hearing Dr. Dacher Keltner describe the concept of emotional contagion and how the feeling of embarrassment helps strengthen our collective identity.
Next, hear about this in practice as guest Ali Kolbert explains how emotions seem to spread across comedy club crowds—and how stepping over the line can cause backlash.
Later in the episode, Dr. Alan Cowen, Hume's Chief Scientist, comments on how the feeling of embarrassment has evolved across time and age. Specifically, social media may make the embarrassment we experience in our everyday lives feel worse than it used to.
Finally, psychologist Dr. Jessica Tracy explains that while animals show recognizable displays of submission, your dog's apparent expression of remorse may not be one of them.
All this and more can be found in our full episode, available on Apple and Spotify.
Subscribe, and tell a friend to subscribe!