Episode 20: Empathy and User Research | The Feelings Lab
Published on Apr 27, 2022
Join Dr. Alan Cowen, CEO of Hume, and dscout CEO Michael Winnick with host Matt Forte as they discuss Empathy and User Research. How can we foster empathy with users at the speed and scale needed to drive innovation in AI? How does empathy help researchers design products and product experiences? How can we get “out of the lab” to study the messy reality of user experience in the real world? How can technology strengthen human bonds and connections, not replace them?
We begin with Dr. Alan Cowen clarifying the differences between studying human behavior in academia and conducting user research in the real world.
Dr. Alan Cowen and dscout founder and CEO Michael Winnick discuss screen time: the cognitive overhead of constant interaction with our screens, the reduced sensory experience that results, and the need for a better digital portal.
Dr. Alan Cowen and Michael Winnick discuss their shared goal of using technology to strengthen human bonds and connections, not replace them.
Michael Winnick and Dr. Alan Cowen on the future of user research, and the promise of AI to scale the analysis of traditionally qualitative, unstructured user research data and quantify useful signals from it.