Episode 11: Compassion and Robots | The Feelings Lab
Published on Feb 1, 2022
In the Season 2 premiere of The Feelings Lab, join Hume AI CEO Dr. Alan Cowen and Embodied CEO Dr. Paolo Pirjanian, with host Matt Forte, as they discuss "Compassion and Robots."
What will it take to assuage some people's fear of robots? Can robots empathize? Can they deliver therapies, aid in child development, and give us deeper insight into ourselves? We discuss what it will take to make robots compassionate, and how the future of AI may hinge on this central challenge.
Dr. Paolo Pirjanian, CEO of Embodied, starts us off by noting how curious robots can help humans think through our own questions and even reflect on our feelings.
Next, hear Dr. Alan Cowen, CEO of Hume AI, discuss how giving robots the empathic abilities needed to care for human well-being will help us avoid the outcomes that people fear most.
From R2D2 to Her, Dr. Alan Cowen, CEO of Hume AI, and Dr. Paolo Pirjanian, CEO of Embodied, reflect on what sci-fi has gotten right and wrong about the future of robots.