Episode 23 Well-being | The Feelings Lab
Published on Jun 7, 2022
Can AI teach itself to improve our well-being? Join Dr. Alan Cowen, CEO of Hume AI; Dr. Dacher Keltner, professor of psychology at the University of California, Berkeley and founding director of the Greater Good Science Center; and podcast host Matt Forte as they discuss how the future of technology hinges on the measurement of human well-being.
We begin with Dr. Dacher Keltner discussing the overall effects of new technologies on the well-being of Gen Z.
Dr. Alan Cowen, Hume AI CEO, discusses how technology companies are not simply seeking to maximize engagement at all costs, and points to upcoming developments that take human well-being into account.
Dr. Alan Cowen, Hume AI CEO, elaborates on how well-being is the ultimate key to the ethical deployment of empathic AI.
Dr. Alan Cowen, Hume AI CEO, describes how AI technologies can incorporate self-report and objective indicators of user well-being.
All this and more can be found in our full episode, available on Apple and Spotify.
Subscribe, and tell a friend to subscribe!