Episode 17 Empathy and Augmented Reality | The Feelings Lab
Published on Mar 22, 2022
Join Dr. Alan Cowen, CEO of Hume AI, and Arjun Nagendran, Co-Founder & CTO of Mursion, with Matt Forte in our episode on empathy and augmented reality (AR). We discuss how technology is poised to bring more empathy into the world, bridging physical, cultural, and demographic divides. Can augmented reality reveal and help us combat our prejudices? Can it serve as a liaison between people who communicate differently? Learn how companies like Mursion are already using AI-enhanced AR to coach people on how to be more empathic and communicate better in the workplace.
We begin with Dr. Alan Cowen discussing how we can preserve empathy in a world of digital communication. Text, voice, and even video calls are imperfect conduits of the wide array of expressive signals we rely upon to empathize with others. Using augmented reality, can we augment human empathy?
Arjun Nagendran, CTO of Mursion, details the many flavors of extended reality (XR), from augmented to virtual, and the promise of on-demand reality augmentation to help humans arrive at better decisions and more personal connections.
Dr. Alan Cowen explains that despite varied and individualized ways of displaying emotional expression across cultures, there is an underlying universality in the meaning of most expressions, even subtle ones, that augmented reality could draw upon to enhance the communication of emotion across cultures.
Arjun Nagendran, CTO of Mursion, shares how AI-enabled avatars can provide a realistic training ground for social interaction: a safe space free of judgment that provides immediate feedback and actionable measurement. By blending AI with human-in-the-loop control, we can scale individualized interaction between teachers and learners.
All this and more can be found in our full episode, available on Apple Podcasts and Spotify.
Subscribe, and tell a friend to subscribe!