Episode 17 Empathy and Augmented Reality | The Feelings Lab
Published on Mar 22, 2022
Join Dr. Alan Cowen, CEO of Hume AI, and Arjun Nagendran, Co-Founder & CTO of Mursion, with Matt Forte in our episode on empathy and augmented reality (AR). We discuss how technology is poised to bring more empathy into the world, bridging physical, cultural, and demographic divides. Can augmented reality reveal and help us combat our prejudices? Can it serve as a liaison between people who communicate differently? Learn how companies like Mursion are already using AI-enhanced AR to coach people on how to be more empathic and communicate better in the workplace.
We begin with Dr. Alan Cowen discussing how we can preserve empathy in a world of digital communication. Text, voice, and even video calls are imperfect conduits of the wide array of expressive signals we rely upon to empathize with others. Using augmented reality, can we augment human empathy?
Arjun Nagendran, CTO of Mursion, details the many flavors of extended reality (XR), from augmented to virtual, and the promise of on-demand reality augmentation to help humans arrive at better decisions and more personal connections.
Dr. Alan Cowen explains that despite varied and individualized ways of displaying emotional expression across cultures, there is an underlying universality in the meaning of most expressions, even subtle ones, that augmented reality could draw upon to enhance the communication of emotion across cultures.
Arjun Nagendran, CTO of Mursion, shares how AI-enabled avatars can provide a realistic training ground for social interaction: a safe space free of judgment that provides immediate feedback and actionable measurement. By blending AI with human-in-the-loop control, we can scale individualized interaction between teachers and learners.