Episode 21: Pain and Personalized Medicine | The Feelings Lab
Published on May 10, 2022
Join Hume AI CEO Dr. Alan Cowen, Harvard Medical School psychiatrist Dr. Daniel Barron, Director of the Pain Intervention & Digital Research Program, and Matt Forte as they discuss pain and personalized medicine. Different people express and describe their pain differently, and how those signals are interpreted can have life-altering implications. We discuss the different kinds of pain, acute vs. chronic and central vs. peripheral, the enigma of phantom limb pain, and how physicians evaluate pain syndromes and their treatment. Can pain be measured objectively? Is there a role for quantitative tools in treating pain? Can AI help us reduce bias in how pain is diagnosed and treated?
We begin with psychiatrist Dr. Daniel Barron and Hume AI CEO Dr. Alan Cowen discussing how culture affects the way people think about, describe, and express their pain.
Psychiatrist Dr. Daniel Barron discusses the need for tools to help patients communicate their pain symptoms to doctors.
Psychiatrist Dr. Daniel Barron discusses how we can zero in on the data that will help physicians measure pain symptoms in a more unbiased, objective, and personalized fashion.
Hume AI CEO Dr. Alan Cowen and psychiatrist Dr. Daniel Barron explain how digital tools that surface quantitative information could help clinicians arrive at more reliable recommendations for patients.
All this and more can be found in our full episode, available on Apple Podcasts and Spotify.
Subscribe, and tell a friend to subscribe!