Episode 21 Pain and Personalized Medicine | The Feelings Lab
Published on May 10, 2022
Join Hume AI CEO Dr. Alan Cowen; Dr. Daniel Barron, Harvard Medical School psychiatrist and Director of the Pain Intervention & Digital Research Program; and Matt Forte as they discuss pain and personalized medicine. Different people express and describe their pain differently, and how these signals are understood can have life-altering implications. We discuss the different kinds of pain: acute vs. chronic, central vs. peripheral, and the enigma of phantom limb pain, as well as how physicians evaluate pain syndromes and their treatment. Can pain be measured objectively? Is there a role for quantitative tools in treating pain? Can AI help us reduce bias in how pain is diagnosed and treated?
We begin with psychiatrist Dr. Daniel Barron and Hume AI CEO Dr. Alan Cowen discussing how culture affects the way people think about, describe, and express their pain.
Psychiatrist Dr. Daniel Barron discusses the need for tools to help patients communicate their pain symptoms to doctors.
Psychiatrist Dr. Daniel Barron discusses how we can zero in on the data that will help physicians measure pain symptoms in a more objective, unbiased, and personalized fashion.
Hume AI CEO Dr. Alan Cowen and psychiatrist Dr. Daniel Barron explain how digital tools that surface quantitative information could help clinicians arrive at more reliable recommendations for patients.