Episode 15: Awe and Digital Art | The Feelings Lab
Published on Mar 1, 2022
In a world where you can sketch a character and watch it come to life, how will AI transform the arts? Join Dr. Alan Cowen, CEO of Hume, Richard Kerris, VP of Omniverse Platform Development at NVIDIA, and host Matt Forte as they discuss the present and future of “Awe in Digital Art.” How does art evoke emotion? Can AI help us unlock imaginative universes that allow us to experience the sublime in a newly immersive, personalized, and educational fashion? We discuss how AI tools that give individual artists the creative power of a movie studio will affect the future of art and entertainment, and whether we should grieve the extinction of physical puppets from films like Steven Spielberg's Jurassic Park.
NVIDIA's Richard Kerris describes how AI is poised to play a significant role in helping artists tell their stories, especially as the demand for content increases in our hyperconnected world.
Hume AI CEO Dr. Alan Cowen explains how art conveys nuanced emotions at every level of form and style, and how AI can help artists to push emotive boundaries and extend the imagination.
Dr. Alan Cowen and Richard Kerris discuss how AI is increasing our capacity to turn imagination into reality, turning language and sketches into immersive experiences of film and art by simulating the physical and social laws that govern our world.
Finally, Dr. Alan Cowen and Richard Kerris share their predictions for how AI will transform art, enabling ground-breaking extensions of reality and human imagination that evoke greater and more frequent experiences of awe and the sublime.
All this and more can be found in our full episode, available on Apple Podcasts and Spotify.
Subscribe, and tell a friend to subscribe!