EVI Web Search Demo: The First Interactive Voice AI Podcast
Published on May 15, 2024
Hume’s Empathic Voice Interface (EVI) is now the first voice AI API capable of native web search.
The first conversational voice AI podcast
To showcase the new ultra-fast web search functionality of our Empathic Voice Interface (EVI), we're introducing Chatter, the first interactive voice AI podcast. Chatter uses real-time web search to provide daily news updates; users can interrupt the conversational AI host to switch topics or dig deeper into their favorite stories.
Speak with voice AI
Experience an early window into the future of interactive media here: https://chatter.hume.ai/
Imagine what you can build with empathic voice AI and web search:
- Smart shopping assistants: seamlessly search for product reviews, compare prices, and find the best deals, all through voice commands.
- Dynamic educational tools: create interactive learning experiences that use web search to find educational content tailored to students' unique needs.
- On-demand travel advisors: develop voice assistants that provide real-time travel tips, from restaurant reviews to local attractions, making it easy to offer users up-to-date recommendations.
Chatter is just one exciting example of what's possible with web search; the potential for innovative voice AI applications is limitless. Developers can start building today at platform.hume.ai.
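The use cases above all rest on the same building block: an EVI configuration with web search turned on. As a rough sketch only (the endpoint is omitted, and the field names and the `"web_search"` tool identifier are assumptions for illustration, not the documented Hume API; see platform.hume.ai for the actual schema), assembling such a configuration payload might look like this:

```python
import json


def build_evi_config(name: str) -> dict:
    """Assemble a hypothetical EVI configuration that enables web search.

    The payload shape below is an assumption for illustration; consult the
    official Hume platform documentation for the real configuration schema.
    """
    return {
        "name": name,
        # Built-in tools are assumed to be referenced by name; "web_search"
        # stands in for whatever identifier the platform actually uses.
        "builtin_tools": [{"name": "web_search"}],
    }


# Serialize the config, e.g. to POST it to the platform's config endpoint.
config = build_evi_config("news-assistant")
print(json.dumps(config, indent=2))
```

From there, a client would reference this configuration when opening its voice session, letting the assistant call out to the web mid-conversation.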