EVI Web Search Demo: The First Interactive Voice AI Podcast
Published on May 15, 2024
Hume’s Empathic Voice Interface (EVI) is now the first voice AI API capable of native web search.
The first conversational voice AI podcast
To showcase the new ultra-fast web search capability of our Empathic Voice Interface (EVI), we’re introducing Chatter, the first interactive voice AI podcast. Chatter uses real-time web search to provide daily news updates — users can interrupt the conversational AI host to switch topics, or dig deeper into their favorite stories.
Speak with voice AI
Experience an early window into the future of interactive media here: https://chatter.hume.ai/
Imagine what you can build with empathic voice AI and web search:
- Smart shopping assistants: seamlessly search for product reviews, compare prices, and find the best deals, all through voice commands.
- Dynamic educational tools: create interactive learning experiences that use web search to find educational content tailored to each student’s unique needs.
- On-demand travel advisors: develop voice assistants that provide real-time travel tips, from restaurant reviews to local attractions, making it easy to offer users up-to-date recommendations.
Chatter is just one exciting example of what’s possible with web search; the potential for innovative voice AI applications is limitless. Developers can start building today: platform.hume.ai
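As a rough illustration of what building on this might look like, here is a minimal sketch of constructing a session-settings message that turns on a built-in web search tool for an EVI chat session. The field names (`session_settings`, `builtin_tools`, `web_search`) are assumptions for illustration; consult Hume's EVI documentation on platform.hume.ai for the actual schema.

```python
import json

def session_settings_with_web_search():
    """Build a session-settings message enabling built-in web search.

    Hypothetical sketch: the message shape shown here is an assumption,
    not Hume's confirmed API schema.
    """
    return {
        "type": "session_settings",
        "builtin_tools": [{"name": "web_search"}],
    }

# Serialized payload, as it might be sent over the EVI chat WebSocket.
message = json.dumps(session_settings_with_web_search())
```

In a real integration, a message like this would be sent once at the start of a chat session, after which the voice AI could invoke web search on its own when a user asks about current events.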