Hume Powers Projects at UC Berkeley LLM Hackathon
Published on Aug 11, 2023
UC Berkeley hosted the world's largest AI hackathon on June 17-18, with over 1,200 students developing diverse applications using large language models and open-source APIs. Hume's APIs were used alongside LLMs in 57 of the 240 projects, including three of the 12 finalists, spanning everything from healthcare devices to education tools, a spread that demonstrates their wide applicability and versatility.
Read on to learn about some highlighted projects:
mila - Recognize signs of postpartum depression
The team behind mila, winner of the Best Use of Hume recognition, designed a device to better support mothers with postpartum depression (PPD), a healthcare gap that affects four out of five new moms in the United States. Motivated by their own medical and motherhood experiences, the developers emphasized a need for emotional awareness in AI and technology. “There’s a lot of people using [AI] in ways that are fast and fun, but empathy and human connection is very important and meaningful,” one member stated, noting Hume’s mission of elevating human well-being. By implementing Hume’s APIs to translate real-time audio input of daily conversations into expressions associated with certain emotions, mila and its companion app detect early indicators of PPD. The tool connects mothers with healthcare professionals to better pinpoint actions, tailor follow-ups, and support mental well-being.
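To make the idea concrete, here is a minimal sketch of how a tool like mila might flag conversations worth a clinician's attention based on expression scores. The emotion labels, threshold, and score format are illustrative assumptions for this sketch, not mila's actual logic or the exact shape of Hume's API response.

```python
# Illustrative sketch only: the emotion labels, threshold, and score
# format below are hypothetical, not mila's real model or Hume's
# actual API response schema.

# Expressions we (hypothetically) treat as sadness-related signals.
SADNESS_RELATED = {"Sadness", "Distress", "Tiredness"}

def flag_ppd_indicator(emotion_scores: dict, threshold: float = 0.6) -> bool:
    """Return True when the average of sadness-related expression
    scores in a conversation sample exceeds the threshold."""
    relevant = [score for name, score in emotion_scores.items()
                if name in SADNESS_RELATED]
    if not relevant:
        return False
    return sum(relevant) / len(relevant) > threshold

# Example scores, as might be derived from an expression-measurement API.
sample = {"Sadness": 0.7, "Distress": 0.8, "Joy": 0.1, "Tiredness": 0.65}
print(flag_ppd_indicator(sample))  # True: average ~0.72 exceeds 0.6
```

In a real system the scores would arrive continuously from streamed audio, and a flag like this would only trigger a follow-up with a healthcare professional, never a diagnosis.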
“As a multidisciplinary team, passionate about women’s health, our project is driven by the core belief of amplifying the voices of women and supporting their mental well-being during the transformative journey of motherhood.”
The winners, in addition to winning $1,000, enjoyed a conversation with our CEO, Dr. Alan Cowen.
EdGauge - Improve student focus and retention
Educators can take advantage of EdGauge to evaluate student comprehension of academic material through real-time feedback and advice. The hackathon team, inspired by their time as teaching assistants in college, wanted to provide educators with a tool for receiving feedback on student performance. By reading expressions in a classroom and using Hume’s platform to tag those that relate to confusion, boredom, and concentration, EdGauge lets teachers personalize and modify their strategies to maximize learning.
Polysphere - Elevate your music listening experience
Polysphere tracks users’ real-time reactions to songs, providing tailored song recommendations and connecting compatible users. Hume’s facial expression measurement capabilities incorporate the complexity of the listening experience, capturing the range of emotions that music can evoke. The developers hope to improve the quality of content consumed, foster friendships, and revolutionize the discovery and sharing of music.
Violet - Access intelligent, empathic AI therapists
Violet, a voice-enabled AI therapist, engages users in genuine and organic conversations to assess their mental well-being. The team combined Hume’s APIs with OpenAI’s GPT-4 to create an emotionally intelligent and capable virtual counselor. Hume’s technology allows Violet to go beyond mere audio transcription, comprehending users’ facial expressions and speech patterns to identify topics of interest and select effective therapeutic approaches.
All projects can be viewed on the Hackathon DevPost website.