Introducing Hume’s Empathic Voice Interface (EVI), the first conversational AI with emotional intelligence. Try it here: demo.hume.ai

EVI understands the user’s tone of voice, which adds meaning to every word, and uses it to guide its own language and speech. Developers can use this API as a voice interface for any application.

✨ EVI has a number of unique empathic capabilities ✨
1. Responds with human-like tones of voice based on your expressions.
2. Reacts to your expressions with language that addresses your needs and maximizes satisfaction.
3. Knows when to speak, because it uses your tone of voice for state-of-the-art end-of-turn detection.
4. Stops when interrupted, but can always pick up where it left off.
5. Learns to make you happy by applying your reactions to self-improve over time.

EVI also includes fast, reliable transcription and text-to-speech, and can hook into any LLM.

EVI will be publicly available in April. If you’re a developer interested in early access to the API, fill out this form: https://lnkd.in/gCADKxfH

If you’re interested in working on EVI and aligning AI with human well-being, we’re hiring: https://lnkd.in/gaDnibAc
Hume AI
Research Services
Empathic AI research lab building multimodal AI with emotional intelligence. Experience our API at demo.hume.ai
- Website: http://hume.ai
- Industry: Research Services
- Company size: 11-50 employees
- Headquarters: New York
- Type: Privately Held
- Founded: 2021
Locations
- Primary: New York, US
Updates
EVI, the frontier voice AI with emotional intelligence, is now a lot smarter, and it's available as an iOS app! 📲

Our new app is part of the next phase of our journey to optimize AI for human well-being. Featuring a bold new and improved AI voice named Kora 💁‍♀️ and integrating Claude 3.5 Sonnet into its responses, EVI is ready to listen, answer, and explore → https://apple.co/3zaFVkV

Created by leading emotion scientists and AI researchers at Hume AI, EVI is the first AI with emotional intelligence. It speaks like a human and understands you better than any chatbot 🗣️

Our new voice Kora is also available to developers through the EVI API. Start building at http://beta.hume.ai

#voiceai #empathicai #emotionallyintelligentai #sonnet #anthropic
The EVI API now supports tool use with Anthropic models! Developers can execute functions during chats with the Claude 3 models:
🌸 Haiku
🖋 Sonnet
💎 Opus

Try creating an action-capable voice AI today!

View our tool use docs: https://lnkd.in/eCriU-JB
Create an EVI in our playground: https://lnkd.in/emAedv8P
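As a rough illustration of what this enables, here is a minimal sketch of the application side of a tool-enabled chat. The tool definition mirrors the general shape of an Anthropic-style tool (name, description, JSON Schema parameters); the tool name `get_current_weather` and the exact field layout are assumptions for illustration, not Hume's actual schema — see the tool use docs for the real format.

```python
import json

# Illustrative tool definition: name, description, and JSON Schema
# parameters, in the general style of Anthropic tool specs. The
# field names here are assumptions, not Hume's exact schema.
get_weather_tool = {
    "name": "get_current_weather",  # hypothetical tool name
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

def handle_tool_call(name: str, arguments: str) -> str:
    """Dispatch a model-issued tool call to local code.

    `arguments` arrives as a JSON string of the parameters the model
    chose; the returned JSON string is what the voice agent would
    speak about next.
    """
    args = json.loads(arguments)
    if name == "get_current_weather":
        # A real handler would call a weather API here; this stub
        # returns a fixed value for illustration.
        return json.dumps({"city": args["city"], "temp_f": 72})
    raise ValueError(f"unknown tool: {name}")

print(handle_tool_call("get_current_weather", '{"city": "New York"}'))
```

In a real integration, the tool definition is registered with the EVI config and `handle_tool_call` runs whenever the model decides mid-conversation that the function is needed.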
It’s now easier than ever to build your own voice AI web app. Customize EVI and deploy it in one click with our Vercel Next.js template! → https://lnkd.in/ecXUiV72
Our CEO Alan Cowen on why emotionally intelligent voice interfaces are the key to AI that understands our needs. Listen to the full conversation from Vercel Ship here: https://lnkd.in/ex_GqTpy
Our EVI API now supports chat resumability and persistence! This allows developers to:
⏯️ Pick up chats right from where users left off
📚 Maintain context across sessions
💡 Build personalized voice AI that remembers past conversations

Check it out in our docs: https://lnkd.in/eQccaWjS
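To make the idea concrete, here is a minimal sketch of the persistence side of resumability: store the identifier of each user's last chat so the next connection can pass it back and resume. The name `chat_group_id` is an assumption used for illustration; consult the EVI docs for the actual parameter name and resume flow.

```python
import sqlite3

# Illustrative persistence layer for chat resumability. We assume
# each chat exposes some identifier (called `chat_group_id` here,
# a placeholder name) that can be supplied on reconnect to resume.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE sessions (user_id TEXT PRIMARY KEY, chat_group_id TEXT)"
)

def remember_chat(user_id: str, chat_group_id: str) -> None:
    """Record the chat to resume for this user (upsert on repeat)."""
    db.execute(
        "INSERT INTO sessions VALUES (?, ?) "
        "ON CONFLICT(user_id) DO UPDATE SET chat_group_id = excluded.chat_group_id",
        (user_id, chat_group_id),
    )

def chat_to_resume(user_id: str):
    """Return the saved chat id, or None to start a fresh chat."""
    row = db.execute(
        "SELECT chat_group_id FROM sessions WHERE user_id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else None

remember_chat("alice", "chat-123")
print(chat_to_resume("alice"))  # prints chat-123
print(chat_to_resume("bob"))    # prints None
```

On each new session the app would check `chat_to_resume(user_id)` and, if an id exists, hand it to the EVI connection so the conversation picks up with its prior context.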
EVI can now handle inbound calls 📲 Start building: https://lnkd.in/eygz7pp9
EVI just became the only voice API capable of native web search. To celebrate, we're launching Chatter, the first interactive AI podcast. Try it here: https://chatter.hume.ai/

Web search is just one built-in tool in EVI’s toolkit. Create your own ultra-capable empathic voice AI today: beta.hume.ai/playground
Our empathic voice interface can take any application to the next level. Experience EVI's impact through a demo from one of our customers!
A conversation with AI about neuroscience, mental health, Greek philosophy, and politics. Unscripted, unedited, uncut. Powered by Hume AI and Anthropic in Thumos Care. Coming to you soon!
Thumos Care Demo - Neuroscience and Philosophy
The EVI API now supports tool use through function calling! Developers can specify custom functions that EVI invokes automatically during the conversation. For example, EVI can update databases, schedule appointments, check the weather, or take actions while it's talking. See our docs to start building today: https://lnkd.in/eCriU-JB
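The pattern described above — the model names a function and supplies JSON arguments, and the application executes it mid-conversation — can be sketched with a small dispatch registry. Everything here (the decorator, the function names, the argument shapes) is an illustrative assumption about the application side, not part of the EVI API itself.

```python
import json
from typing import Callable, Dict

# Illustrative registry of local functions a voice agent could invoke
# by name. The function names below are examples, not EVI built-ins.
REGISTRY: Dict[str, Callable[..., dict]] = {}

def tool(fn: Callable[..., dict]) -> Callable[..., dict]:
    """Register a function so it can be dispatched by name."""
    REGISTRY[fn.__name__] = fn
    return fn

@tool
def schedule_appointment(date: str, topic: str) -> dict:
    # A real handler would write to a calendar service here.
    return {"status": "booked", "date": date, "topic": topic}

@tool
def check_weather(city: str) -> dict:
    # Stub result; a real handler would call a weather API.
    return {"city": city, "forecast": "sunny"}

def dispatch(name: str, arguments_json: str) -> str:
    """Run a model-issued tool call and return its result as JSON."""
    result = REGISTRY[name](**json.loads(arguments_json))
    return json.dumps(result)

print(dispatch("schedule_appointment", '{"date": "2024-07-01", "topic": "demo"}'))
```

The returned JSON string is what the application would hand back to the model, which then folds the result into its spoken response.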