Forget OpenAI's ChatGPT, Hume AI's Empathic Voice Interface (EVI) Might Be the Next Big Thing in AI! – Analytics India Magazine
Hume AI has introduced Empathic Voice Interface (EVI), a conversational AI with emotional intelligence. EVI sets itself apart by comprehending the user’s tone of voice and tailoring its responses accordingly, adding depth to every interaction.
Interestingly, it almost feels like you are talking to a human.
EVI has a number of unique empathic capabilities:
1. It responds with human-like tones of voice based on your expressions.
2. It reacts to your expressions with language that addresses your needs and maximises satisfaction.
3. It knows when to speak, because it uses your tone of voice…
EVI is a new AI system that understands and generates expressive speech, trained on millions of human conversations. Developers can now seamlessly integrate EVI into various applications using Hume’s API, offering a unique voice interface experience.
In addition to its empathic features, EVI offers fast, reliable transcription and text-to-speech capabilities, making it versatile and adaptable to various scenarios. It can also be paired with any large language model (LLM), adding to its flexibility and utility.
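To give a feel for what integrating a voice interface like this could involve, here is a minimal sketch of the client-side message handling. Note that the endpoint URL, the `audio_input` message schema, and the response fields below are assumptions made for illustration; the actual EVI API details are only available to developers through Hume’s waitlist.

```python
import base64
import json

# Hypothetical WebSocket endpoint, for illustration only.
EVI_WS_URL = "wss://api.hume.ai/v0/evi/chat"


def build_audio_message(pcm_chunk: bytes) -> str:
    """Wrap a raw audio chunk as a JSON frame (assumed schema)."""
    return json.dumps({
        "type": "audio_input",
        "data": base64.b64encode(pcm_chunk).decode("ascii"),
    })


def parse_response(raw_frame: str) -> str:
    """Pull the assistant's text out of a response frame (assumed schema)."""
    frame = json.loads(raw_frame)
    return frame.get("text", "")
```

A client would open a WebSocket connection to the chat endpoint, stream microphone chunks through `build_audio_message`, and play back the expressive audio carried in the response frames.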
EVI is set to be publicly available in April, offering developers an innovative tool to create immersive and empathetic voice interfaces. Developers eager for early access to the EVI API can express their interest by filling out the form at https://bit.ly/evi-waitlist.
Founded in 2021 by Alan Cowen, a former researcher at Google AI, Hume is a research lab and technology company with a mission to ensure that artificial intelligence is built to serve human goals and emotional well-being.
“We believe voice interfaces will soon be the default way we interact with AI. Speech is four times faster than typing; frees up the eyes and hands; and carries more information in its tune, rhythm, and timbre. That’s why we built the first AI with emotional intelligence to understand the voice beyond words. Based on your voice, it can better predict when to speak, what to say, and how to say it,” wrote Cowen on LinkedIn.
The company raised $50 million in Series B funding from EQT Group, Union Square Ventures, Nat Friedman, Daniel Gross, Northwell Holdings, Comcast Ventures, LG Technology Ventures, and Metaplanet.
OpenAI is currently working on a Voice Engine, according to a user on X. Voice Engine will reportedly include voice and speech recognition, processing of voice commands, and conversion between text and speech. It will also generate voice and audio outputs from natural language prompts, speech, visual prompts, images, and video.
In an episode of Unconfuse Me with Bill Gates, OpenAI chief Sam Altman pointed out that OpenAI is on ‘this long, continuous curve’ to create newer and better models. He highlighted multimodality as the key aspect of GPT-5, enabling it to process video input and generate new videos, and confirmed that work on the model has already begun.
Altman also spoke at length with Gates about how GPT-5 would emphasise customisation and personalisation. “The ability to know about you, your email, your calendar, how you like appointments booked, connected to other outside data sources—all of that. Those will be some of the most important areas of improvement,” said Altman.
Last year, OpenAI launched a voice assistant in the ChatGPT app on Android and iOS, enabling users to engage in back-and-forth conversations. The ChatGPT Voice feature includes diverse voices such as Ember, Sky, Breeze, and Cove.
OpenAI recently partnered with Figure AI to build generative AI-powered humanoids. In a recent video released by Figure, the humanoid robot Figure 01 was seen holding a natural conversation with a human while passing him an apple.
Emotional Intelligence Matters
A conversational AI chatbot that understands emotions is the future. “Chatbots that are polite, and understand sentiment, emotion, etc give rise to better businesses. Chatbots that are closer to human beings, emotional and sentiment, bring commercial profits along, which is quite motivating,” said IIT Bombay professor and computer scientist Pushpak Bhattacharyya in an exclusive interview with AIM.
© Analytics India Magazine Pvt Ltd & AIM Media House LLC 2024