
Startup Wants to Diagnose Disease by Analyzing Emotions in Your Voice

Oct 23rd, 2016 5:12am

Beyond harnessing language to communicate ideas, our voices are a treasure trove of subtle emotional information. One’s words may be quite neutral on their own, but it’s the intonation of the voice, the variation in pitch, that may betray how one truly feels.

Those subtle tones may also reveal clues about one’s health. This seemingly odd notion is what emotional analytics startup Beyond Verbal is betting on with a recent spinoff of its patented “emotion engine” software.

Dubbed the Beyond mHealth Research Platform, the project will collect and analyze human voices for “biomarkers” that may one day help diagnose conditions like depression, heart disease or even neurological diseases like Parkinson’s.

The Israel-based company has been working on the technology for the last two years, with the intention of creating a tool that will someday be used by medical professionals and research institutions to detect “acoustic abnormalities” that may be early indicators of ill health.
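The company has not published how its engine works, but the raw material it describes, intonation and its variation, is straightforward to extract with open source tools. Below is a minimal sketch, not Beyond Verbal’s actual method, that tracks the pitch of a voice recording and summarizes its variability using the Python library librosa; the file name is a placeholder.

```python
# Minimal sketch (not Beyond Verbal's actual method): extract the
# fundamental frequency (pitch) contour of a voice recording and
# summarize how much it varies. "speech.wav" is a placeholder file.
import numpy as np
import librosa

# Load the recording; 16 kHz is a common rate for speech analysis.
y, sr = librosa.load("speech.wav", sr=16000)

# Track pitch frame by frame with the pYIN algorithm, constrained
# to a plausible range for human speech.
f0, voiced_flag, voiced_probs = librosa.pyin(
    y,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz
    fmax=librosa.note_to_hz("C6"),  # ~1047 Hz
    sr=sr,
)

# Keep only the voiced frames and summarize the intonation.
voiced_f0 = f0[voiced_flag]
print(f"mean pitch: {np.nanmean(voiced_f0):.1f} Hz")
print(f"pitch variability (std dev): {np.nanstd(voiced_f0):.1f} Hz")
```

Low-level features like these, aggregated over many samples, are the kind of raw material a statistical model would need before any link between voice and a medical condition could be drawn.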

Compared to conventional methods of disease diagnosis, the company sees voice monitoring as a more “passive, non-invasive and continuous” solution. It also sees an enormous untapped opportunity in the expanding Internet of Things: a myriad of smart devices that could respond intelligently to the emotions embedded in our voices.

But can our voices really give convincing indications about our health? According to VentureBeat, the company is already collaborating with partners like the Scripps Research Institute, Mayo Clinic, and Jerusalem’s Hadassah Medical Center to source voice samples for further research into the links between voice and medical conditions such as post-traumatic stress disorder (PTSD) and neurological disorders.

“One of the things that we’ve been doing is research with the Mayo Clinic on the ability to detect some heart problems just by analyzing the tone of voice,” said Beyond Verbal CEO Yuval Mor. “And when we realized that these types of correlations exist, we’ve decided to [expand] the work we’ve been doing.”

Building upon over two decades of research and more than 2.5 million voice samples in 40 languages, the company’s previous forays into emotion analytics included the Moodies app, which decodes a wide range of human emotions in real time as a person speaks, and a voice-driven emotion analytics API that lets developers integrate the emotion-sensing technology into their own products.
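Beyond Verbal’s API itself isn’t documented in the article, but the developer workflow for a voice-driven emotion analytics API of this kind generally amounts to posting an audio clip over HTTP and reading back emotion scores. In the sketch below, the endpoint, credential, and response fields are hypothetical illustrations, not the company’s real interface.

```python
# Hedged sketch of calling a hypothetical voice emotion analytics API.
# The URL, auth scheme, and response shape are illustrative only.
import requests

API_URL = "https://api.example.com/v1/emotions"  # hypothetical endpoint
API_KEY = "your-api-key"                         # hypothetical credential

with open("speech.wav", "rb") as audio:
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"audio": ("speech.wav", audio, "audio/wav")},
    )
resp.raise_for_status()

# Hypothetical response: a list of detected emotions with confidence scores.
for emotion in resp.json().get("emotions", []):
    print(emotion["name"], emotion["score"])
```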

More Personalized Digital Experiences

While the idea of a machine that knows how you feel in real time may be unsettling to some, many see emotional intelligence in machines as essential to further personalizing our digital experiences, especially if companies continue to explore voice-only interfaces for our smart devices.

Emotional data can be parlayed into more tailored responses from our gadgets, helping users make better-informed choices, whether that’s picking the right restaurant to fit the mood of the moment or heeding a prompt to see a doctor if you’re feeling, and sounding, off. It could even help us more accurately decipher what other people are feeling, potentially improving relationships and overall quality of life.

Besides the health applications, businesses and advertisers could also use the technology to home in even more accurately on what makes consumers tick. Emotion analytics would, of course, be an integral part of building an emotionally intelligent machine, one with emotional as well as cognitive reasoning, which many affective computing experts consider vital for true machine intelligence.

“We believe that emotions understanding can dramatically alter the way we interact with machines and with each other,” Beyond Verbal’s VP of marketing Dan Emodi told The Next Web. “Allowing a machine to understand – what is for every practical purpose – the most important medium in human communications opens up boundless opportunities.”

In addition to enlisting research institutions around the world to help develop the new platform, the company is also creating a disease-detecting emotion analytics API that could be used in smart homes and cars, or integrated into digital assistants like Siri. Ultimately, it’s an intriguing development that may herald a new kind of medicine, one that not only treats physical symptoms but also recognizes the important role emotions play in our health.

Image: Moodies app, Beyond Verbal.
