Determining a person’s emotions based solely on their facial expressions isn’t always easy, nor are the conclusions drawn always accurate. However, new technology coming from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) can measure even the subtlest changes in breathing and heart rhythm, allowing the researchers to detect whether a person is happy, sad, excited, or angry.
CSAIL’s new device, dubbed the EQ-Radio, extracts its data from wireless signals, making it more convenient and efficient than existing methods within the global emotion-detection space, which rely on on-body sensors or facial-recognition technology.
“[EQ-Radio] sends wireless signals that reflect off of a person’s body and back to the device. Its beat-extraction algorithms break the reflections into individual heartbeats and analyze the small variations in heartbeat intervals to determine their levels of arousal and positive affect,” says MIT professor and project lead Dina Katabi, who co-wrote a paper on the topic with PhD students Mingmin Zhao and Fadel Adib.
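The paper's exact features aren't given here, but "small variations in heartbeat intervals" is the idea behind standard heart-rate-variability metrics. As an illustrative sketch (not CSAIL's actual algorithm), RMSSD, the root mean square of successive differences between inter-beat intervals, is one common way to quantify that variation from a list of extracted beat timestamps:

```python
import math

def rmssd(beat_times_s: list[float]) -> float:
    """Illustrative heart-rate-variability proxy: RMSSD (in seconds)
    computed from beat timestamps, e.g. as produced by a
    beat-extraction step. Not EQ-Radio's actual feature set."""
    # Inter-beat intervals: time between consecutive heartbeats.
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    # Successive differences between intervals capture beat-to-beat variation.
    diffs = [y - x for x, y in zip(intervals, intervals[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For beats at 0.0 s, 0.8 s, 1.7 s, and 2.5 s, the intervals are 0.8 s, 0.9 s, 0.8 s, and the RMSSD works out to 0.1 s; higher values indicate more beat-to-beat variation, which is the kind of signal the arousal analysis draws on.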
These measurements are used to determine the emotion. When the signals show low arousal and negative affect, the device registers the emotion as sad. Conversely, high arousal and positive affect are interpreted as excited.
Correlations will, of course, vary depending on the subject, but by understanding how the human heartbeat reacts across various emotional states, EQ-Radio is able to detect primary emotions with 87 percent accuracy.
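The two measurements described above, arousal and positive/negative affect, divide the emotional space into quadrants. A minimal sketch of that mapping (an assumption for illustration, not the published classifier, which learns per-subject correlations) might look like:

```python
def classify_emotion(arousal: float, valence: float) -> str:
    """Map arousal and valence (positive/negative affect) scores onto
    the four primary emotions the article names. Assumes both scores
    are centered so that 0 separates low/high arousal and
    negative/positive affect -- a hypothetical convention, not
    EQ-Radio's actual scale."""
    if valence >= 0:
        # Positive affect: high arousal reads as excited, low as happy.
        return "excited" if arousal >= 0 else "happy"
    # Negative affect: high arousal reads as angry, low as sad.
    return "angry" if arousal >= 0 else "sad"
```

Under this sketch, low arousal with negative affect (e.g. `classify_emotion(-0.7, -0.5)`) yields "sad" and high arousal with positive affect yields "excited", matching the examples in the text.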
“By recovering measurements of the heart valves actually opening and closing at a millisecond time-scale, this system can literally detect if someone’s heart skips a beat,” says Adib.
EQ-Radio reveals how wireless signals can reliably gather information on human behavior that is not immediately apparent, which could have useful applications within the entertainment and consumer-behavior industries, as well as immense potential for use within healthcare and diagnostics.
“Our work shows that wireless signals can capture information about human behavior that is not always visible to the naked eye,” says Katabi. “We believe that our results could pave the way for future technologies that could help monitor and diagnose conditions like depression and anxiety.”