Artificial intelligence (AI) is largely about getting a machine to mimic human capabilities: thought, speech, even movement. That's why one classic benchmark for AI is the Turing test: whether a machine can fool a human into believing it is conversing with another human.
An integral part of accomplishing this is making AI recognize human emotions. One research lab is working on the next iteration of virtual assistants: ones that can recognize and react to emotional cues. SRI International, the birthplace of Siri, is developing chatbots and phone assistants that can detect agitation, confusion, and other emotional states, and respond accordingly.
The technology, called SenSay Analytics, is designed to analyze behaviors that can signal emotion, such as typing patterns, speech tone, facial expressions, and body movements, and then tailor the machine's reaction accordingly. For example, a virtual assistant, whether on the phone or face-to-face, would slow down when the customer seems confused, or explain what it is doing. Detecting emotion is already possible with current technology; it is the reaction side that SRI is trying to polish.
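To make the idea concrete, here is a minimal sketch of that detect-then-react loop. The cue names, thresholds, and states are illustrative assumptions for this article, not SenSay Analytics' actual design:

```python
# Hypothetical sketch of an emotion-aware assistant: classify a coarse
# emotional state from simple behavioral cues, then adapt the reply.
# All signal names and thresholds below are invented for illustration.

from dataclasses import dataclass


@dataclass
class BehaviorCues:
    speech_rate_wpm: float   # words per minute in the user's speech
    pause_ratio: float       # fraction of the utterance spent silent
    repeated_queries: int    # times the user re-asked the same question


def classify_state(cues: BehaviorCues) -> str:
    """Map simple behavioral cues to a coarse emotional state."""
    if cues.repeated_queries >= 2 or cues.pause_ratio > 0.4:
        return "confused"
    if cues.speech_rate_wpm > 180:
        return "agitated"
    return "neutral"


def tailor_response(state: str, answer: str) -> str:
    """Adjust the assistant's reply to the detected state."""
    if state == "confused":
        return "Let me slow down and explain step by step. " + answer
    if state == "agitated":
        return "I understand this is frustrating. " + answer
    return answer
```

For a confused user, `tailor_response(classify_state(BehaviorCues(110, 0.5, 2)), "Your order ships Friday.")` prepends the slower, step-by-step framing, while a neutral user gets the answer unchanged.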
Being like Her
Ever watched Her? The movie depicts the logical endgame of this technology: a natural-language assistant that can interact with and anticipate the needs of its user, all without a humanoid body.
Although the film also shows the imperfections of human-like AI, it illustrates the need for adequate language and emotional understanding in the technology. Future chatbots will have to break down the nuances of language and human emotion in order to actually understand what humans are saying.
These developments from SRI could be a piece of that puzzle, interpreting verbal and non-verbal cues in order to gauge and react to emotion.