Despite the common preconception that emotionally intelligent computers are still far off in the future, computers can already augment — and in some cases even replace — emotional intelligence (EQ). In fact, while we may think of emotional intelligence as a uniquely human province, computers are proving that this isn't necessarily the case.

One of the primary reasons humans sometimes fail to read others' emotions accurately, readily believe lies, or miss social cues is that we are focused on our own emotions. Computers, on the other hand, have no emotional entanglements of their own, which gives them an advantage. They can focus entirely on the task of perceiving emotions in human subjects.

As University College London professor of business psychology Tomas Chamorro-Premuzic explains:

Robots do not need to be able to feel in order to act in an emotionally intelligent manner. In fact, contrary to what people think, even in humans high EQ is associated with lower rather than higher emotionality. [High EQ] is about controlling one’s impulses and inhibiting strong emotions in order to act rationally and minimize emotional interference.

Affective computing teaches computers to observe and interpret emotions in humans through body posture, facial features, gestures, physical states, and speech patterns. The next step is interpretation, refined through trial and error, which makes it an ideal task for machine learning and AI.
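For a rough sense of what that learning step can look like, here is a minimal sketch in Python: a classifier is trained to map pre-extracted signals (facial features, voice characteristics, posture) to emotion labels. The feature names, labels, and data below are hypothetical placeholders, not the pipeline of any particular affective-computing product.

```python
# Minimal sketch of the affective-computing loop described above: a model
# learns to map observable signals (face, voice, posture) to emotion labels.
# All features, labels, and data here are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical pre-extracted signals per observation:
# [brow_raise, mouth_curvature, speech_rate, pitch_variance, posture_openness]
rng = np.random.default_rng(0)
X = rng.random((500, 5))            # stand-in for real sensor-derived features
y = rng.integers(0, 3, size=500)    # 0 = neutral, 1 = stressed, 2 = happy

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# "Trial and error" here is supervised learning: fit on labeled examples,
# then check how well the model generalizes to unseen ones.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In a real system, the random placeholder matrix would be replaced with features extracted from cameras, microphones, or wearable sensors, and the labels would come from annotated recordings.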

Researchers are learning to replicate human emotions in robots for a variety of applications. One example? Consider Wall Street stock traders, who have to make split-second decisions with millions of dollars of other people's money. It's a high-pressure environment, and the emotional health of employees isn't typically optimal. This can lead to life-changing errors in judgment. Now, large businesses like Bank of America and JPMorgan Chase are partnering with tech companies to monitor the emotional health of traders in hopes of preventing serious mistakes, improving performance, and ensuring compliance.


Bloomberg reports that several banks are collaborating with Humanyze, a startup that produces badges with sensors and microphones that monitor activity, speech, and stress patterns in real time and transmit relevant data to management. Managers can then take prompt action, reinforcing good behavior and assisting traders who may need intervention to avoid mistakes.

Robots and Relationships

As it turns out, emotional robots also excel in customer service. Sony has announced plans to create customer service robots that will develop emotional bonds with customers. SoftBank's Pepper — billed as an emotional polyglot and interactive humanoid — has serious customer service potential as well. And apps like Cogito use AI to guide human agents toward more emotionally intelligent interactions with customers, a clever reverse-application of the technology.

There's also interpersonal potential that's quite a bit more personal: according to The Future of Sex Report, technology will change how we experience the human body — sexual intercourse included — and fully functional, emotionally responsive sex robots will be one of the most life-changing developments. But what could that mean for human relationships with machines, and their place within our society? Not to mention our relationships with each other? It raises the question: why have a human significant other when you could have the perfect robot partner, one with the emotional intelligence to meet your needs and no needs of its own?

In the meantime, more immediate uses for such robots will likely be geared toward therapeutic applications, possibly assisting humans who experience sexual dysfunction. Although it might be tempting to fear the worst of a future in which machines can detect our emotions, the reality is that emotional intelligence is already helping computing systems augment and surpass human capabilities.
