Does Not Compute

Microsoft Scientist: Emotion-Reading AI Is Doomed To Fail

by Dan Robitzski

Why do AI engineers keep reinventing garbage science?

Affective AI

Artificial Intelligence developers have an uncanny knack for reinventing bunk pseudoscience. Whether it’s resuscitating phrenology as facial recognition that can supposedly determine someone’s personality or claiming to universally detect emotions based on appearance, the AI field has a long history of claiming to do the impossible.

The problem is that building an algorithm to detect someone’s emotions ignores cultural differences and other crucial context, argues Kate Crawford, a researcher at Microsoft and the University of Southern California’s Annenberg School, in The Atlantic. In an excerpt adapted from her book, “Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence,” Crawford lays out the complicated and flawed history of scientists trying to tie emotions to specific facial movements — and why AI algorithms attempting the same feat are essentially doomed to fail.

Try, Try Again

Scientists have tried for decades to codify the facial expressions linked to different emotions, Crawford wrote, and it has never worked. Researchers repeatedly attempted to create idealized images of people making expressions corresponding to specific emotions, but photo matches that made sense in one culture fell apart in others — making the effort to train an algorithm on the same premise an exercise in futility.

Take, for instance, the Transportation Security Administration’s post-9/11 screening program, SPOT, which Crawford wrote was meant to automatically spot terrorists by pinpointing travelers showing signs of stress, fear, or deception. Despite the roughly $900 million spent on the program, there’s no evidence it ever worked.

“This is the danger of automating emotion recognition. These tools can take us back to the phrenological past, when spurious claims were used to support existing systems of power,” Crawford wrote. “Emotions are complicated, and they develop and change in relation to our cultures and histories — all the manifold contexts that live outside the AI frame.”

READ MORE: Artificial Intelligence Is Misreading Human Emotion [The Atlantic]

More on emotion AI: New AI Detects Your Emotions By Scanning You With Radio Signals
