Reading Sheep

One of today's more popular artificially intelligent (AI) androids comes from the TV series "Marvel's Agents of S.H.I.E.L.D." Those of you who followed the latest season's story — no spoilers here! — probably love or hate AIDA by now. One of the most interesting things about this fictional AI character is that it can read people's emotions. Thanks to researchers from the University of Cambridge, this AI ability might soon make the jump from sci-fi to reality.

The first step in creating such a system is training an algorithm on simpler facial expressions and just one specific emotion or feeling. To that end, the Cambridge team focused on using a machine learning algorithm to figure out if a sheep is in pain, and this week, they presented their research at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C.

Image credit: Robinson, et al./University of Cambridge

The system they developed, the Sheep Pain Facial Expression Scale (SPFES), was trained on a dataset of 500 sheep photographs to learn how to identify five distinct features of a sheep's face when the animal is in pain. The algorithm then ranks the features on a scale of 1 to 10 to determine the severity of the pain. Early tests showed that the SPFES could estimate pain levels with 80 percent accuracy.
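The researchers haven't published implementation details here, but the description suggests a familiar supervised-learning setup: summarize each photo as a handful of facial-feature scores, then train a model to map those scores to an overall pain level. The sketch below illustrates that idea with scikit-learn and stand-in data; the feature representation, model choice, and labels are all assumptions for illustration, not the Cambridge team's actual pipeline.

```python
# Hypothetical SPFES-style sketch: map five facial-feature descriptors per
# photo to a 1-10 pain score. Stand-in data only; a real system would extract
# features (ear posture, eye narrowing, nostril shape, etc.) from images.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 500 "photos", each reduced to five feature descriptors of 4 numbers apiece.
X = rng.normal(size=(500, 5 * 4))
# Stand-in labels: an overall pain score on a 1-10 scale for each photo.
y = rng.integers(1, 11, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Predict a pain score for an unseen photo's features and clamp it to 1-10.
pred = np.clip(model.predict(X_test[:1]), 1, 10)
print(f"Estimated pain level: {pred[0]:.1f} / 10")
```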

Humane and Human

SPFES was a departure for Peter Robinson, the Cambridge professor leading the research, as he typically focuses on systems designed to read human facial expressions. “There’s been much more study over the years with people,” Robinson explained in a press release. “But a lot of the earlier work on the faces of animals was actually done by Darwin, who argued that all humans and many animals show emotion through remarkably similar behaviors, so we thought there would likely be crossover between animals and our work in human faces.”


As co-author Marwa Mahmoud explained, “The interesting part is that you can see a clear analogy between these actions in the sheep’s faces and similar facial actions in humans when they are in pain – there is a similarity in terms of the muscles in their faces and in our faces.”

Next, the team hopes to teach SPFES how to read sheep facial expressions from moving images, as well as train the system to work when a sheep isn't looking directly at a camera. Even in its current form, though, the algorithm could improve the quality of life of livestock like sheep by facilitating the early detection of painful conditions that require quick treatment, earning it a place on the growing list of practical and humane applications for AI.

Additional developments could lead to systems that are able to accurately recognize and react to human emotions, further blurring the line between natural and artificial intelligences.
