A new algorithm can see right through you — and me, too. Researchers have designed a computer system that detects how much pain patients are actually in. The system rates pain levels based on facial expressions, giving physicians better information for treatment decisions. It works by analyzing facial microexpressions and then calibrating itself to the patient in question, offering a degree of objectivity that humans cannot easily match.
Discerning an accurate pain level matters because many patients experiencing serious pain need help — while others, struggling with addiction, illicitly seek out painkillers, which are highly addictive and potentially dangerous. If doctors have a reliable way to determine which patients truly need the medications, they can worry less about the knotty problem of determining who is being honest about their pain.
In an attempt to make pain ratings more objective, the researchers trained an algorithm, which they called “DeepFaceLIFT,” by giving it videos of people with shoulder pain wincing and grimacing after making different movements. The people in the videos then rated their own pain levels. The results were published in the Journal of Machine Learning Research.
DeepFaceLIFT eventually learned to use the subtle differences in facial expressions to estimate pain levels, with movement around the nose and mouth revealing the most information about pain. While artificial intelligence has been used before to analyze expressions of pain, this could be the first system that can give personalized results based on a person’s age, sex, and skin complexion. The researchers determined that factoring in these individual characteristics yielded more accurate results than those attained from a one-size-fits-all system.
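The idea behind that personalization step can be illustrated with a toy sketch. The code below is not the researchers' actual model — it is a minimal, hypothetical two-stage setup: a shared regressor fit on everyone's (synthetic) facial-expression features, followed by a small per-person calibration, standing in for the individual characteristics the real system accounts for. The data, feature construction, and calibration rule are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_person(offset, n=40):
    """Simulate one person's data: a scalar 'expression feature'
    that tracks true pain, shifted by an idiosyncratic offset
    (people express the same pain differently)."""
    true_pain = rng.uniform(0, 10, n)
    features = true_pain + offset + rng.normal(0, 0.5, n)
    return features.reshape(-1, 1), true_pain

def fit_linear(X, y):
    """Least-squares fit of pain ~ feature + intercept."""
    X1 = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return w

def predict(w, X):
    X1 = np.hstack([X, np.ones((len(X), 1))])
    return X1 @ w

# Two people who express identical pain levels differently.
Xa, ya = make_person(offset=+2.0)
Xb, yb = make_person(offset=-2.0)

# One-size-fits-all model: a single fit pooled across both people.
w_shared = fit_linear(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Personalization: correct the shared model's systematic bias for
# person A using a handful of that person's self-reported scores.
def personal_bias(w, X_cal, y_cal):
    return np.mean(y_cal - predict(w, X_cal))

bias_a = personal_bias(w_shared, Xa[:5], ya[:5])

err_shared = np.mean(np.abs(predict(w_shared, Xa) - ya))
err_personal = np.mean(np.abs(predict(w_shared, Xa) + bias_a - ya))
print(err_personal < err_shared)
```

Because the pooled model averages over people with different baselines, its predictions are systematically off for each individual; even a crude per-person bias correction narrows the gap, which is the same intuition behind calibrating the real system to each patient.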
Although this algorithm may have some useful applications, it cannot yet replace the judgment of human doctors, for several reasons. For example, it was trained on images captured under ideal photography and lighting conditions, so it might be less accurate in real-world settings.
Additionally, the task is inherently difficult because people experience and exhibit pain differently. How pain is interpreted varies from person to person, and its expression also depends on culture and on how long the sufferer has been enduring it. It’s therefore no surprise that self-reported pain scores often differ from doctors’ estimates.
Despite these challenges, the researchers will continue to train the program in the hope that it will get even better at spotting pain.