Apparently, the data that trained previous systems was no good.

Waving Hello

A team of scientists thinks they've finally figured out how to get artificial intelligence to recognize the gestures we use in everyday life. There are plenty of speech recognition bots out there, but until AI can also decipher what the hell it is we're doing with our hands, even the most sophisticated system will be missing a crucial aspect of human communication.

The problem, it turns out, was the data being fed into these algorithms. But with better sensors, the Nanyang Technological University team thinks they've got it sorted out.

Power Glove

Sometimes, scientists will use computer vision (read: the tech-y verbiage for training AI on video footage), but it helps to supplement that data with spatial information gathered from people making gestures while wearing motion-capture gloves.

Those devices can be clunky, and it turns out the data they provided was all but useless for capturing the fine motions that AI would have to interpret. So the Nanyang Tech team did away with the heavy hardware and coated their performers' hands with a stretchy, form-fitting sensor instead, according to research published in the journal Nature Electronics.
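To give a rough (and only rough) sense of the idea, here's a toy Python sketch of fusing "camera" features with "stretch sensor" readings into a single gesture classifier. Every shape, feature name, and number here is invented for illustration; none of it comes from the actual Nature Electronics system.

```python
# Illustrative only: a toy "vision + skin sensor" gesture classifier.
# All data here is random noise standing in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_samples, n_gestures = 600, 5
camera_features = rng.normal(size=(n_samples, 32))  # stand-in for image embeddings
strain_signals = rng.normal(size=(n_samples, 10))   # stand-in for stretch-sensor readings
labels = rng.integers(0, n_gestures, size=n_samples)

# The key idea: concatenate both modalities so the model can lean on the
# sensor channel when the camera view alone is ambiguous.
fused = np.hstack([camera_features, strain_signals])

X_train, X_test, y_train, y_test = train_test_split(fused, labels, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"toy accuracy: {clf.score(X_test, y_test):.2f}")  # near chance on random data
```

On real recordings, the sensor channel is what carries the fine finger motion that video alone tends to miss.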

Use The Force

The resulting AI isn't exactly a master conversationalist, and it's way too soon to expect any algorithm to truly comprehend what we say. But the team did show it can at least recognize what gestures mean: researchers were able to guide a robot through a maze using nothing but fine hand gestures.
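For a flavor of how gesture recognition turns into robot steering, here's a hypothetical sketch: once a gesture is classified, driving the robot is basically a lookup from gesture label to command. The gesture names, commands, and `send_command` stub are all made up for illustration, not taken from the study.

```python
# Illustrative only: map recognized gestures to drive commands.
GESTURE_TO_COMMAND = {
    "flat_palm": "stop",
    "point_left": "turn_left",
    "point_right": "turn_right",
    "fist_forward": "forward",
}

def send_command(command: str) -> None:
    """Stand-in for whatever robot API actually receives the command."""
    print(f"robot <- {command}")

def drive(predicted_gestures: list[str]) -> None:
    # Unrecognized gestures default to a safe stop.
    for gesture in predicted_gestures:
        send_command(GESTURE_TO_COMMAND.get(gesture, "stop"))

drive(["fist_forward", "point_left", "fist_forward", "flat_palm"])
```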

It's always cool to learn that AI is starting to catch up, but for the wild-armed Italians among us (hi), it'll still be quite some time before machines understand what it is we're going on about.

READ MORE: Scientists develop artificial intelligence system for high precision recognition of hand gestures [Nanyang Technological University]

More on AI: This System Lets You Fly a Drone With Arm Gestures

