In Brief
Could AI determine what you're looking at right now? A new study suggests it's possible.

Japanese scientists know what you’re looking at — but don’t worry, there’s no need to close your other browser tabs yet. Using an artificial intelligence (AI) system alongside fMRI scans, researchers were able to create an apparently mind-reading AI — “or perhaps at this point just mind skimming,” Umut Güçlü, a researcher at Radboud University in the Netherlands who was not involved in the research, told New Scientist.

The system is similar to AI technologies that have been used successfully to caption images. To do the same for someone’s brain, the AI first needs a scan taken with an fMRI machine while the person is looking at an image. These scans reveal activity in the brain through changes in blood flow.
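As a rough intuition for this kind of decoding, the sketch below matches a new (simulated) brain-activity pattern against stored patterns recorded while viewing known images, and returns the caption of the closest match. Everything here is made up for illustration: the vectors, the captions, and the nearest-neighbour approach are hypothetical stand-ins, not the method used in the study, which feeds brain activity into a neural captioning model.

```python
# Toy sketch of caption decoding (illustrative only; all data is invented).
# Real fMRI patterns are high-dimensional voxel activations, not 4 numbers.
import math

def cosine(a, b):
    """Cosine similarity between two activity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical "prototype" activity patterns recorded while subjects
# viewed images with known captions.
prototypes = {
    "a man is kayaking in a river": [0.9, 0.1, 0.4, 0.2],
    "a group of people standing next to each other": [0.2, 0.8, 0.1, 0.5],
    "a black and white dog": [0.1, 0.3, 0.9, 0.6],
}

def decode_caption(activity):
    # Return the caption whose stored pattern best matches the new scan.
    return max(prototypes, key=lambda c: cosine(activity, prototypes[c]))

# A new simulated scan that most resembles the kayaking pattern.
print(decode_caption([0.85, 0.15, 0.35, 0.25]))
```

The nearest-neighbour matching stands in for the far more complex learned mapping in the actual system, but it conveys the core idea: brain activity is compared against what the model has learned about activity evoked by known images.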

How accurately could a mind-reading AI tell what you’re looking at? Image Credit: geralt / pixabay

The mind-reading AI isn’t always completely correct; in one of the tests, it thought a participant was looking at scissors when they were looking at a clock. Yet even when wrong, it sometimes came tantalizingly close. For example, when one person being scanned was looking at an image of a man kayaking in a river, the AI captioned it: “A man is surfing in the ocean on his surf board.”

In other cases, the AI was spot on, correctly captioning an image of a group of people standing next to each other and another of a black and white dog.

The system presently has its limits. fMRI scans don’t capture all activity in the brain, so there are bounds on how detailed these captions can be. The method also requires a participant to lie in a large machine, making it poorly suited for use anywhere but a medical facility.

While at-home applications might be far off, this type of technology could be used to support the development of brain-computer interfaces (BCIs). Emerging BCI tech uses small electrodes, as opposed to fMRI machines, to monitor brain activity. This research could potentially support these efforts and one day allow humans, with the help of their mind-reading AI, to control computers with only their minds. We’re nowhere near these abilities, but we can almost picture it now — and our AI would probably see it, too.