The brain-computer interface generated new faces based on the characteristics people focused on.

In Your Head

A new brain-computer interface system represents an important step forward in interpreting people's neural activity.

After reading the brain waves of people who were instructed to focus on certain faces, an AI algorithm generated new images based on the features those people were concentrating on, according to Psychology Today. While it doesn't mean that computers are going to read your mind any time soon, it is an important development in neuroadaptive technology.

Practice, Practice

The actual research, published earlier this month in the journal Scientific Reports, involved a number of steps to first train and then test the algorithm.

First, participants were shown a series of faces and asked to focus on those that fit a certain descriptor while wearing a device that analyzed their brain waves, according to the paper. That brain data was then used to train the algorithm on which signals corresponded to different characteristics, so that during testing it could generate a new face fitting whatever criterion a participant was thinking about, without the participant explicitly communicating it.
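In rough terms, the loop works like this: classify each brain response as "this face matches my criterion" or not, then steer a face generator toward the examples the brain flagged as matches. The sketch below is a toy illustration of that idea, not the researchers' actual pipeline; the EEG features, relevance labels, and face-generator latent vectors are all simulated stand-ins, and the generator itself is a hypothetical placeholder.

```python
# Toy sketch of the brain-in-the-loop idea described above (not the authors' code).
# Assumptions: EEG responses are preprocessed into fixed-length feature vectors,
# and each shown face has a latent vector from a pretrained face generator.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_train, n_test, n_feat, latent_dim = 200, 60, 32, 512

# --- Training: learn which brain responses signal "this face fits my criterion" ---
X_train = rng.normal(size=(n_train, n_feat))      # simulated EEG feature vectors
y_train = rng.integers(0, 2, size=n_train)        # 1 = face matched the instructed criterion
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

# --- Testing: infer relevance from brain activity alone, no explicit input from the user ---
X_test = rng.normal(size=(n_test, n_feat))        # responses to newly shown faces
latents = rng.normal(size=(n_test, latent_dim))   # latent vector of each shown face
relevance = clf.predict_proba(X_test)[:, 1]       # probability each face matched the criterion
relevant = relevance > 0.5

# Average the latents of the faces the brain "flagged" as relevant, then decode that
# mean vector into a brand-new face fitting the criterion.
mean_latent = latents[relevant].mean(axis=0)
# new_face = face_generator.decode(mean_latent)   # hypothetical pretrained face generator
```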

Cyborg Creations

The individual characteristics were fairly general, like "young," "smile," and "male," so the system wasn't interpreting a complete facial image from participants but rather picking up on one specific characteristic and then generating a new face that happened to fit the bill.

It's a fascinating development in brain-computer interfaces, as it means that machines can now generate brand-new information — in the form of a realistic human face — based on a simple human thought.

READ MORE: New Brain-Computer Interface Transforms Thoughts to Images [Psychology Today]

More on BCI: China Unveils First Chip Designed Specifically for Mind-Reading
