(A Sensory Substitution Device in action – Photo by Sasson Tiram)

If you grew up in the early nineties you probably remember the one-hit wonder “Ponderous” by 2NU, a spoken-word odyssey describing a dream about a girl who could talk with her eyes. “And she’d say, ‘Can you SEE what I’m saying?’”

Being able to see through your ears may be no dream, though, according to recent research from the Hebrew University of Jerusalem: in the future, you may be able to HEAR what she’s seeing.

A team led by Professor Amir Amedi has been researching the brain’s ability to translate audio input into sensory data processed by regions of the brain previously considered dedicated to visual input. Amedi has found that, in fact, the area where data is processed has more to do with the type of information itself and less to do with how it was sensed.

This work has led to new techniques that might allow visually impaired people to “see” using digital images translated into soundscapes by computers.

None of this is merely theoretical, however, nor is it restricted to the blind. In fact, there is already an app available for iPhone and Android called EyeMusic, which uses the Hebrew University research to translate images from the phone’s built-in camera into soundscapes played through headphones.

(Can you hear the red one? – Photo by Wapster, CC-licensed, some rights reserved)

Although there is no particular magic in translating a digital image into arbitrary sound patterns (it’s all ones and zeros in the code), the real legerdemain happens in the brain itself. Amedi’s team observed congenitally blind test subjects using Sensory Substitution Devices (SSDs) to “see” numeric symbols. Despite their brains never having had the opportunity to develop visual recognition of letters and numerals, the “Visual Number Form Area” in the right inferior temporal gyrus was activated, distinguishing between words and numbers.

The finding suggests that the development of brain areas previously considered to be sensorily linked might, in fact, have more to do with the contextual processing of data than with the senses themselves.

In some ways, this is not a particularly surprising discovery. It’s long been known, for example, that Braille readers still use the “visual” areas of the brain even though the input comes from their sense of touch rather than sight. And previous work by Amedi has determined that SSD renditions of images of other people are processed in an area known for handling visual perceptions of the human body and body parts.

The discovery does have implications for neuroscience and evolutionary theory, providing new insights into how areas of the brain take on specializations and connect with one another. Reading skills, for example, might have less to do with visual symbol processing and more to do with language processing, a finding that could affect both brain science and learning theory.

The exciting possibility of sound-based “vision,” however, is that it can occur in real time using cheap, common devices for input and processing. With nothing more than a pair of headphones and a common smartphone, EyeMusic can allow users to recognize facial expressions, determine colors, and observe body postures. According to PsyPost, blind users of audio SSD programs can, with intensive training, even read using the soundscape mode; National Geographic reports that the program takes around 70 hours to master. Visuals are rendered as musical notes, with the scene “playing” from left to right, higher notes indicating higher objects and different instruments representing different colors.
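To get a feel for how such a mapping might work, here is a minimal Python sketch of a left-to-right sonification in the same spirit. It is an illustration of the general idea, not EyeMusic’s actual algorithm: it simplifies things by using a grayscale image and plain sine tones rather than different instruments for different colors.

```python
import numpy as np
import wave

def image_to_soundscape(img, sample_rate=22050, col_duration=0.15,
                        f_min=220.0, f_max=1760.0):
    """Sonify a 2-D grayscale image (values in 0..1), scanning left to right.

    A simplified sketch of visual-to-auditory sensory substitution, not
    EyeMusic's actual algorithm: each column of pixels becomes a short
    chord, higher rows map to higher pitches, brighter pixels play louder.
    """
    n_rows, n_cols = img.shape
    samples_per_col = int(sample_rate * col_duration)
    t = np.arange(samples_per_col) / sample_rate

    # Row 0 is the top of the image, so it gets the highest frequency.
    freqs = np.geomspace(f_max, f_min, n_rows)

    audio = np.zeros(n_cols * samples_per_col)
    for c in range(n_cols):
        col = img[:, c]
        # One sine wave per pixel in the column, weighted by brightness.
        chord = (col[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        audio[c * samples_per_col:(c + 1) * samples_per_col] = chord

    # Normalize to the 16-bit PCM range.
    audio /= max(np.abs(audio).max(), 1e-9)
    return (audio * 32767).astype(np.int16)

if __name__ == "__main__":
    # A toy "scene": a bright diagonal line on a dark background.
    img = np.zeros((32, 64))
    for c in range(64):
        img[31 - c * 32 // 64, c] = 1.0

    pcm = image_to_soundscape(img)
    with wave.open("soundscape.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)   # 16-bit samples
        f.setframerate(22050)
        f.writeframes(pcm.tobytes())
```

Running the sketch writes a short WAV file in which the diagonal line comes out as a steadily rising sweep, exactly the kind of consistent cue the brain can learn to interpret.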

Although the resulting cacophony may bear little superficial resemblance to a tableau, it’s enough to simply present consistent cues to the brain: the processing centers train themselves to connect the information into a coherent picture, just as they do with input from our eyes.

EyeMusic is free to download. Few users who are not visually impaired are likely to put in the 70 hours required to master the program, but plug in a pair of headphones, hold your phone up to the window, and close your eyes: see the world through your ears for a few minutes. You’re not dreaming; it’s science.


By FQTQ Contributor Scott Wilson

