Drone technology just got a step closer to becoming fully self-navigating: Taking a page out of a bat’s playbook, engineers developed a rig that lets drones chart out their surroundings using echolocation.
Purdue University engineers equipped a drone with a speaker and an array of four microphones and let it loose in a series of rooms, according to research published last month in the SIAM Journal on Applied Algebra and Geometry. And while self-navigating drones are neat, the engineers said in a press release that their work could also lead to better car backup cameras or help people with disabilities navigate.
Bats echolocate by, essentially, screaming into the air. They detect how long the sound waves take to return and whether they were perturbed along the way, which could indicate food or an obstacle nearby.
The drone works in much the same way. By measuring how long it took the sound waves to bounce around and return to each microphone, the drone was able to accurately map out the room’s walls and avoid collisions.
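The core measurement here is simple time-of-flight math: sound travels at a known speed, so the round-trip time of an echo gives the distance to the reflecting wall. As a rough, illustrative sketch (not the researchers' actual code), assuming sound travels at about 343 m/s in room-temperature air:

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate in air at 20 °C

def echo_distance(round_trip_time_s: float) -> float:
    """Distance to a reflecting surface from a round-trip echo time.

    The ping travels out and back, so the one-way distance
    is half the total path length covered by the sound.
    """
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

# An echo returning after 20 milliseconds implies a wall ~3.43 m away.
print(round(echo_distance(0.020), 2))  # -> 3.43
```

Because each of the four microphones sits at a slightly different position, each one hears the same echo at a slightly different time, and those differences are what let the system locate the wall in space rather than just measure a single range.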
By cross-checking each microphone’s readings against the others, the drone was able to successfully rule out false positives — “ghost walls” — as it flew around.
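One way to picture that cross-check: a real wall produces echo distances that all four microphones agree on (once their positions are accounted for), while a ghost wall shows up for only some of them. The sketch below is a hypothetical illustration of that consistency test, not the paper's algorithm; the function name and tolerance are assumptions.

```python
def filter_ghost_walls(candidates_per_mic, tol=0.05):
    """Keep only candidate wall distances that every microphone corroborates.

    candidates_per_mic: one list of candidate distances (meters) per
    microphone, already corrected for each mic's position. A candidate
    from the first microphone survives only if every other microphone
    has a matching candidate within `tol` meters.
    """
    confirmed = []
    for d in candidates_per_mic[0]:
        if all(any(abs(d - other) <= tol for other in mic)
               for mic in candidates_per_mic[1:]):
            confirmed.append(d)
    return confirmed

# Mic 1 sees echoes at 2.0 m and 3.1 m, but only the 2.0 m reading is
# corroborated by the other three mics; 3.1 m is rejected as a ghost.
mics = [[2.00, 3.10], [2.02, 4.50], [1.98], [2.01, 0.90]]
print(filter_ghost_walls(mics))  # -> [2.0]
```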
For now, the four-microphone setup is good for mapping out rooms. One next step would be to scale up the microphone array to detect smaller obstacles. Another would be mapping surroundings without needing to move at all — since a car, unlike a drone, can't fly around to see what's behind it.
READ MORE: With a speaker and four microphones, drones can echolocate like bats [Purdue University]
More on echolocation: A New AI Algorithm Can Track Your Movements Through A Wall