An estimated 1.3 billion people across the globe live with some form of vision impairment, and of those, 36 million are blind.
Now, Google has released an artificial intelligence-powered app designed to serve as a helpful pair of “eyes” for those people, providing them with a level of independence they may have previously lacked — and showing the world yet another way AI can help people with disabilities.
Google named the app Lookout, and the way it works is simple: open the app on your phone, and listen while Lookout audibly describes whatever the phone’s camera is pointed at.
The app features three modes designed for specific situations.
Explore mode is useful for navigating a new setting. Google suggests in a blog post that people with severe visual impairments might choose to wear their phones around their necks, perhaps on a lanyard or in a shirt pocket, so that Explore can provide them with constant updates on their surroundings as they navigate the world.
Shopping mode is useful for scanning barcodes or reading currency — a person with a visual impairment might use this mode if they aren’t sure whether the bill they’re holding is worth $10 or $20.
Quick Read mode, meanwhile, is pretty self-explanatory. A person points their phone at text — anything from a sign in a grocery store to a piece of mail — and Lookout reads it to them.
While Google points out in the blog post that the app “will not always be 100 percent perfect,” Lookout has the potential to dramatically improve the quality of life for people with visual impairments.
That’ll be especially true if Google expands Lookout to include more languages, locations, and devices. The app currently speaks only English and is available only to U.S. owners of Pixel devices, though the company says in the blog post that it plans to broaden support soon.
READ MORE: With Lookout, discover your surroundings with the help of AI [Google]