Joseph Paul Cohen is a Ph.D. candidate at the University of Massachusetts Boston, and he has created a remarkable smartphone app called BlindTool. On his site, Cohen states that he hopes the application will “demonstrate the recent advances in computer vision to the public and provide a sense of vision to the blind.”
Users can wave their phone around until it vibrates, signaling that the app has recognized an object. Not only does the app tell the user what it sees, it also vibrates based on its level of confidence in that identification.
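The article does not describe exactly how BlindTool translates confidence into vibration, but the idea can be sketched in a few lines. The threshold and duration values below are assumptions for illustration, not the app's actual parameters:

```python
# Hypothetical sketch: map a classifier's confidence (0.0-1.0) to a
# vibration duration. BlindTool's real mapping is not published; the
# 0.3 threshold and 400 ms ceiling here are invented for illustration.

def vibration_ms(confidence, threshold=0.3, max_ms=400):
    """Return a vibration duration in milliseconds that scales with
    confidence, or 0 when the classifier is too unsure to report."""
    if confidence < threshold:
        return 0
    return int(max_ms * confidence)

print(vibration_ms(0.9))   # confident match -> long buzz (360 ms)
print(vibration_ms(0.1))   # weak match -> stay silent (0 ms)
```

On Android, a duration like this would typically be passed to the platform's vibrator service; the point is simply that stronger matches produce stronger feedback.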
The app can recognize 1,000 object categories drawn from ImageNet, an online image database widely used to train computer-vision systems.
The convolutional neural network (CNN) inside the app is a type of artificial neural network whose layered design is loosely inspired by the biological visual system, and it is the standard architecture for image recognition tasks like this one.
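To see how a classifier like this turns raw network outputs into a single label with a confidence score, here is a minimal sketch. A real CNN would emit one raw score (logit) per ImageNet class; a softmax converts those scores into probabilities. The three labels and their scores below are made up for illustration, standing in for the full 1,000 classes:

```python
import math

# Illustrative sketch (not BlindTool's code): convert per-class raw
# scores into probabilities with softmax, then pick the best label.

def softmax(logits):
    m = max(logits)                               # subtract max for
    exps = [math.exp(x - m) for x in logits]      # numerical stability
    total = sum(exps)
    return [e / total for e in exps]

labels = ["coffee mug", "banana", "keyboard"]     # stand-ins for 1,000 classes
logits = [4.0, 1.0, 0.5]                          # invented raw scores

probs = softmax(logits)
best = max(range(len(labels)), key=lambda i: probs[i])
print(labels[best], round(probs[best], 2))        # top label and confidence
```

The single probability attached to the winning label is exactly the kind of confidence value an app could feed into vibration feedback.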
You can download BlindTool here. Currently, it is only available for Android, and notably, the app is free. “I’m releasing this tool for the blind free because I hope this app will help people live a better life,” Cohen added in his release.
BlindTool is not the only app in this space. Be My Eyes takes a different approach, connecting blind users with sighted volunteers. Once a sighted volunteer accepts a request, a live audio-video connection is set up between the two, and the volunteer describes whatever the blind user points the phone's rear-facing camera at.
Although Be My Eyes has already benefited users, it doesn’t allow for the independence that BlindTool offers. However, some users may feel more comfortable relying on a person rather than on their phone alone.
UPDATE: A previous version of this article incorrectly provided a biography for Jake Cohen. This error has been corrected.