The EyeRing system, developed at the MIT Media Lab’s Fluid Interfaces Group, is a chunky finger-worn device with a mounted camera that can aid the blind by providing audio descriptions of what is in front of them. For example, in the EyeRing demonstration video, a man who is shopping commands the ring to detect the color of a shirt he is holding. The image is sent through the system, the result is translated into speech, and the EyeRing responds, “grey.”
When the camera snaps a photo, the image is sent to a Bluetooth-linked smartphone, where a dedicated Android app processes it with computer-vision algorithms and then uses a text-to-speech module to announce the result through earphones. So far, the device can detect currency types, colors, and the amount of open space ahead (the “Virtual Walking Cane”).
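The pipeline above — snap a photo, analyze it, speak the result — can be sketched in miniature. The color-naming step might look something like the following toy Python functions, which average an image’s pixels and map the result to a coarse color word. This is purely an illustrative assumption, not the actual EyeRing algorithm; the function names and thresholds are invented for the example.

```python
# Toy sketch of a "what color is this?" step, assumed for illustration.
# Real systems would use a proper color space and a trained classifier.

def name_color(rgb):
    """Map an average (R, G, B) value to a coarse color name."""
    r, g, b = rgb
    mx, mn = max(rgb), min(rgb)
    if mx - mn < 30:          # channels nearly equal: achromatic
        if mx < 64:
            return "black"
        if mx > 200:
            return "white"
        return "grey"         # the response heard in the demo video
    if r >= g and r >= b:
        return "red"
    if g >= b:
        return "green"
    return "blue"

def dominant_color(pixels):
    """Average a list of (R, G, B) pixels, then name the average."""
    n = len(pixels)
    avg = tuple(sum(p[i] for p in pixels) // n for i in range(3))
    return name_color(avg)
```

In the real device, the output of a step like this would be handed to the phone’s text-to-speech engine rather than returned as a string.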
Although commercialization is likely at least two years away, the potential for this type of technology to help the blind “see” what is in front of them is enormous. The team is currently working on the next prototype, which will add more advanced capabilities, such as reading printed (non-braille) text, capturing real-time video, and incorporating additional sensors and a microphone. The design will also be streamlined to be smaller, with a lower center of gravity. While finger-worn devices are not new, most existing ones were designed for sighted users, so this is a genuinely exciting breakthrough for the visually impaired.