Quite a few new technologies aimed at people who are visually impaired or blind have been appearing lately, and that can only be a good thing. We previously reported on a “virtual cane” that used sonar to map out nearby obstacles and objects, and just a couple of hours ago we wrote about a prototype pair of high-tech glasses that uses augmented reality to help the visually impaired “see”. Now we have the EYE 21, developed by engineers from the Research Center for Graphic Technologies at Spain’s Universitat Politècnica de València (UPV), which appears to work along the same lines as the “virtual cane”.
Much like the “virtual cane”, the EYE 21 relies on sound to guide those who are visually impaired or blind. The sunglasses come with two built-in micro video cameras, a computer and a pair of headphones. The cameras analyze the space in front of the wearer, the computer builds a three-dimensional model of it, and sounds are assigned to the various surfaces in that space and played back through the headphones.
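To make the idea a little more concrete, here is a minimal sketch of how a depth map of the scene might be turned into a stereo soundscape, assuming a simple mapping where closer surfaces produce higher, louder tones and left/right position controls the panning between the two headphone channels. The grid values, the pitch and volume rules, and the output file name are all illustrative assumptions; this is not the EYE 21’s actual algorithm.

```python
# Illustrative sketch: sonify a hypothetical depth map as a short stereo clip.
# Assumptions: a 3x3 grid of distances (metres), closer = higher pitch and
# louder, column position = stereo pan. Output file name is arbitrary.
import math
import struct
import wave

SAMPLE_RATE = 44100
DURATION = 0.3  # seconds of audio per "frame" of sonified space

# Hypothetical depth map of the scene in front of the wearer:
# each cell holds the distance to the nearest surface in that direction.
depth_map = [
    [2.5, 1.0, 2.5],
    [3.0, 0.8, 3.0],
    [4.0, 4.0, 4.0],
]

def tone_for_cell(distance, col, n_cols):
    """Map a surface's distance and horizontal position to a stereo tone."""
    freq = 220.0 + 880.0 / max(distance, 0.1)    # closer = higher pitch
    volume = min(1.0, 1.5 / max(distance, 0.1))  # closer = louder
    pan = col / max(n_cols - 1, 1)               # 0.0 = full left, 1.0 = full right
    return freq, volume, pan

def synthesize(depth_map):
    """Mix one short stereo buffer representing the whole depth map."""
    n_samples = int(SAMPLE_RATE * DURATION)
    left = [0.0] * n_samples
    right = [0.0] * n_samples
    for row in depth_map:
        for col, distance in enumerate(row):
            freq, volume, pan = tone_for_cell(distance, col, len(row))
            for i in range(n_samples):
                sample = volume * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
                left[i] += sample * (1.0 - pan)
                right[i] += sample * pan
    # Normalise the mix so it stays within 16-bit range.
    peak = max(max(abs(s) for s in left), max(abs(s) for s in right), 1.0)
    return b"".join(
        struct.pack("<hh", int(32767 * l / peak), int(32767 * r / peak))
        for l, r in zip(left, right)
    )

with wave.open("soundscape.wav", "wb") as wav:
    wav.setnchannels(2)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(synthesize(depth_map))
```

In a real system, a pipeline like this would run continuously on depth data reconstructed from the stereo cameras, but the sketch shows the basic principle of assigning sounds to surfaces in space.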
Users who have tested the prototypes report being able to “hear space”, as their brains learn to translate the sounds into shapes. There is no word yet on when the EYE 21 will go into mass production, but at least it has reached the prototype stage rather than remaining just a concept.