The breakthrough technology, a sensory substitution device (SSD) called the vOICe system, converts images taken on a mobile phone into soundscapes by assigning musical notes and pitches to different shapes.
Meanwhile, an accompanying app, called EyeMusic, adds colour to these shapes, deepening the sensory experience.
During tests, blind participants were fitted with head-mounted cameras before being wired to a computer and microphone.
As the visually impaired participants moved around the room, the camera captured images that were converted into sound, with pitch encoding height and loudness encoding brightness as the system scanned the scene.
The Daily Mail reports that a rising bright line is converted into a rising tone, a bright spot – such as a lamp – becomes a beep, a brightly filled rectangle – such as a window during daylight – becomes a noise burst, and a vertical grid – such as a waffle or trellis – is converted into a rhythm.
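The pitch-for-height, loudness-for-brightness mapping can be sketched in a few lines of code. The function below is an illustrative toy, not the vOICe's actual implementation: the frequency range, scan duration, and exponential pitch spacing are assumptions chosen for clarity.

```python
def image_to_soundscape(image, scan_time=1.0, f_min=500.0, f_max=5000.0):
    """Toy vOICe-style mapping (parameters are illustrative, not the
    real system's): scan columns left to right over `scan_time` seconds;
    each pixel's row sets its frequency (higher in the image = higher
    pitch) and its brightness (0-255) sets its loudness."""
    rows = len(image)
    cols = len(image[0])
    events = []  # (time offset in s, frequency in Hz, amplitude 0-1)
    for c in range(cols):
        t = scan_time * c / cols  # when this column is "played"
        for r in range(rows):
            # Top row (r = 0) maps to the highest frequency,
            # spaced exponentially so equal steps sound equal.
            frac = (rows - 1 - r) / max(rows - 1, 1)
            freq = f_min * (f_max / f_min) ** frac
            amp = image[r][c] / 255.0  # brightness -> loudness
            if amp > 0:
                events.append((round(t, 3), round(freq, 1), round(amp, 3)))
    return events

# A single bright spot – like a lamp – yields one short tone, the "beep"
# described above.
img = [[0, 0, 0],
       [0, 255, 0],
       [0, 0, 0]]
print(image_to_soundscape(img))
```

A rising bright line would produce events whose frequency climbs as the scan time advances, which is exactly the "rising tone" described above.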
The app’s users are trained to identify which pitch relates to which height or brightness so that they can effectively ‘see’ their surroundings via sounds.
Neuroscientist Professor Amir Amedi of the Hebrew University of Jerusalem has been training blind people to use this technology since 2007.
“The idea is to replace information from a missing sense by using input from a different sense,” explained Professor Amedi. “It’s just like bats and dolphins use sounds and echolocation to ‘see’ using their ears.”