
Sunday, January 07, 2018

Advances in Sensory Substitution

I have been reading about this for many years; are there real advances here? The article looks at the history, the technology, and the advances likely to come.

Feeling Sounds, Hearing Sights   By Gregory Mone 

Communications of the ACM, Vol. 61 No. 1, Pages 15-17

 In a 2016 video, Saqib Shaikh, a Microsoft Research software engineer, walks out of London's Clapham Station Underground stop, turns, and crosses a street, then stops suddenly when he hears an unexpected noise. Shaikh, who lost his sight when he was seven years old and walks with the aid of the standard white cane, reaches up and swipes the earpiece of his glasses.

The video then shifts to the view from his eyewear, a pair of smart glasses that capture high-quality still images and videos. That simple swipe instructed the glasses, an experimental prototype designed by a company called Pivothead, to snap a still photo. Microsoft software analyzed the picture, then translated the findings into auditory feedback. Through the smart glasses, which include a small speaker, Shaikh hears the results from an automated voice: "I think it's a man jumping in the air doing a trick on a skateboard."
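The loop described here is essentially capture, caption, speak. Below is a minimal Python sketch of that idea, not Microsoft's actual implementation: the captioning endpoint URL, the response shape, and the file names are all assumptions made for illustration, with a generic text-to-speech engine standing in for the speaker in the glasses.

import requests
import pyttsx3

# Hypothetical captioning service -- a stand-in for the image-analysis software
# described in the article; the URL and response format are assumptions.
CAPTION_ENDPOINT = "https://example.com/vision/describe"

def describe_image(image_path: str) -> str:
    """Send a still photo to the captioning service and return a one-line description."""
    with open(image_path, "rb") as f:
        resp = requests.post(CAPTION_ENDPOINT, files={"image": f}, timeout=10)
    resp.raise_for_status()
    # Assumed response shape: {"caption": "a man jumping in the air ..."}
    return resp.json()["caption"]

def speak(text: str) -> None:
    """Read the description aloud, standing in for the speaker in the smart glasses."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    caption = describe_image("snapshot.jpg")  # photo captured on the swipe gesture
    speak(f"I think it's {caption}")

The point of the sketch is the division of labor: the eyewear only captures and plays audio, while the recognition work happens in software that can be updated independently of the hardware.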

The Pivothead smart glasses and Microsoft AI technology belong to a broader class of what have become known as sensory substitution technologies, apps and devices that collect visual, auditory, and in some cases haptic stimuli, and feed the information to the user through another sensory channel. While the utility of these devices has long been debated in the vision- and hearing-impaired communities, recent advances suggest that sensory substitution technologies are finally starting to deliver on their promise. ...
