Wednesday, May 1, 2024

New smart glasses help blind people ‘see’ using sound

Assistive technology is an extensive research field concerned with designing technologies that enable people with sensory disabilities to overcome the physical, social, and accessibility challenges they face in daily life. One central area within this field is assistive technology for people who are blind or have low vision (BLV), a condition that affects a person’s ability to perform everyday tasks and take part in social activities and interactions.

Accordingly, a broad strand of assistive technology research focuses on using feedback in different modalities, such as vision, touch, and sound, to augment the senses of people who are blind or have low vision.

Now, researchers at the University of Technology Sydney (UTS) have developed cutting-edge technology known as “acoustic touch” that helps blind people ‘see’ using sound. This technology could have a huge impact on the quality of life of people who are blind or have low vision.

“Smart glasses typically use computer vision and other sensory information to translate the wearer’s surroundings into computer-synthesized speech,” said Distinguished Professor Chin-Teng Lin, one of the study’s co-authors. “However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device’s field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone.”

Inspired by human echolocation training, the researchers explored the concept of “acoustic touch,” in which smart glasses convert objects into distinct auditory icons as they enter the glasses’ field of view. This approach differs from traditional speech-based systems and aims to provide a wearable spatial-audio solution to help people who are blind find objects.
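The article does not publish the researchers’ code, but the core idea, playing an object’s auditory icon only while that object sits inside the glasses’ field of view, can be conveyed with a minimal sketch. Everything below is a hypothetical placeholder for illustration: the object labels, the sound-file names, and the 60-degree field of view are assumptions, not details of the UTS implementation.

```python
# Illustrative sketch only: labels, file names, and the field-of-view width
# are assumed placeholders, not the actual acoustic touch implementation.

# Hypothetical mapping from recognized object classes to auditory icons,
# in the spirit of the examples given in the article.
AUDITORY_ICONS = {
    "plant": "rustling_leaves.wav",
    "phone": "buzzing.wav",
    "cup": "porcelain_tap.wav",
}

FIELD_OF_VIEW_DEG = 60.0  # assumed width of the glasses' auditory window


def icon_for(label, object_bearing_deg, head_yaw_deg):
    """Return the sound file for a detected object, or None if the object
    lies outside the assumed field of view and should stay silent."""
    # Signed angle between where the head points and where the object is.
    offset = (object_bearing_deg - head_yaw_deg + 180) % 360 - 180
    if abs(offset) > FIELD_OF_VIEW_DEG / 2:
        return None  # outside the "acoustic touch" window
    return AUDITORY_ICONS.get(label)


if __name__ == "__main__":
    detections = [("plant", 10.0), ("phone", 95.0)]
    for label, bearing in detections:
        print(label, "->", icon_for(label, bearing, head_yaw_deg=0.0))
```

With the head facing straight ahead, the plant at 10° would trigger its rustling-leaves icon, while the phone at 95° would remain silent until the wearer turns toward it.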

The researchers developed a wearable Foveated Audio Device (FAD) to study the efficacy and usability of acoustic touch for searching for, memorizing, and reaching items. The system consists of NReal Light AR glasses paired with an OPPO Find X3 Pro Android phone. The NReal glasses were chosen for their light weight (88 g), computer-vision support, six-degree-of-freedom inertial measurement unit (IMU), binaural speakers, and compatibility with the Android Unity SDK. The FAD was built in the Unity Game Engine (2022), which managed the camera and head-tracking input from the NReal glasses and the audio output to its speakers.
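The actual FAD runs in Unity, but the role of head tracking in “foveating” the audio can be illustrated in a language-agnostic way: once an object is known to lie inside the field-of-view cone, its offset from the head’s forward direction can be turned into left/right gains for the binaural speakers. The constant-power panning rule and the 60° cone below are assumptions for illustration, not the published design.

```python
# Conceptual sketch, not the FAD's code: steer an auditory icon toward an
# object's direction using the head orientation reported by the glasses' IMU.
import math


def pan_gains(offset_deg, fov_deg=60.0):
    """Constant-power stereo gains (left, right) for an object offset_deg
    degrees to the right (+) or left (-) of where the head is pointing."""
    pan = max(-1.0, min(1.0, offset_deg / (fov_deg / 2)))  # -1.0 .. 1.0
    angle = (pan + 1.0) * math.pi / 4.0                    # 0 .. pi/2
    return math.cos(angle), math.sin(angle)


if __name__ == "__main__":
    print(pan_gains(20.0))   # object to the right: louder in the right speaker
    print(pan_gains(-30.0))  # object at the left edge of the cone: fully left
```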

The team’s evaluation study involved 14 participants: seven who were blind or had low vision and seven blindfolded sighted participants who served as a control group. The results showed that the wearable device equipped with acoustic touch significantly enhanced the ability of blind or low-vision participants to recognize and reach for objects without imposing excessive mental effort.

“The auditory feedback empowers users to identify and reach for objects with remarkable accuracy,” said Dr Howe Zhu, the study’s lead author, in a statement. “Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community.”

Advances in assistive technology are crucial for overcoming the challenges faced by individuals who are blind or have low vision, such as locating specific household items and personal belongings. Acoustic touch is a promising solution that could open new doors for these individuals, enhancing their independence and quality of life.

With continuous advancements, acoustic touch technology could become an integral part of assistive technologies, supporting individuals to access their environment more efficiently and effectively than ever before.