Identifying grocery items can be daunting for people with visual impairment, yet it is crucial for both simple and complex decision-making. While AI has improved visual recognition capabilities, real-world application of these advanced technologies remains challenging and error-prone.
Now, researchers from the National University of Singapore’s School of Computing have introduced a wearable assistive device called AiSee, which aims to overcome these limitations through state-of-the-art AI technologies.
AiSee, first developed in 2018 and refined over five years, helps people with visual impairment ‘see’ objects around them with the help of AI.
The new wearable device looks like a regular set of bone-conduction earphones connected by a band that goes around the back of the neck. It’s designed to help users avoid feeling self-conscious, unlike some other more noticeable smart wearables.
One of the earphones on AiSee has a 13-megapixel camera facing forward and capturing the user’s field of view, while the other has a touchpad on its outer surface. The device has a microprocessor and a lithium battery located at the back and connects wirelessly to the internet.
After snapping a photo of the object of interest, the system uses cloud-based AI algorithms to analyze and identify the object. It also allows users to pose questions to learn more about the object. The AI-powered assistant is equipped with advanced speech-to-text and text-to-speech technology, which enables it to comprehend and respond to users’ queries promptly and accurately.
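The interaction described above — capture a photo, identify the object in the cloud, then answer a spoken follow-up question — can be sketched in broad strokes. The function names and stubbed backends below are illustrative assumptions only; the article does not describe AiSee's actual code or APIs.

```python
from typing import Optional

def recognize_object(image_bytes: bytes) -> str:
    """Stand-in for the cloud-based recognition step.

    A real system would upload the image to a vision API and
    return the predicted label. Here we return a fixed label
    purely for illustration.
    """
    return "a can of chicken soup"

def answer_question(label: str, question: str) -> str:
    """Stand-in for the question-answering step.

    A real system would transcribe the user's speech, pass the
    label and question to a language model, and speak the reply
    back via text-to-speech.
    """
    return f"You asked about '{question}'. The item is {label}."

def handle_interaction(image_bytes: bytes,
                       spoken_question: Optional[str] = None) -> str:
    """One capture-and-query cycle: identify the object, then
    optionally answer a follow-up question about it."""
    label = recognize_object(image_bytes)
    if spoken_question:
        return answer_question(label, spoken_question)
    return f"This looks like {label}."
```

In a deployed device, the returned string would be rendered as audio through the bone-conduction earphones rather than printed.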
Additionally, unlike most wearable assistive devices, AiSee operates as a self-contained system and can function independently without the need for any additional devices.
The headphone of AiSee uses bone conduction technology to transmit sound through the bones of the skull, making it possible for visually impaired individuals to receive auditory information while also being aware of their surroundings. This is particularly vital for visually impaired people as environmental sounds provide essential information for decision-making, especially in situations involving safety considerations.
“At present, visually impaired people in Singapore do not have access to assistive AI technology of this level of sophistication. Therefore, we believe that AiSee has the potential to empower visually impaired people to independently accomplish tasks that currently require assistance. Our next step is to make AiSee affordable and accessible to the masses. To achieve this, we are making further enhancements, including a more ergonomic design and a faster processing unit,” explained Associate Professor Suranga Nanayakkara, lead researcher of Project AiSee.
Assoc Prof Nanayakkara and his team are in discussions with SG Enable in Singapore on user testing to improve AiSee’s features and performance. B.P. De Silva Holdings Pte Ltd has contributed S$150,000 to support the project and foster inclusivity and accessibility. The company’s philanthropic endeavor also reflects its belief in the transformative power of technology to address societal challenges and create a more equitable and inclusive world.