Wednesday, April 24, 2024

Quadriplegic patient uses brain signals to feed himself with two prosthetic arms

After two years of research, a team from the Applied Physics Laboratory (APL) of Johns Hopkins University and Johns Hopkins Medicine has taught a quadriplegic patient to control two robotic arms with his brain – and even to feed himself.

As part of a clinical trial, the team implanted six electrode arrays into both hemispheres of the brain of Robert “Buz” Chmielewski, a quadriplegic patient with minimal movement and feeling in his hands and fingers. With these implants, the researchers set out to teach Chmielewski to use two advanced prosthetic arms – not only to control them, but also to feel what the robotic arms are touching.

Chmielewski learned to move the robotic arms in a little under a year, and he has since mastered more precise actions, such as handling a knife and fork. In the video below, smart software positions his utensils roughly in the right spot, and Buz then uses his brain signals to cut the food with the knife and fork. Once he has finished cutting, the software brings the food near his mouth, where he again uses brain signals to move it the last several inches so he can eat it.
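
APL has not published the software behind this demonstration, so the following is only a rough Python sketch of the shared-control idea described above: an autonomous planner handles the coarse positioning of the utensil, while a decoded user command supplies the fine adjustments. All positions, gains, and the decoded signal here are hypothetical placeholders.

import numpy as np

def autonomous_step(position, target, gain=0.2):
    """Move the utensil a fraction of the way toward the planner's coarse target."""
    return position + gain * (target - position)

def shared_control_step(position, target, user_command, user_weight=0.5):
    """Blend the planner's motion with the user's decoded directional command."""
    planned = autonomous_step(position, target)
    return planned + user_weight * user_command  # the user refines the coarse plan

# Hypothetical example: the planner parks the fork near the plate, and a decoded
# brain signal (here just a fixed vector) nudges it toward the chosen piece of food.
fork = np.array([0.40, 0.10, 0.25])            # current fork position, in meters
plate_target = np.array([0.45, 0.12, 0.20])    # coarse target from the planner
decoded_nudge = np.array([0.01, -0.005, 0.0])  # stand-in for decoded user intent

for _ in range(10):
    fork = shared_control_step(fork, plate_target, decoded_nudge)
print(fork)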

For the first time, the team demonstrated simultaneous control of two prosthetic limbs through a brain-machine interface developed by APL. Few details of how the interface works have been made public, but such devices typically rely on an algorithm that learns to read and recognize the neural activity evoked when a particular action is intended, and then uses that activity to control the robotic arm. The researchers also noted that while controlling the prosthetic arms, Chmielewski receives sensory feedback, which allows him to interact more effectively with his environment.
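
As a rough illustration of that typical decoding approach – not APL's actual method, which has not been detailed publicly – the sketch below fits a simple linear map from recorded neural features to intended arm movements. The data are random placeholders standing in for real recordings from the electrode arrays.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder training data: firing-rate features from the electrode arrays
# (samples x channels) and the intended hand velocities (samples x 3).
neural_features = rng.normal(size=(500, 96))
intended_velocity = rng.normal(size=(500, 3))

# Learn a linear map W from neural activity to movement via ridge regression,
# the kind of supervised calibration step such decoders commonly use.
alpha = 1.0
X, Y = neural_features, intended_velocity
W = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

# At run time, freshly recorded activity is decoded into a velocity command for the arm.
new_activity = rng.normal(size=(1, 96))
velocity_command = new_activity @ W
print(velocity_command)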

“Our ultimate goal is to make activities such as eating easy to accomplish, having the robot do one part of the work and leaving the user, in this case, Buz, in charge of the details: which food to eat, where to cut, how big the cut piece should be,” explained David Handelman, an APL senior roboticist specializing in human-machine teaming. “By combining brain-computer interface signals with robotics and artificial intelligence, we allow the human to focus on the parts of the task that matter most.”

https://www.youtube.com/watch?v=x615GSqicZE

The next steps for this effort include not only expanding the number and types of activities of daily living that Buz can demonstrate with this form of human-machine collaboration, “but also providing him with additional sensory feedback as he’s conducting these tasks so that he won’t have to rely entirely on vision to know if he’s succeeding,” said Francesco Tenore, an APL neuroscientist and principal investigator for the Smart Prosthetics study.