For more than 30 years—following an accident in his teens—Robert “Buz” Chmielewski has been a quadriplegic with minimal movement and feeling in his hands and fingers. But last month he was able to manipulate two prosthetic arms with his brain and feed himself dessert.
Buz’s accomplishment marks a significant step toward restoring function and autonomy for people with quadriplegia, the partial or total loss of use of all four limbs and the torso caused by illness or injury.
“It’s pretty cool,” said Chmielewski, whose sense of accomplishment was unmistakable after using his thoughts to command the robotic limbs to cut and feed him a piece of golden sponge cake. “I wanted to be able to do more of it,” he said.
Nearly two years ago, Chmielewski underwent a 10-hour brain surgery at the Johns Hopkins Hospital in Baltimore as part of a clinical trial originally spearheaded by the Defense Advanced Research Projects Agency and built around advanced prosthetic limbs developed by the Johns Hopkins Applied Physics Laboratory (APL). The trial’s goal was to let participants control assistive devices, and to perceive physical stimuli such as touch to the limbs, using neural signals recorded from the brain.
Surgeons implanted six electrode arrays into both sides of his brain, and within months he was able to demonstrate, for the first time, simultaneous control of two prosthetic limbs through a brain-machine interface developed by APL.
Researchers were impressed with his progress during the first year of testing and wanted to further push the bounds of what could be accomplished. Using an internal research grant from APL, the team launched a parallel line of inquiry—termed “Smart Prosthetics”—to develop strategies for providing advanced robot control and sensory feedback from both hands at the same time using neural stimulation. That team included Francesco Tenore, David Handelman, Andrew Badger, Matthew Fifer, and Luke Osborn from APL, as well as Tessy Thomas, Robert Nickl, Nathan Crone, Gabriela Cantarero, and Pablo Celnik from the School of Medicine.
Video: https://youtube.com/watch?v=x615GSqicZE
They set out to develop a closed-loop system that merges artificial intelligence, robotics, and a brain-machine interface. In the dessert demonstration, the system enabled Chmielewski to control the movements needed to cut food with a fork and knife and feed himself.
“Our ultimate goal is to make activities such as eating easy to accomplish, having the robot do one part of the work and leaving the user, in this case Buz, in charge of the details: which food to eat, where to cut, how big the cut piece should be,” explained Handelman, an APL senior roboticist specializing in human-machine teaming. “By combining brain-computer interface signals with robotics and artificial intelligence, we allow the human to focus on the parts of the task that matter most.”
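Neither the article nor the researchers describe the software itself, but the division of labor Handelman outlines maps naturally onto a shared-autonomy loop. The sketch below is a minimal, purely illustrative Python example under that assumption: a hypothetical decoder (decode_bci_intent) supplies only the human’s high-level choices, and a hypothetical planner (plan_cut_and_feed) expands them into the low-level motions the robot would execute.

```python
from dataclasses import dataclass

# Hypothetical decoded output of a brain-machine interface:
# high-level intent only (what to eat, where to cut, how big a piece).
@dataclass
class HumanIntent:
    food_item: str        # which food to eat
    cut_offset_cm: float  # where on the item to cut
    piece_size_cm: float  # how big the cut piece should be

def decode_bci_intent() -> HumanIntent:
    """Stand-in for a real neural decoder; returns a fixed intent here."""
    return HumanIntent(food_item="sponge cake", cut_offset_cm=2.0, piece_size_cm=1.5)

def plan_cut_and_feed(intent: HumanIntent) -> list[str]:
    """Robot side of the shared-autonomy split: expand the human's
    high-level choices into the low-level motion steps the two arms
    would carry out (stabilize, cut, skewer, bring to mouth)."""
    return [
        f"left arm: stabilize {intent.food_item} with fork",
        f"right arm: position knife {intent.cut_offset_cm:.1f} cm from edge",
        f"right arm: saw through a {intent.piece_size_cm:.1f} cm piece",
        "right arm: skewer piece with fork",
        "right arm: bring fork to mouth",
    ]

if __name__ == "__main__":
    intent = decode_bci_intent()             # the human decides *what*
    for step in plan_cut_and_feed(intent):   # the robot handles *how*
        print(step)
```

The appeal of this split is that the interface only has to convey a handful of discrete decisions, which is far less demanding than decoding continuous, joint-level control from neural signals.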
Tenore, an APL neuroscientist and principal investigator for the Smart Prosthetics study, said the next steps for this effort include not only expanding the number and types of activities of daily living that Buz can demonstrate with this form of human-machine collaboration, but also providing him with additional sensory feedback as he completes tasks so that he won’t have to rely on vision to know if he’s succeeding.
“The idea is that he’d experience this the same way that uninjured people can ‘feel’ how they’re tying their shoelaces, for example, without having to look at what they’re doing,” Tenore said.
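The article does not say how APL encodes that feedback, but a common approach in the sensory-feedback literature is to map force sensed at the prosthetic fingertips to the intensity of neural stimulation. The hypothetical sketch below illustrates that general idea only; the sensor reading, the 5 N full scale, and the amplitude cap are all invented for the example.

```python
import random

def measure_fingertip_force() -> float:
    """Stand-in for a force sensor on the prosthetic hand (newtons)."""
    return random.uniform(0.0, 5.0)

def force_to_stim_amplitude(force_n: float, full_scale_n: float = 5.0,
                            max_amp_ua: float = 80.0) -> float:
    """Linearly map sensed force to a stimulation amplitude (microamps),
    saturating at max_amp_ua so stimulation stays within a safe range."""
    return min(force_n / full_scale_n, 1.0) * max_amp_ua

# Closing the loop: each sensor reading becomes a felt intensity,
# so the user can judge grip and contact without watching the hand.
for _ in range(3):
    force = measure_fingertip_force()
    print(f"force {force:.2f} N -> stimulate at "
          f"{force_to_stim_amplitude(force):.0f} uA")
```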
In an interview just before Thanksgiving—the traditional launch of a food-heavy holiday season—Buz reflected on the significance of this research for individuals with limited mobility. Disabilities like his take away a person’s independence, he said, particularly their ability to eat by themselves.