Recent advances in neuroscience, robotics and software have allowed scientists to develop a robotic system that responds to muscle-movement signals from a partially paralyzed person, relayed through a brain-machine interface. Human and robot work as a team to make certain tasks a breeze.
Two robotic arms – a fork in one hand, a knife in the other – frame a man seated at a table with a piece of cake on a plate. A computerized voice announces each action: “move the fork towards the food” and “retract the knife”. The man, who is partially paralyzed, makes subtle movements with his right and left fists at certain prompts, such as “select cut location”, so the machine cuts out a bite-sized piece. Then: “bring food to mouth”, and another subtle gesture aligns the fork with his mouth.
In less than 90 seconds, a person with severely limited upper-body mobility, who hasn’t been able to use his fingers for about 30 years, fed himself dessert using his mind and a pair of intelligent robotic hands.
A team led by researchers from the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, and the Department of Physical Medicine and Rehabilitation (PMR) at the Johns Hopkins School of Medicine published an article in the journal Frontiers in Neurorobotics describing this latest feat, which uses a brain-machine interface (BMI) and a pair of modular prosthetic limbs.
Also sometimes referred to as a brain-computer interface, a BMI system provides a direct communication link between the brain and a computer, which decodes neural signals and “translates” them to perform a variety of external functions, from moving a cursor on a screen to, now, enjoying a piece of cake. In this particular experiment, muscle-movement signals from the brain helped control the robotic prostheses.
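The “translate” step can be pictured as a small dispatch table: a decoded muscle-movement signal is mapped to an external command. This is a hypothetical sketch for illustration only – the signal names, command names and mapping are assumptions, not the team’s actual software.

```python
# Hypothetical sketch of the BMI "translate" step: a decoded
# muscle-movement signal is mapped to an external robot command.
# All signal and command names here are illustrative assumptions.

COMMAND_MAP = {
    "left_fist": "select_cut_location",
    "right_fist": "align_fork_with_mouth",
}

def translate(decoded_signal: str) -> str:
    """Map a decoded muscle-movement signal to a robot command.

    Unrecognized signals fall through to a safe no-op.
    """
    return COMMAND_MAP.get(decoded_signal, "no_op")

print(translate("left_fist"))  # select_cut_location
print(translate("noise"))      # no_op
```

In a real system the decoding stage would be a trained classifier over neural recordings rather than a string lookup; the point of the sketch is only the brain-to-command mapping the article describes.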
A new approach
The study is based on more than 15 years of research in neuroscience, robotics and software, carried out by the APL in collaboration with the Department of PMR within the framework of the Revolutionizing Prosthetics Program, which was originally sponsored by the US Defense Advanced Research Projects Agency (DARPA). The new paper describes an innovative model of shared control that allows a human to maneuver a pair of robotic prostheses with minimal mental input.
“This shared control approach aims to leverage the intrinsic capabilities of the brain-machine interface and the robotic system, creating a ‘best of both worlds’ environment where the user can customize the behavior of an intelligent prosthesis,” said Dr. Francesco Tenore, a senior project manager in APL’s Exploratory Research and Development Department. The paper’s lead author, Tenore, focuses on neural interface and applied neuroscience research.
“Although our results are preliminary, we are excited to give users with limited abilities a real sense of control over increasingly intelligent assistive machines,” he added.
Helping people with disabilities
One of the most important advances in robotics demonstrated in the article is the combination of robot autonomy with limited human intervention: the machine does most of the work while allowing the user to customize the robot’s behavior to their liking, according to Dr. David Handelman, the paper’s first author and a senior roboticist in the Intelligent Systems Branch of APL’s Exploratory Research and Development Department.
“For robots to perform human-like tasks for people with reduced functionality, they will need human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton,” he explained. “Our goal is to make it easy for the user to control the few most important elements for specific tasks.”
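The shared-control idea described above – the robot executing most steps autonomously and deferring to the user only at the few points that matter – might be sketched as a simple task loop. This is an illustrative assumption, not the published control code; every step name and the gesture-reading callback are hypothetical.

```python
# Illustrative shared-control loop: the robot performs most steps of a
# task plan autonomously and pauses for user input only at key decision
# points. Step names and the gesture source are hypothetical.

TASK_PLAN = [
    ("move_fork_toward_food", "auto"),
    ("select_cut_location", "human"),   # user picks where to cut
    ("cut_bite_sized_piece", "auto"),
    ("retract_knife", "auto"),
    ("bring_food_to_mouth", "human"),   # user aligns fork with mouth
]

def run_task(plan, read_gesture):
    """Execute a task plan, deferring 'human' steps to a gesture input."""
    log = []
    for step, mode in plan:
        if mode == "human":
            gesture = read_gesture(step)  # e.g. a subtle fist movement
            log.append(f"{step}: adjusted by user gesture '{gesture}'")
        else:
            log.append(f"{step}: executed autonomously")
    return log

# Simulated gesture input, standing in for the decoded BMI signal.
for line in run_task(TASK_PLAN, lambda step: "right_fist"):
    print(line)
```

The design point the quote makes is visible in the plan itself: only two of the five steps require the user, so the cognitive load stays low while the user keeps control of the elements that matter.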
Dr. Pablo Celnik, the project’s principal investigator in the PMR department, said: “The human-computer interaction demonstrated in this project points to the potential capabilities that can be developed to help people with disabilities.”
Closing the loop
While the DARPA program officially ended in August 2020, the APL and Johns Hopkins School of Medicine team continues to collaborate with colleagues from other institutions to demonstrate and explore the potential of the technology.
The next iteration of the system could incorporate previous research which found that sensory stimulation of amputees allowed them not only to perceive their phantom limb, but also to use muscle-movement signals from the brain to control a prosthesis. The theory is that adding sensory feedback, delivered directly to a person’s brain, can help them perform certain tasks without the constant visual feedback required in the current experiment.
“This research is a great example of that philosophy, where we knew we had all the tools to demonstrate a complex bimanual activity of daily living that non-disabled people take for granted,” Tenore said. “Many challenges remain, including improving task execution, in terms of accuracy and timing, and closed-loop control without the constant need for visual feedback.”
Celnik added, “Future research will explore the limits of these interactions even beyond the basic activities of daily living.”