
Robotic arms allow paralyzed man to feed himself for first time in 30 years

The user can personalize the behavior of a smart prosthesis.


By Danny Halpin via SWNS

A partially paralyzed man has been able to feed himself for the first time in 30 years after a pair of robotic arms were directly connected to his brain.

The feat is part of an experiment by American researchers who hope to assist people with disabilities by developing robots that can be operated with brain signals and require limited physical movement.

In the 90-second experiment, the man sits at a table with a piece of cake on a plate. Two robotic arms, one holding a fork and the other a knife, cut the cake and deliver it to the man’s mouth.

A computerized voice gives a running commentary, voicing the instructions it receives from the man’s brain, saying “moving fork to food” and “retracting knife.”

The man moves his arms slightly to cut the cake and bring it to his mouth while the computer says: “Moving food to mouth.”

He is using a brain-machine interface (BMI), which creates a direct communication link between his brain and the robotic arms.

In the experiment, the man sits at a table with a piece of cake on a plate while robotic arms hold a fork and knife. (Johns Hopkins Medicine via SWNS)

Sometimes called a brain-computer interface, the system is used to translate different neural signals into physical functions such as moving a cursor on a screen.

In this particular experiment, the BMI reads the man’s muscle movement signals and translates them into physical movement in the robotic arms, allowing him to enjoy a piece of cake.
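
As an illustrative sketch only, and not the researchers’ actual software, the snippet below shows the general idea of translating a decoded muscle-movement signal into a high-level command for a robotic arm; the channel names, threshold, and command map are all hypothetical.

```python
# Hypothetical sketch: translating decoded muscle-movement signals into
# high-level robotic-arm commands. Not the APL implementation; the
# channels, thresholds, and commands are invented for illustration.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DecodedSignal:
    channel: str       # e.g. "left_wrist_flex", decoded from muscle activity
    intensity: float   # normalized strength of the decoded signal, 0 to 1


# Illustrative mapping from decoded channels to arm commands.
COMMAND_MAP = {
    "left_wrist_flex":  "moving fork to food",
    "right_wrist_flex": "moving knife to food",
    "jaw_open":         "moving food to mouth",
}


def translate(signal: DecodedSignal, threshold: float = 0.5) -> Optional[str]:
    """Return an arm command only when the decoded signal is strong enough."""
    if signal.intensity < threshold:
        return None
    return COMMAND_MAP.get(signal.channel)


print(translate(DecodedSignal("left_wrist_flex", 0.8)))  # -> "moving fork to food"
```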

The study was published in the journal Frontiers in Neurorobotics and was carried out by researchers at the Johns Hopkins Applied Physics Laboratory (APL) and the Department of Physical Medicine and Rehabilitation (PMR).

It builds on more than 15 years of research in neural science, robotics, and software, and is the latest project in Johns Hopkins’s Revolutionizing Prosthetics program, sponsored by the US Defense Advanced Research Projects Agency (DARPA).

The new paper sets out a model, called the shared control approach, that links smart robotics with the human mind so people can use prosthetic limbs with little mental or physical effort.

Dr. Francesco Tenore, an APL project manager and the paper’s senior author, said: “This shared control approach is intended to leverage the intrinsic capabilities of the brain-machine interface and the robotic system, creating a best of both worlds environment where the user can personalize the behavior of a smart prosthesis.

“Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines.”

Senior roboticist Dr. David Handelman, the paper’s first author, said the work demonstrates one of the most important advances in robotics, which is to combine robot autonomy with limited human input.

This means the machine does most of the work while the user is able to control the robot’s movement to suit their needs.
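
One simple way to picture that division of labor, offered here purely as an assumed sketch rather than the published control scheme, is a weighted blend in which the robot’s planned motion dominates and the user’s input makes small corrections:

```python
# Hypothetical "shared control" blend: the robot supplies most of the
# motion plan and a small weight lets the user's signal adjust it.
# This is an assumption for illustration, not the study's algorithm.
def shared_control_target(autonomous_target: float,
                          user_adjustment: float,
                          user_weight: float = 0.2) -> float:
    """Blend an autonomously planned target with limited user input."""
    return (1 - user_weight) * autonomous_target + user_weight * user_adjustment


# The robot plans a cut at 10.0 cm along the plate; the user nudges it toward 12.0 cm.
print(shared_control_target(10.0, 12.0))  # -> 10.4
```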

A brain-machine interface (BMI) creates direct communication between the brain and the robotic arms. (Johns Hopkins Medicine via SWNS)

Dr. Handelman added: “In order for robots to perform human-like tasks for people with reduced functionality, they will require human-like dexterity.

“Human-like dexterity requires complex control of a complex robot skeleton. Our goal is to make it easy for the user to control the few things that matter most for specific tasks.”

Dr. Pablo Celnik, principal investigator in the PMR department, said: “The human-machine interaction demonstrated in this project denotes the potential capabilities that can be developed to help people with disabilities.

“Future research will explore the boundaries of these interactions, even beyond basic activities of daily living.”

Although the DARPA program officially ended in August 2020, the Johns Hopkins team continues to collaborate with colleagues at other institutions to explore the potential of this technology.

The next step may be to incorporate previous research, which found that providing sensory stimulation to amputees allowed them not only to perceive a phantom limb but also to operate it using muscle movement signals from the brain, without any visual cue.

This would improve on the current experiment which requires the user to be able to see where the robotic arm is moving at all times.

Dr. Tenore added: “This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex bimanual activity of daily living that non-disabled people take for granted.

“Many challenges still lie ahead, including improved task execution, in terms of both accuracy and timing, and closed-loop control without the constant need for visual feedback.”
