Brain-computer interface helps patient with locked-in syndrome communicate

For the first time, a patient in a completely locked-in state due to amyotrophic lateral sclerosis (ALS) was able to communicate in full sentences using a brain-computer interface, according to a new study.

This technology allowed the patient, a 37-year-old man with ALS, to communicate by forming words and sentences, despite having no voluntary muscle control. The system involved implanting a device with microelectrodes into the patient's brain and using custom computer software to help translate his brain signals.

ALS – also known as motor neuron disease or Lou Gehrig's disease – is a rare neurodegenerative disease that affects the neurons responsible for controlling voluntary muscle movement. According to the National Institute of Neurological Disorders and Stroke (NINDS), the disease causes the degeneration and eventual death of these nerve cells, affecting a person's ability to walk, talk, chew and swallow.

As the disease worsens, affected individuals eventually lose the ability to breathe without the aid of a ventilator or other device, and almost all of their muscles become paralyzed. When a person develops paralysis of all their muscles except those that control eye movements, this is called a "locked-in state." To communicate, people in a locked-in state must use augmentative and alternative communication devices.


Many of these devices are controlled by eye movement or by still-functioning facial muscles. (For example, Stephen Hawking used a device that allowed him to communicate by moving his cheek muscle, according to Wired.) But once a person with ALS also loses the ability to move those muscles, they enter a "completely locked-in state" that prevents them from communicating with family, caregivers and the rest of the outside world.

The patient in the new study (known as patient K1) had lost the ability to walk and talk by the end of 2015, according to the study published Tuesday (March 22) in the journal Nature Communications. He began using an eye-tracking-based communication device the following year, but eventually could no longer fix his gaze well enough to use it and was limited to “yes” or “no” communication. Anticipating that he might lose all remaining visual control in the near future and go into a completely locked-in state, he asked his family to help him find another way to communicate with them.

The family of patient K1 contacted two of the study authors, Dr. Niels Birbaumer of the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen in Germany, and Dr. Ujwal Chaudhary of the nonprofit organization ALS Voice in Mössingen, Germany, who helped set up patient K1 with a non-invasive brain-computer interface system that enabled communication using the eye movement he still had. When he finally lost the ability to move his eyes as well, their team implanted the microelectrode device into his brain as part of the brain-computer interface.

The system works by using "auditory neurofeedback," meaning the patient had to "match" the frequency of his brain waves to a certain tone, word or phrase. Matching and holding the frequency at a certain level for 500 milliseconds allowed him to give the system a positive or negative response.
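As a rough illustration of the hold-to-select idea described above, the selection logic can be sketched as a check for a sustained frequency match. This is a hypothetical simplification, not the study's actual signal processing; the sampling rate, tolerance and function names here are assumptions, and only the 500-millisecond hold comes from the article.

```python
# Hypothetical sketch of the hold-to-select logic: the real system's signal
# processing is far more involved. Parameters below are illustrative.

SAMPLE_RATE_HZ = 100   # assumed sampling rate of the feedback signal
HOLD_MS = 500          # hold duration reported in the article
TOLERANCE_HZ = 1.0     # assumed matching tolerance around the target tone

def held_at_target(samples, target_hz):
    """Return True if the measured frequency stayed within tolerance of
    the target for an unbroken 500 ms window anywhere in the stream."""
    needed = SAMPLE_RATE_HZ * HOLD_MS // 1000  # consecutive samples required
    streak = 0
    for freq in samples:
        streak = streak + 1 if abs(freq - target_hz) <= TOLERANCE_HZ else 0
        if streak >= needed:
            return True   # interpreted as a positive response
    return False          # no sustained match: negative response

# 60 in-band samples (600 ms at 100 Hz) would count as a selection;
# an interrupted stream would not.
print(held_at_target([10.2] * 60, target_hz=10.0))        # True
print(held_at_target([10.2, 14.0] * 30, target_hz=10.0))  # False
```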

Because communication with patients in a completely locked-in state had never been achieved, the team did not know whether the system would work for patient K1. In fact, "nobody believed that communication was possible in a completely locked-in state," Birbaumer told Live Science.

Yet about three months after surgery, patient K1 was able to successfully use neurofeedback to control the brain-computer interface. About half a month later, he began selecting letters and spelling out words and phrases, even thanking the authors and spelling out, "Boys, it works so easy."

According to another team member and co-author of the study, Dr. Jonas Zimmermann of the Wyss Center for Bio and Neuroengineering in Geneva, Switzerland, this showed how patient K1 "was able to use motor areas of his brain to communicate, even though he wasn't really able to move at all." Importantly, Chaudhary said the system allowed patient K1 to "provide specific instructions on how they should be cared for," giving him a voice around his needs, wants and well-being.

Although patient K1 was able to use the neurofeedback-based brain-computer interface to communicate with his family, the system is not perfect. It requires constant monitoring; otherwise it can run into technical errors.

Without the study team's supervision, Zimmermann said, "the system could get stuck in a loop (rejecting all options, or always selecting the first letter, or just randomly selecting letters)." The team is currently working on ways to fix this, such as having the system detect these malfunctions and shut down automatically when they occur.

The authors also noted that the patient in this case had been trained with a neurofeedback system before losing full muscle function, so it is unclear how well the brain-computer interface would work if the researchers had started the training when the patient was already in a completely locked-in state.

At the Wyss Center, Zimmermann said researchers are also working on a new, fully implantable system that doesn’t need an external computer to operate, called ABILITY. This system, which is currently undergoing preclinical verification, will help improve usability and make it easier to set up and use the system, he said.

The researchers hope that this technology can one day provide a much better experience for locked-in patients and allow them to have a say in decisions about their care. "However, much more work on the technology needs to be done before it is widely available," Zimmermann said.

Originally posted on Live Science.