Brain-computer interface to end the isolation of locked-in patients

Locked-in syndrome is a rare neurological condition. It is often caused by amyotrophic lateral sclerosis (ALS), an incurable degenerative disease of the motor nervous system. Affected individuals risk losing voluntary muscle control completely while consciousness and mental functions remain intact. Those affected can therefore see and hear, but usually only the eyelids remain as a means of communication. Brain-computer interface (BCI) technologies are expected to make communication with locked-in patients decisively easier. They are based on the discovery that merely imagining an action triggers measurable changes in the electrical activity of the brain: imagining moving a hand or a foot, for example, activates the motor cortex.
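
As an illustration of that principle, the minimal sketch below (an assumption for illustration, not part of the project) estimates power in the sensorimotor "mu" band of a single EEG channel; motor imagery typically suppresses this rhythm over the motor cortex, which is one common way non-invasive BCIs detect an imagined movement. The sampling rate, band limits and threshold are hypothetical.

```python
import numpy as np

FS = 250           # assumed EEG sampling rate in Hz
MU_BAND = (8, 13)  # sensorimotor ("mu") rhythm band in Hz

def band_power(segment, fs=FS, band=MU_BAND):
    """Power of one EEG channel within a frequency band (FFT periodogram)."""
    spectrum = np.abs(np.fft.rfft(segment)) ** 2
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def looks_like_motor_imagery(segment, baseline_power, drop_ratio=0.7):
    """Motor imagery typically *reduces* mu power over the motor cortex
    (event-related desynchronization); flag a segment whose mu power
    falls clearly below the resting baseline."""
    return band_power(segment) < drop_ratio * baseline_power
```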

Brain-computer interface technologies are divided into invasive and non-invasive methods. In non-invasive methods, brain activity is measured using electrodes manually attached to the scalp. Measurements are based on electroencephalography (EEG), which has the disadvantage of low signal resolution and limited accuracy.

Invasive brain-computer interface technologies

Invasive methods can compensate for these weaknesses: for electrocorticography (ECoG) measurements, the electrodes are implanted directly on the motor cortex. To date, however, invasive brain-computer interface technologies still lack the desired combination of miniaturization and high spatial resolution, which requires a large number of measurement points in a small space. Moreover, the software cannot yet be operated by patients on their own. “For EEG-based and intracortical systems, calibration must be performed repeatedly to adapt the algorithms to the signals of the current day,” explains Professor Gernot Müller-Putz from the Institute of Neurotechnology at Graz University of Technology, Austria.

He is currently conducting research within the European research consortium INTRECOM, which aims to solve these problems: the implantable technology is intended to decode speech from brain signals in real time. Locked-in patients would thus have, for the first time, a complete and easy-to-use communication system with which they can speak and control a computer cursor.

Decoding of articulator movements for brain-computer interfaces (c) University Medical Center Utrecht / RIBS

Imagining a movement

The consortium of research and industry partners is led by Professor Nick Ramsey of the Dutch University Medical Center Utrecht (UMC Utrecht). He has already shown in preliminary work that an attempted hand movement can be detected and used as a mouse click. “It works similarly to assistive technology in which individual letters are scanned and the patient can select and click on letters,” says Professor Müller-Putz.
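
To make the scanning idea concrete, here is a minimal sketch (hypothetical code, not the project's software) of a sequential letter speller in which a single decoded “click” – such as a detected attempted hand movement – selects the currently highlighted letter. The function detect_click stands in for whatever the BCI decoder provides.

```python
import string
import time

def scan_speller(detect_click, letters=string.ascii_uppercase + " ", dwell=1.0):
    """Sequentially highlight letters; the first detected 'click'
    (e.g. a decoded attempted hand movement) selects the current letter."""
    while True:
        for letter in letters:
            print(f"highlighting: {letter}")
            time.sleep(dwell)      # give the user time to react
            if detect_click():     # decoder reports: movement attempt detected
                return letter
```

In a real system, the dwell time and the click detector's false-positive rate together determine how quickly and how reliably letters can be spelled.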

He himself has just completed the EU project Feel Your Reach, in which he was able to reconstruct, with a certain probability, the trajectories of imagined arm movements from EEG signals. This technology will now be refined further in the current project. At Graz University of Technology, Austria, the focus so far has been on non-invasive brain-computer interface technologies. Together with Professor Ramsey, Müller-Putz is now working for the first time with electrocorticography (ECoG) measurements, in which the material carrying the electrodes – the so-called grid – sits directly on the motor cortex.
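
Such trajectory decoding can be pictured as a regression problem: a linear mapping is learned from EEG-derived features to hand or cursor positions. The sketch below is a generic least-squares version with made-up dimensions and random placeholder data, not the Feel Your Reach pipeline.

```python
import numpy as np

# Hypothetical training data: each row is a vector of EEG features
# (e.g. low-frequency amplitudes across channels) at one time step,
# and the target is the simultaneous 2-D hand/cursor position.
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 64))          # 5000 time steps x 64 EEG features
true_W = rng.standard_normal((64, 2))
Y = X @ true_W + 0.1 * rng.standard_normal((5000, 2))   # positions (x, y)

# Fit a linear decoder W so that X @ W approximates Y (least squares).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# At run time, new EEG features are mapped to an estimated trajectory point.
x_new = rng.standard_normal((1, 64))
predicted_position = x_new @ W
```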

Two research approaches

To advance the research safely, the partners are pursuing two approaches: Ramsey's team wants to generate speech from attempted speech, meaning that the researchers decode the person's attempt to produce the individual sounds of a spoken word. In this way, what the person is trying to say can be read from the brain signals in real time.

The Müller-Putz team focuses on any additional form of communication that can be expressed through cursor control, from simply selecting icons on the screen to free cursor movements and selections that the patient controls.

The hardware for the brain-computer interface consists of a grid of electrodes – the so-called array – and a biosignal amplifier. While the electrode array is placed over the motor areas, the biosignal amplifier is implanted in the skull bone. The latter has the task of processing the data and transmitting it wirelessly to external computers for analysis and decoding.
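
Conceptually, the implant side produces a continuous stream of digitized ECoG blocks that the external computer consumes and decodes. The following sketch is purely illustrative; the channel count, sampling rate and block size are assumptions, not the project's design.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class EcogPacket:
    """One block of ECoG samples as it might leave the implanted amplifier."""
    timestamp_ms: int
    samples: np.ndarray   # shape: (n_channels, n_samples_per_block)

def stream_blocks(n_channels=64, fs=1000, block_ms=20):
    """Simulate the wireless stream: fixed-size blocks of amplified,
    digitized ECoG that an external computer would decode."""
    n = fs * block_ms // 1000
    t = 0
    while True:
        yield EcogPacket(timestamp_ms=t, samples=np.random.randn(n_channels, n))
        t += block_ms
```

On the receiving side, a decoder would simply iterate over this stream and process each packet as it arrives.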

Miniaturization vs. high resolution

Among the technical challenges is the aforementioned miniaturization, a prerequisite for implantation. At the same time, recording brain signals requires high spatial resolution, that is, a very large number of measuring points relative to the size of the grid: the smaller the array, the more densely the electrodes must be packed. The temporal resolution is in the millisecond range. High spatial and high temporal resolution are fundamental to decoding speech in real time.
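
A back-of-envelope calculation shows why dense grids and millisecond sampling put pressure on the wireless link; the numbers below are illustrative assumptions, not project specifications.

```python
# Illustrative figures only – not INTRECOM specifications.
n_channels = 128        # densely packed electrodes on a small grid
sampling_rate = 1000    # samples/s, i.e. millisecond-range temporal resolution
bits_per_sample = 16    # assumed ADC resolution

raw_bitrate = n_channels * sampling_rate * bits_per_sample   # bits per second
print(f"raw data rate: {raw_bitrate / 1e6:.1f} Mbit/s")      # -> about 2.0 Mbit/s
```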

To convert brain signals into speech, algorithms extract parameters from the measurement data. These parameters describe whether the person is attempting to produce sounds with the mouth or to move the cursor with the hand. Ultimately, the system still needs to be integrated into software that works in a home setting without technical experts on hand. To this end, the system must be easy to use and robust while taking advantage of the latest AI and self-learning technologies.
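
One way to picture this parameter-extraction step: features computed from each short window of ECoG are fed to a classifier that decides which intent they express, so the window can be routed to the speech decoder or the cursor decoder. The sketch below uses random placeholder data and scikit-learn's logistic regression purely for illustration; it is not the consortium's algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature matrix: each row holds band-power parameters
# extracted from one short window of ECoG, labelled with the intent
# recorded during calibration (0 = attempted speech, 1 = cursor control).
rng = np.random.default_rng(1)
X_calib = rng.standard_normal((400, 32))
y_calib = rng.integers(0, 2, size=400)

clf = LogisticRegression(max_iter=1000).fit(X_calib, y_calib)

# Online use: classify the newest window and route it to the speech
# decoder or the cursor decoder accordingly.
new_window = rng.standard_normal((1, 32))
intent = clf.predict(new_window)[0]
```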

Industrial partners

Two industrial partners of the consortium are responsible for the design of the equipment: the Swiss Wyss Center for Bio- and Neuroengineering is designing the biosignal amplifier, and the German medical device manufacturer CorTec will develop parts of the implantable electronics that record the brain signals: custom high-resolution ECoG electrodes with high-channel-count wiring.

“The individual components already exist in different designs. We will now refine them and put different things together for the first time so that we can implement them appropriately. That’s the exciting part,” says Müller-Putz. The brain-computer interface will be tested on two people with locked-in syndrome in Utrecht and Graz.

About the INTRECOM project

The project is expected to start in the fall. Professor Müller-Putz is currently working on the preparations and is still looking for interested postdocs and doctoral students for the team at the Institute of Neurotechnology at Graz University of Technology, Austria.

Intracranial Neuro Telemetry to REstore COMmunication (INTRECOM) was selected by the European Innovation Council (Pathfinder programme) and is funded by the EU with nearly four million euros. The project will run from fall 2022 to fall 2026.