Novel electrocorticography decoder for brain-machine interfaces

Scientists have recently developed a mind-reading system to decode neural signals from the brain during arm movement. An individual can use the method to control a robotic arm via a brain-machine interface (BMI).

Experimental paradigm. Subjects were instructed to perform reaching and grasping movements to designated target locations in three-dimensional space. (a) Subjects A and B received the visual cue in the form of a real tennis ball placed at one of four pseudo-random locations. (b) Subjects C and D received the visual cue in the form of a virtual reality clip showing a five-step sequence of a reaching and grasping motion. Image credit: Korea Advanced Institute of Science and Technology.

The study was published in the journal Applied Soft Computing.

A BMI is a device that converts neural signals into commands to control a machine, such as a robotic limb or a computer. Electroencephalography (EEG) and electrocorticography (ECoG) are the two main methods for monitoring neural signals in BMIs.

EEG records signals from electrodes on the surface of the scalp and is widely used because it is inexpensive, non-invasive, safe, and simple to use. However, because EEG has low spatial resolution and picks up irrelevant neural signals, it is difficult to infer a person’s intentions from it.

ECoG, on the other hand, is an invasive method that involves placing electrodes directly on the surface of the cerebral cortex, beneath the scalp. Compared to EEG, ECoG has much higher spatial resolution and less background noise when monitoring neural signals. However, the method has drawbacks of its own.

“ECoG is primarily used to find potential sources of epileptic seizures, which means that electrodes are placed in different locations for different patients and may not be in the optimal regions of the brain to detect sensory signals and movement. This inconsistency makes it difficult to decode brain signals to predict movement.”

Jaeseung Jeong, Professor and Brain Specialist, Korea Advanced Institute of Science and Technology

To address these issues, Professor Jeong’s team developed a new technique for decoding ECoG neural signals during arm movement. The decoder is built on an “echo state network”, a machine learning system for analyzing and predicting neural signals, combined with the Gaussian distribution, a mathematical probability model.
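To make the approach concrete, below is a minimal sketch of an echo state network with a class-conditional Gaussian readout, written in Python on synthetic data. The channel count, reservoir size, scaling factors, and the diagonal-covariance readout are illustrative assumptions for this sketch, not parameters or details taken from the published decoder.

```python
# Minimal echo state network (ESN) with a Gaussian readout, run on synthetic data.
# All sizes and constants below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES = 8, 200  # assumed number of input (ECoG) channels, reservoir units

# Random input and recurrent weights; rescale the recurrent weights so the
# spectral radius is below 1 (the usual "echo state" condition).
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_state(signal, leak=0.3):
    """Run one trial (time x channels) through the reservoir and return the final state."""
    x = np.zeros(N_RES)
    for u in signal:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
    return x

# Synthetic trials: 24 movement directions, a few trials per direction.
n_classes, trials_per_class, T = 24, 20, 50
X, y = [], []
for c in range(n_classes):
    for _ in range(trials_per_class):
        trial = rng.normal(0, 1, (T, N_IN)) + 0.3 * np.sin(np.linspace(0, c, T))[:, None]
        X.append(reservoir_state(trial))
        y.append(c)
X, y = np.array(X), np.array(y)

# Gaussian readout: fit a class-conditional Gaussian (diagonal covariance) to the
# reservoir states of each direction and classify by maximum log-likelihood.
means = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
stds = np.array([X[y == c].std(axis=0) + 1e-6 for c in range(n_classes)])

def predict(states):
    # Log-likelihood up to a constant factor; argmax over classes is unchanged.
    log_lik = -(((states[:, None, :] - means) / stds) ** 2 + 2 * np.log(stds)).sum(-1)
    return log_lik.argmax(axis=1)

acc = (predict(X) == y).mean()
print(f"training accuracy: {acc:.2f} (chance = {1 / n_classes:.3f})")
```

The reservoir acts as a fixed, randomly connected recurrent network that turns each time-varying trial into a single state vector; only the simple Gaussian readout is fitted to the data, which keeps training cheap.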

In the study, the scientists recorded ECoG signals from four epilepsy patients as they performed a reach-and-grasp task. Because the electrodes were positioned according to the likely sources of each patient’s seizures, only 22% to 44% of them were placed in regions of the brain that control movement.

Participants received visual cues during the movement task, either from a real tennis ball placed in front of them or from a virtual reality headset displaying a clip of a human arm reaching forward, shown from a first-person perspective.

While wearing motion sensors on their wrists and fingers, participants had to reach forward, grasp an object, then return their hand and release the object. In a second task, they were asked to imagine reaching forward without moving their arms.

The researchers analyzed signals from the ECoG electrodes during real and imagined arm movements to see whether the new system could predict the direction of movement from the neural signals.

The researchers found that in both the real and virtual reality tasks, the new decoder effectively classified arm movements in 24 directions in three-dimensional space, with results at least five times more accurate than chance. They also demonstrated that the new ECoG decoder could control the movements of a robotic arm in a computer simulation.
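For context on the chance comparison: with 24 possible directions, chance-level accuracy is 1/24, about 4.2%, so performance at least five times more accurate than chance corresponds to roughly 21% classification accuracy or better. A small sketch of that arithmetic (the figures below follow from the article's framing, not from the study's reported accuracies):

```python
# Chance level for classifying 24 movement directions, and the accuracy implied by
# "at least five times more accurate than chance". Illustrative arithmetic only.
n_directions = 24
chance = 1 / n_directions            # ~0.042, i.e. about 4.2%
five_times_chance = 5 * chance       # ~0.208, i.e. about 21%
print(f"chance: {chance:.1%}, five times chance: {five_times_chance:.1%}")
```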

The results suggest that the new machine learning-based BMI system successfully predicted the direction of intended movements from ECoG signals. The researchers intend to improve the efficiency and accuracy of the decoder so that, in the future, it could be used in a real-time BMI device to help people with motor or sensory impairments.

The current study was supported by the 2021 KAIST Global Singularity Research Program, the Brain Research Program of the National Research Foundation of Korea funded by the Ministry of Science, ICT and Future Planning, and the Basic Science Research Program through the National Research Foundation of Korea funded by the Ministry of Education.

Journal reference:

Kim, H.-H. & Jeong, J. (2022) An electrocorticographic arm motion decoder for brain-machine interface using an echo state network and Gaussian readout. Applied Soft Computing. doi.org/10.1016/j.asoc.2021.108393.

Source: https://www.kaist.ac.kr/en/