
A screenshot from a video demonstrating the EEG-based BCI in use controlling a robotic hand. [Image courtesy of Carnegie Mellon University]
Professor Bin He is an experienced investigator of non-invasive BCI technology, particularly systems based on electroencephalography (EEG). These surgery-free solutions offer adaptability across a range of environments. Previous milestones from He's group include flying a drone, controlling a robotic arm, and controlling a robotic hand with BCI.
A new study in Nature Communications describes work from He's lab using a non-invasive BCI to deliver real-time decoding of individual finger movement intentions and finger-level control of a dexterous robotic hand.
“Improving hand function is a top priority for both impaired and able-bodied individuals, as even small gains can meaningfully enhance ability and quality of life,” said He, a professor of biomedical engineering at Carnegie Mellon University. “However, real-time decoding of dexterous individual finger movements using noninvasive brain signals has remained an elusive goal, largely due to the limited spatial resolution of EEG.”
According to Carnegie Mellon, the work marks a first-of-its-kind achievement for EEG-based BCI. He's group employed a real-time, non-invasive robotic control system based on movement execution and motor imagery of finger movements, which drove corresponding robotic finger motions.
Using thought alone, human subjects performed two- and three-finger control tasks. The lab hopes to extend this to more refined finger-level tasks, such as typing.
“The insights gained from this study hold immense potential to elevate the clinical relevance of noninvasive BCIs and enable applications across a broader population,” He said. “Our study highlights the transformative potential of EEG-based BCIs and their application beyond basic communication to intricate motor control.”