BCI technology has become one of the hottest areas of medtech. Companies are developing a multitude of methods and systems that would allow patients to control a computer with their brains. Such technology could enable immobile people to control a mouse cursor, keyboard, mobile device/tablet, wheelchair or prosthetic device by only thinking.
“My goal is to give abilities back to those that have lost them, and eventually, to improve how all of us interact with technology and each other — the ultimate human-machine interface,” Norman told Medical Design & Outsourcing. “And what’s more human than our brain, the organ that contains our every memory, thought and intention?”
Norman’s background includes designing haptic interfaces for teleoperated robotics and exoskeleton robotics for assisting in motor recovery following neurological injuries like stroke. During his Ph.D. studies, he used BCIs to teach patients to create brain states through a process called neurofeedback. He said the method produced strong results but is limited in terms of efficacy and cost/availability.
However, his experiences with BCIs led him to seek ways to build better ones. That includes improvements to hardware, software and protocols, he said. Norman spent the past five years developing ultrasound-based hardware for reading brain signals. This method is now moving into human use, he said.
“At AE studio, we are pushing the boundaries of BCI software,” said Norman. “We develop open-source and free tools to lower the barriers of entry to contribute to neurotechnology and maximally accelerate researchers from all backgrounds. We also work with industry leaders in BCI hardware manufacturing to unlock every bit of performance we can.”
What is BCI, and why is it gaining popularity?
Norman considers BCIs to be “a pretty simple concept at their core.”
They require a method of sensing the biophysical effects that occur when brain activity changes. Using an electrode, for example, you can sense the voltage potential created by neurons as they “spike.” Behavior, or attempted behavior such as the movement of a person’s hand, must also be measured. Machine learning methods then find the correlation between brain patterns and behavior.
“Once this ‘decoder’ is good enough, we can measure the brain state to infer the intended behavior without measuring behavior directly,” said Norman. “We then turn that intended behavior into commands. … Over time, the user learns and adapts to control the BCI more directly.”
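The loop Norman describes is, at its simplest, a supervised-learning problem: record brain activity and behavior together, fit a decoder, then infer intent from brain activity alone. The toy sketch below illustrates the idea with a single simulated electrode channel and a one-weight linear decoder; the channel model, gain and noise values are illustrative assumptions, not any company’s actual method.

```python
import random

random.seed(0)

# Toy setup: one electrode channel whose signal rises linearly with the
# intended hand velocity (the hidden relationship the BCI must learn).
def record_channel(velocity):
    return 3.0 * velocity + random.gauss(0, 0.05)  # hidden gain + noise

# Calibration phase: measure behavior and brain activity together.
velocities = [v / 10 for v in range(-10, 11)]
signals = [record_channel(v) for v in velocities]

# "Machine learning finds the correlation": here, ordinary least squares
# for a single weight mapping signal -> velocity.
num = sum(s * v for s, v in zip(signals, velocities))
den = sum(s * s for s in signals)
decoder_weight = num / den

# Decode: infer intended velocity from the brain signal alone.
decoded = decoder_weight * record_channel(0.7)
print(round(decoded, 2))  # close to the intended 0.7
```

Real decoders work with hundreds of channels and nonlinear models, but the structure is the same: calibrate against measured behavior, then drop the behavioral measurement once the decoder is good enough.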
BCIs represent “the final frontier” in human-machine interaction, Norman said. With a significant portion of the world already spending large chunks of their lives connected to the internet, it’s now a matter of “waking up to the possibilities.” Advances in machine learning, cloud architecture and edge computing power mean BCIs are possible. They’re no longer science fiction — they’re just science.
“Much like the internet, they have the potential to completely change the way we live, work and interact,” said Norman. “And, yet, we’re just scratching the surface of what’s possible today. That’s exciting.”
Who are the players in the space?
Some big hitters have put their resources behind developing BCI technology, including Elon Musk’s Neuralink and Meta’s Reality Labs. However, a number of companies without the backing of a billionaire like Musk or Zuckerberg have paved the way themselves.
Testing of Blackrock Neurotech’s technology has been ongoing in human patients for nearly 20 years. In 2021, Blackrock received FDA breakthrough device designation for its MoveAgain BCI system. It provides immobile patients with the ability to control a range of devices by only thinking. Blackrock is one example of companies collaborating with AE to advance BCI technology.
Meanwhile, Synchron develops the catheter-delivered Stentrode brain-computer interface (BCI) implant. The company believes it’s the only BCI company tapping into blood vessels to capture signals from the brain. Synchron has trials for its technology ongoing at multiple locations, with human implants already performed.
Neuralink and Reality Labs “capture headlines, sure, but groups like Paradromics, Synchron and Blackrock Neurotech are arguably better poised to capture the first generation of BCI users,” said Norman. “While Neuralink is still pushing for their first in-human tests, Blackrock devices have been implanted in nearly 40 humans already, and Synchron is in the thick of their first clinical trial.”
Still, Norman says the first generation of BCI users will be restricted to those with severe forms of neurological injury and disease. That’s because BCI bandwidth is still too low and costs are too high, he said. However, that’s changing.
“The real winners will be those that create the BCI hardware and software that justifies use by people with mild to moderate neurological injury and disease and/or psychiatric and cognitive disorders, and eventually, enables use for all people,” said Norman. “Most groups making strides in building the next generation of BCI are not making headlines — yet.”
What is AE Studio, and how is it contributing to BCI development?
Founded in 2016, AE aims to use technology to increase human agency, including by developing a BCI operating system built for that purpose. Norman said the company wants to avoid a future in which BCI’s “crowning achievement” is increased consumer spending.
AE helps startups and enterprise clients develop their software with data science and software development consulting. Some revenues are funneled toward developing BCI software, Norman said.
For BCI software, AE builds models that Norman said are “robust, more efficiently calibrated, more easily scaled and more easily deployed.” Data in the BCI field are complex, he said. Experts estimate that approximately 86 billion neurons exist in the brain, with each neuron connected to around 1,000 nearby neurons.
Today’s BCIs interface with “just” hundreds of thousands of neurons, Norman said. However, those neurons can drift in and out of view of an electrode, forcing the BCI out of calibration. He explained that most BCI users must recalibrate every few hours. One of AE’s projects stabilizes decoders over long periods of time, reducing time spent on calibration, a tedious exercise of performing various instructed behaviors.
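Why does drift force recalibration? If a neuron moves partly out of an electrode’s view, the channel’s effective gain drops, and a decoder fitted at calibration time starts underestimating intent. The hypothetical sketch below shows this effect with a single channel and fixed decoder weight; the specific gain values are made up for illustration and are unrelated to AE’s actual stabilization method.

```python
import random

random.seed(1)

GAIN_AT_CALIBRATION = 3.0  # signal strength when the decoder was fitted

def record(velocity, gain):
    return gain * velocity + random.gauss(0, 0.02)

# Decoder calibrated while the electrode saw the neuron at full strength.
decoder_weight = 1.0 / GAIN_AT_CALIBRATION

# As the neuron drifts out of view, the same intent decodes as weaker output.
for gain in (3.0, 2.4, 1.5):
    decoded = decoder_weight * record(1.0, gain)
    print(f"gain={gain}: decoded ~ {decoded:.2f} (intent was 1.00)")
```

With the gain halved, a fixed decoder reads a full-strength intent as roughly half-strength, which in practice means a sluggish, inaccurate cursor until the user recalibrates or the decoder adapts.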
Norman said the true potential of a platform technology lies in its users’ data. That’s why one of AE’s areas of focus is protecting user privacy.
“At AE, we are building tools today, while user numbers are small, to protect all users as their numbers grow,” said Norman. “As one example, we are currently building a tool that allows us to train sophisticated machine-learning models that benefit from many users while never requiring the data to leave the user’s device. When privacy enables performance, everybody wins.”
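The approach Norman describes resembles federated learning: each device fits a model update on its own data and shares only that update, which a server then aggregates. The minimal sketch below illustrates the pattern with a one-parameter model and simple averaging; the setup and numbers are illustrative assumptions, not AE’s actual tool.

```python
import random

random.seed(2)

# Each user's raw recordings stay "on-device"; only the locally fitted
# model parameter is shared with the server.
def local_fit(xs, ys):
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs)
    return num / den  # least-squares weight fit on private data

TRUE_WEIGHT = 2.0  # hidden relationship shared across users
local_weights = []
for _ in range(5):  # five simulated users
    xs = [random.uniform(-1, 1) for _ in range(50)]
    ys = [TRUE_WEIGHT * x + random.gauss(0, 0.1) for x in xs]
    local_weights.append(local_fit(xs, ys))  # raw xs/ys never leave the device

# The server averages the updates without ever seeing the raw data.
global_weight = sum(local_weights) / len(local_weights)
print(round(global_weight, 2))  # near the true value of 2.0
```

The aggregated model benefits from every user’s data, yet no recording ever leaves a device, which is the sense in which privacy and performance can reinforce each other.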
The future of BCIs
Norman said the first generation of commercial BCIs will focus on motor intention. An example of a first step would be replacing a computer mouse with a BCI-controlled cursor. Still, questions remain about whether BCI control could replace something like a keyboard, given the difficulty of matching, let alone exceeding, the speed of ten typing fingers.
“Silent speech” is the holy grail in BCI, Norman said, because speech remains the fastest method of communication between humans. There are still plenty of hurdles to clear before silent speech can be integrated into a commercial BCI.
Another class of BCIs relies on the ability to “write” information into the brain instead of reading it out. Cochlear implants for the deaf already do this, and Norman called them “likely the most impactful BCI modality” to date. Restoring vision to the blind offers another space into which BCI could move.
Norman explained that, as BCI hardware scales to interact with more of the brain, so too will the application space expand.
“The coming generation of BCIs will be all-purpose, vastly improving their value proposition,” said Norman. “And if enough people have BCIs, we can test many more approaches that we’ve yet to even imagine.”