‘Neurograins’ Could Be the Next Brain-Computer Interfaces

For people with brain and spinal injuries, these systems could eventually restore communication and movement, allowing them to live more independently. But currently, they’re not all that practical. Most require clunky set-ups and can’t be used outside of a research lab. People outfitted with brain implants are also limited in the types of actions they can perform because of the relatively small number of neurons the implants can record from at once. The most commonly used brain chip, the Utah array, is a bed of 100 silicon needles, each with an electrode at the tip that sticks into the brain tissue. One of these arrays is about the size of Abraham Lincoln’s face on a US penny and can record activity from a few hundred surrounding neurons.

But many of the brain functions that researchers are interested in—like memory, language, and decision making—involve networks of neurons that are widely distributed throughout the brain. “To understand how these functions really work, you need to study them at the systems level,” says Chantel Prat, an associate professor of psychology at the University of Washington who is not involved in the neurograins project. Her work involves non-invasive brain-computer interfaces that are worn on the head rather than implanted.

The ability to record from many more neurons could enable much finer motor control and expand what’s currently possible with brain-controlled devices. Researchers could also use neurograins in animals to learn how different brain regions speak to each other. “When it comes to how brains work, the whole really is more important than the sum of the parts,” she says.

Florian Solzbacher, co-founder and president of Blackrock Neurotech, the company that manufactures the Utah array, says a distributed neural implant system might not be necessary for many near-term uses, like enabling basic motor functions or the use of a computer. However, more futuristic applications, like restoring memory or cognition, would almost certainly require a more complicated set-up. “Obviously, the Holy Grail would be a technology that could record from as many neurons as possible throughout the entire brain, the surface and the depth,” he says. “Do you need that in its entire complexity right now? Probably not. But in terms of understanding the brain and looking at future applications, the more information we have, the better.”

Smaller sensors could also mean less damage to the brain, he continues. Current arrays, though already tiny, can cause inflammation and scarring around the implant site. “Typically, the smaller you make something, the less likely it is to be detected by the immune system as a foreign object,” says Solzbacher, who wasn’t involved in the Brown study. When the body detects a foreign object like a splinter, it tries either to dissolve and destroy it or to encapsulate it with scar tissue.

But while smaller may be better, it isn’t necessarily foolproof, Solzbacher cautions. Even minuscule implants could trigger an immune response, so the neurograins will also need to be made of biocompatible materials. A major hurdle in developing brain implants has been minimizing harm while building an implant that lasts, to avoid the risk of replacement surgeries. Current arrays last around six years, but many stop working much sooner because of scar tissue.

If neurograins are the answer, there’s still the question of how to get them into the brain. In their rodent experiment, the Brown researchers removed a large portion of the rat’s skull, which, for obvious reasons, wouldn’t be ideal in humans. Current implanted arrays require drilling a hole into a patient’s head, but the Brown team wants to avoid invasive brain surgery entirely. To do that, they’re developing an insertion technique that uses thin needles, threaded into the skull with a special device, to deliver the neurograins. (Neuralink is pursuing a similar “sewing machine”-like robot for delivering its coin-shaped brain implant.)
