
When a cyber-warrior defends a complex computer network, or a pilot commands a team of unmanned vehicles, or a submarine officer interacts with intricate sensor systems, they are often limited by conventional interfaces: their fingers, eyes and ears. And while commercial technology has opened new possibilities for controlling complex systems — touch screens, mixed reality and voice control — taking full advantage of these increasingly fast, smart and sophisticated systems will require direct interface with the brain.

With a recent award from the Defense Advanced Research Projects Agency (DARPA), researchers from the Johns Hopkins University Applied Physics Laboratory (APL) will develop a brain-machine interface (BMI) that will enable the control of complex systems at the speed of thought.

Today’s most promising BMI systems rely on microelectrodes that are surgically implanted into the brain. These systems closely monitor single neurons and neural populations, extract features from neural signals that reflect the user’s intent, translate these features into actions — such as typing on a virtual keyboard, controlling a prosthetic limb, or piloting a simulated aircraft — and send feedback to the brain through stimulation, as explained by APL research scientists at the DARPA Artificial Intelligence Colloquium in March.
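The decode loop described above — monitor neural signals, extract intent features, translate them into an action, and return feedback — can be sketched in miniature. This is an illustrative toy only, with hypothetical names and a crude amplitude feature; real implanted BMIs use spike sorting and far more sophisticated decoders (Kalman filters, neural networks) than anything shown here.

```python
# Toy closed-loop BMI decode cycle. All names and thresholds are
# illustrative assumptions, not APL's or DARPA's actual design.

def extract_features(samples):
    """Toy 'feature': mean absolute amplitude of a neural signal window."""
    return sum(abs(s) for s in samples) / len(samples)

def translate_to_action(feature, threshold=0.5):
    """Map the decoded feature to a discrete command (e.g., move vs. rest)."""
    return "move" if feature > threshold else "rest"

def feedback_stimulus(action):
    """Stand-in for stimulation-based sensory feedback to the user."""
    return {"move": 1.0, "rest": 0.0}[action]

# One pass through the loop on a synthetic high-activity window:
window = [0.9, -0.8, 0.7, -0.95]          # pretend neural samples
feature = extract_features(window)        # 0.8375
action = translate_to_action(feature)     # "move"
stimulus = feedback_stimulus(action)      # 1.0
```

In a real system this cycle runs continuously at high rates, with the feedback stage closing the loop through electrical stimulation rather than a returned number.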

“Implanted BMIs are beginning to prove useful for clinical populations — where the risks of brain surgery may be justified — to improve mood, memory, pain, communication and mobility. But only nonsurgical approaches can scale to the wider population,” said APL Program Manager Michael Wolmetz, who has a doctorate in cognitive science. “Fundamental limitations in the physics and neuroscience behind today’s best nonsurgical methods tell us that major advances are needed to approach the spatial resolution, temporal resolution and signal quality of invasive BMIs in a noninvasive, portable system.”

To make those major advances, APL researchers have embarked on a four-year project as part of DARPA’s Next-Generation Nonsurgical Neurotechnology (N3) program, with the ambitious goal of developing a completely noninvasive, bidirectional neural interface. The APL team is using coherent optical approaches to measure changes in the properties of light scattered from tissue during neural activity. This effort is based on one of the Laboratory’s recent discoveries: a breakthrough in coherent imaging — the ability to detect both the magnitude and phase of the scattered light — that makes it possible to noninvasively record optical signatures of neural activity at high spatial and temporal resolutions approaching those achieved with invasive techniques.
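The distinction between coherent and intensity-only detection can be illustrated with complex numbers, where a field's magnitude is its amplitude and its angle is the phase accumulated along the optical path. The sketch below uses hypothetical values purely to show why phase survives coherent detection but not intensity detection; it is not a model of APL's actual measurement.

```python
import cmath

# Model the scattered optical field as a complex number:
# magnitude = amplitude, angle = phase accumulated along the optical path.
field = 0.6 * cmath.exp(1j * 0.25)    # amplitude 0.6, phase 0.25 rad

# Intensity-only detection records |E|^2 and discards the phase entirely:
intensity = abs(field) ** 2           # 0.36; the 0.25 rad is unrecoverable

# Coherent detection interferes the field with a known reference beam; the
# complex product retains the phase, which demodulation then extracts:
reference = cmath.exp(1j * 0.0)       # unit-amplitude, zero-phase reference
demodulated = field * reference.conjugate()
amplitude = abs(demodulated)          # 0.6
phase = cmath.phase(demodulated)      # 0.25 rad

# A tiny tissue displacement during neural activity appears as a phase shift:
displaced = field * cmath.exp(1j * 0.01)
delta_phase = cmath.phase(displaced) - phase   # ~0.01 rad
```

Because phase is sensitive to sub-wavelength path-length changes, this is why detecting it — rather than intensity alone — allows very small tissue motions to register in the signal.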

“Although the underlying mechanism of this fast optical signal is still being explored, it is attributed to the motion of neural tissue that occurs during the firing of neurons,” explained Principal Investigator David Blodgett, the chief scientist in APL’s Research and Exploratory Development Department. “Because this is an intrinsic property of the tissue, the time latency between the onset of neural activity and optical detection is comparable to that for invasive recordings.”

APL scientists have been exploring this approach over the last four years and see a path to a wearable BMI device. “Although noninvasive techniques may never achieve single neuron detection of neural activity, they can provide full brain coverage that may open up even more capabilities,” Blodgett said.

The Laboratory will demonstrate the efficacy of a noninvasive BMI through a series of increasingly complex demonstrations: first human control of a prosthetic limb, then control of an unmanned aerial vehicle swarm, and finally use in a real-world cyber defense scenario.
