A few weeks ago I was brushing my teeth and trying to remember who made “La Bamba” a big hit back in the late 1950s. I knew the singer had died in a plane crash with Buddy Holly; if I’d been downstairs I would have gone straight to Google. But even if I’d had a spoken-language Internet interface in the bathroom, my mouth was full of toothpaste. I realized that what I really want is an implant in my head, directly coupled into my brain, providing a wireless Internet connection.
In my line of work, an effective brain-computer interface is a perennial vision. But I’m starting to think that by 2020 we might actually have wireless Internet interfaces that ordinary people will feel comfortable having implanted in their heads, just as they are comfortable today going to the mall for laser eye surgery. All the signs point in that direction: early experimental successes, societal demand for improved health care, and military research thrusts.
Remote-controlled rats are perhaps the most stunning evidence of this trend. Last year, John Chapin and his colleagues at the State University of New York’s Downstate Medical Center in Brooklyn reported installing brain implants that stimulate areas of the rat cortex where signals are normally received from the whiskers. Left/right cues from a laptop computer made the rats feel as if their whiskers had brushed against obstacles, prompting them to turn in the appropriate directions. To impel the rats up difficult inclines, a second implant stimulated pleasure centers in their brains.
This experiment built on the 1999 efforts of Chapin and Miguel Nicolelis at Duke University, which enabled rats to make a robot arm release water using nothing but their brain activity. First, a computer recorded the patterns of neural firing in key areas of the rats’ brains when the rodents pressed a lever that controlled the robot arm. Once the computer learned the neural pattern associated with lever-pushing, it moved the robot arm whenever it detected the rats merely “thinking” about doing so. In later versions of this technology, monkeys were able to control a more sophisticated robot arm as though it were their own.
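The decoding loop at the heart of that experiment can be sketched in a few lines of code. What follows is a hypothetical illustration of the general technique, not the researchers’ actual method: synthetic firing rates from simulated neurons stand in for the recordings, a simple classifier learns to separate “rest” from “press” patterns, and a placeholder trigger_robot_arm function stands in for the hardware that releases the water. The neuron counts, rates, and function names are all invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training data: firing rates (spikes per bin) from 16 recorded
# neurons, collected while the rat either rests or presses the lever.
n_neurons, n_trials = 16, 200
rest = rng.poisson(lam=5, size=(n_trials, n_neurons))    # baseline activity
press = rng.poisson(lam=12, size=(n_trials, n_neurons))  # elevated activity during presses

X = np.vstack([rest, press])
y = np.array([0] * n_trials + [1] * n_trials)            # 0 = rest, 1 = press

# "Learn the neural pattern associated with lever-pushing."
decoder = LogisticRegression(max_iter=1000).fit(X, y)

def trigger_robot_arm():
    # Stand-in for the hardware call that actually releases the water.
    print("robot arm: release water")

# Online loop: decode each new bin of firing rates and act on detected intent.
for firing_rates in rng.poisson(lam=12, size=(5, n_neurons)):
    if decoder.predict_proba(firing_rates.reshape(1, -1))[0, 1] > 0.9:
        trigger_robot_arm()
```

The same train-then-detect structure, with a richer decoder, is what let the later monkey experiments drive a multi-jointed arm.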
Machine-neuron connections are working in people, too. Thousands of once-deaf people can understand conversations thanks to cochlear implants. A tiny microphone in the ear picks up sound, and a small package of electronics converts it into direct stimulation of neurons in the cochlea. More recently, there have been reports of human trials in which comparable (though far cruder and earlier-stage) visual implants enabled blind patients to perceive something of their surroundings. And a handful of quadriplegic patients have neural implants that let them control computers by “thinking” about moving particular muscles.
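To make that signal chain concrete, here is a toy sketch of the general principle behind such processing: split the microphone signal into frequency bands, extract each band’s slowly varying envelope, and use those envelopes as drive levels for the corresponding electrodes. The sample rate, band edges, and input signal below are made up for illustration; this is not the processing used by any actual implant.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 16_000                                  # assumed sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)
audio = np.sin(2 * np.pi * 440 * t)          # stand-in for the microphone signal

# Hypothetical electrode map: each channel covers one frequency band of speech.
bands = [(200, 500), (500, 1000), (1000, 2000), (2000, 4000)]

stimulation_levels = []
for low, high in bands:
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    band_signal = filtfilt(b, a, audio)      # isolate this band of the sound
    envelope = np.abs(hilbert(band_signal))  # slow amplitude envelope of the band
    stimulation_levels.append(envelope.mean())  # crude per-electrode drive level

print([round(level, 4) for level in stimulation_levels])
```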
Why am I confident that brain-Internet interfaces will become a reality? Because it’s not really such a vast leap from here to a thought-activated Google search: these human-tested technologies already give us the components that we would need to directly connect the Internet to a person’s brain. And because there are both medical and military pulls on related technologies. On the medical side, besides the urgency of providing physical and mental prostheses to patients with severe injuries, baby boomers are getting older, and their nervous systems are starting to fall apart. There will be increased demand for patching up deteriorating nervous subsystems, and baby boomers have always gotten what they demand. At minimum, this will drive the development of direct visual interfaces that by 2010 will help blind people as much as today’s cochlear implants help deaf people.
And on the military side, direct neural control of complex machines is a long-term goal. The U.S. Defense Advanced Research Projects Agency has a brain-machine interface program aimed at creating next-generation wireless interfaces between neural systems and, initially, prosthetics and other biomedical devices.
Just as the modern laptop was inconceivable when the standard computer interface was the punch card, it’s hard to imagine how a brain-Internet interface will feel. As brain-imaging technologies continue their rapid advance, we will get a better understanding of where in the brain to insert signals so that they will be meaningful, just as the control signals for the rats were inserted into neurons normally triggered by whiskers.
We still need broad advances, of course. We need algorithms that can track the behavior of brain cells as they adapt to the interface, and we’ll need a better understanding of the brain regions that serve as centers of meaning. But we’ll get there. And when we do, we won’t “see” an image similar to today’s Web pages. Rather, the information contained in a Web server will make us feel as though “Ritchie Valens” just popped into our heads.