Brain-computer interfaces merge humans with AI
Artificial intelligence is already an integral part of everyday life. Order an Uber to take you to the airport, and you’ve used AI to summon the car that gets you to your flight. Maybe you checked traffic beforehand to see how soon you needed to leave; that’s another use of AI. Ever searched for something on the internet and watched the search engine guess what you were about to type? AI again. There’s really no escaping it, and as we become increasingly connected, brain-computer interfaces (BCIs) are the logical next step in technological innovation.
As Elon Musk noted recently, “How much smarter are you with a phone or computer or without? You’re vastly smarter, actually. You can answer any question pretty much instantly. You can remember flawlessly. Your phone can remember videos, pictures perfectly. Your phone is already an extension of you. You’re already a cyborg.” The only thing missing is that your brain is not directly hooked up to the phone. But that might not be the case for very long.
Keeping Up with AI
Musk’s company Neuralink is developing high-bandwidth brain-computer interfaces built around “neural lace,” an ultra-thin mesh studded with electrodes that would be implanted in the brain to monitor its activity. Musk said in September on Joe Rogan’s podcast that neural lace “will enable anyone to have superhuman cognition.” He worries that if people don’t embrace brain-computer interfaces, they will be left behind by AI: whereas AI is continuously learning and can get smarter with age, human brain function eventually reaches a point where it begins to decline. Neural lace would aim to combat that slowdown.
Palmer Luckey, who developed the earliest versions of the Oculus Rift at 17, has become well-known for his desire to enhance his body with technology. In a recent interview with Wired, Luckey spoke of his transhumanist inclinations, describing an attempt to improve his reflexes by bypassing his nervous system and sending signals from his brain to his extremities electronically.
Similarly, self-described “mindful cyborg” Chris Dancy has taken the use of technology to optimise his life to such lengths that he is known as “the Most Connected Man on Earth.” He uses more than 20,000 sensors that he wears, swallows, and places around his house, measuring everything from his posture, calorie burn, blood pressure, and sleep to the temperature of his home, how he spends his time online, and the brightness and colour of his lights. He believes the data he collects helps him identify the best, most helpful parts of his life and maximise them.
Medical Uses
Brain-computer interfaces already play a key role in medical technology. In early trials, BrainGate has allowed patients suffering from ALS, stroke, and spinal cord injuries to control computer cursors with their thoughts. By simply imagining moving the cursor with their hands while connected to BrainGate’s array of micro-electrodes implanted in the brain, patients with tetraplegia (full or partial loss of use of all four limbs) were able to reach targets on a computer screen after calibration times of less than a minute.
“In the past few years, our team has demonstrated that people with tetraplegia can use the investigational BrainGate BCI to gain multidimensional control of a robotic arm, to point-and-click on a computer screen to type 39-plus correct characters per minute, and even to move their own arm and hand again — all simply by thinking about that movement,” Dr. Leigh Hochberg, director of the BrainGate consortium, told Science Daily.
The rapid calibration shows that computers can read neural signals in real time, which could give patients greater communication, mobility, and independence without a team of doctors and computer scientists on hand to ensure the system is working.
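For a sense of what that decoding involves, here is a rough, hypothetical Python sketch of the general approach such systems take: binned firing rates from implanted electrodes are mapped to cursor velocity by a linear model fitted during a short calibration block. The channel count, bin size, and linear decoder are illustrative assumptions, not BrainGate’s published implementation.

```python
# Illustrative sketch only: not BrainGate's decoder. It assumes the common
# approach of mapping binned neural firing rates to cursor velocity with a
# linear model fitted during a brief calibration session.
import numpy as np

def calibrate(firing_rates, cued_velocities):
    """Fit a linear decoder: velocity ~ firing_rates @ weights.

    firing_rates:    (n_bins, n_channels) spike counts per time bin
    cued_velocities: (n_bins, 2) x/y velocities the patient imagined
    """
    X = np.hstack([firing_rates, np.ones((len(firing_rates), 1))])  # add bias term
    weights, *_ = np.linalg.lstsq(X, cued_velocities, rcond=None)
    return weights

def decode_step(weights, rates_now, position, dt=0.02):
    """Turn one bin of firing rates into a real-time cursor update."""
    x = np.append(rates_now, 1.0)
    velocity = x @ weights
    return position + velocity * dt

# Toy calibration: 30 seconds of 20 ms bins from an assumed 96-channel array.
rng = np.random.default_rng(0)
rates = rng.poisson(5, size=(1500, 96)).astype(float)
cued = rng.normal(size=(1500, 2))
W = calibrate(rates, cued)

pos = np.zeros(2)
pos = decode_step(W, rates[-1], pos)  # one real-time cursor update
```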
BrainPort Technologies began its work with a device to help stroke victims regain their sense of balance. In 2010, it stopped selling its Balance Plus to concentrate on Vision Pro, which uses a video camera attached to a headband to translate digital information into electrical patterns on the tongue. Blind users “see” with their tongues by interpreting bubble-like patterns to determine the size, shape, location, and motion of objects around them. The training course takes 10 hours over a three-day period, after which users can operate independently.
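The underlying idea is simple enough to sketch: shrink each camera frame down to a coarse grid and turn brightness into stimulation strength. The grid size and intensity range in this Python sketch are assumptions for illustration, not BrainPort’s actual specification.

```python
# Hedged sketch of the general idea behind a camera-to-tongue display:
# shrink each video frame to a coarse grid and map pixel brightness to
# stimulation strength. The 20x20 grid and 8-bit range are assumptions.
import numpy as np

GRID = 20  # assumed electrode-grid resolution

def frame_to_stimulation(frame):
    """Convert a grayscale camera frame (H, W) into a GRID x GRID intensity map."""
    h, w = frame.shape
    bh, bw = h // GRID, w // GRID
    # Average brightness over each block of pixels (the "bubble-like" pattern).
    blocks = frame[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
    intensity = blocks.mean(axis=(1, 3))
    # Normalise to the stimulator's assumed 0-255 range.
    lo, hi = intensity.min(), intensity.max()
    return ((intensity - lo) / (hi - lo + 1e-9) * 255).astype(np.uint8)

frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # stand-in frame
pattern = frame_to_stimulation(frame)  # one 20x20 pattern for the tongue array
```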
At the University of Southern California’s Center for Neural Engineering, researchers are working to create a memory prosthesis that could take the place of the hippocampus in damaged brains, such as those of people suffering from Alzheimer’s disease. The prosthesis draws on a patient’s own memory patterns to help the brain encode and recall memories. Studies have shown a 37 per cent increase over baseline in episodic memory, the short-term storage of new information. Episodic memory is often lost in patients with Alzheimer’s, stroke, and head injury.
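Loosely speaking, the approach records a patient’s neural activity while memories form and uses a model of those patterns to guide stimulation later. The toy Python sketch below illustrates that idea with made-up channel counts and a simple linear model; it is a stand-in for the concept, not the USC team’s prosthesis.

```python
# Toy illustration of the "memory pattern" idea, not the USC model: learn how a
# patient's input activity maps to the output activity seen during successful
# encoding, then use that mapping to propose a stimulation pattern.
# Channel counts and the linear model are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, n_trials = 32, 16, 500

# Stand-in recordings made during successful memory encoding.
inputs = rng.binomial(1, 0.3, size=(n_trials, n_in)).astype(float)
outputs = rng.binomial(1, 0.3, size=(n_trials, n_out)).astype(float)

# Fit one linear predictor per output channel (least squares).
X = np.hstack([inputs, np.ones((n_trials, 1))])
W, *_ = np.linalg.lstsq(X, outputs, rcond=None)

def stimulation_pattern(current_input, threshold=0.5):
    """Predict which output channels to stimulate for this input pattern."""
    x = np.append(current_input, 1.0)
    return (x @ W > threshold).astype(int)

pattern = stimulation_pattern(inputs[0])  # candidate pattern for one moment
```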
Military Uses
The US Defense Advanced Research Projects Agency (DARPA) has invested heavily in brain-computer interfaces. One $90 million project aims to develop a “brain chip” that plugs the brain into a computer, allowing for heightened senses. Another focuses on targeted neuroplasticity training, in which neurostimulation would boost synaptic plasticity and help connections in the brain strengthen more quickly. This could increase the speed at which people learn, allowing for a better grasp of foreign languages, marksmanship, cryptography, and intelligence analysis.
Defence contractors Raytheon and Lockheed Martin are developing exoskeletons to give soldiers increased strength and flexibility. In the future, these could be wired with brain-computer interfaces to further enhance human capability. The interfaces could make the OODA loops of military personnel more efficient: in each loop, pilots or soldiers observe their surroundings, orientate themselves by analysing the data they’ve collected, decide on a course of action, and act on that decision. With their senses augmented by computers, they’d be able to arrive at correct decisions more quickly than with brain power alone.
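The OODA loop itself is easy to picture as a control loop, which is where a brain-computer interface would earn its keep: every stage it speeds up shortens the whole cycle. The Python sketch below uses placeholder stages and timings purely for illustration.

```python
# A toy rendering of the OODA cycle as a control loop. The stage functions
# and timings are placeholders, not any real military system.
import time

def observe():        # gather sensor and sensory data
    return {"contacts": 2, "fuel": 0.7}

def orient(data):     # analyse the observations in context
    return "threat" if data["contacts"] > 0 else "clear"

def decide(assessment):
    return "evade" if assessment == "threat" else "continue"

def act(decision):
    print(f"executing: {decision}")

def ooda_loop(cycles=3, cycle_time=0.1):
    """Run observe-orient-decide-act repeatedly; a BCI's promise amounts to
    a shorter cycle_time, because orientation and decision lean on machines."""
    for _ in range(cycles):
        act(decide(orient(observe())))
        time.sleep(cycle_time)

ooda_loop()
```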
As they’re primarily developed for use in the field, such interfaces will need to be much more compact than current models. In this way, whatever technology is developed for military use will improve on existing interfaces, with those improvements making their way back into the private sector. While we might not all turn into RoboCop, we might just find ourselves cosying up to computers way more than we do even now. If you can’t beat ‘em, join ‘em.