Science fiction can sometimes be a good guide to the future. In the film Upgrade (2018) Grey Trace, the main character, is shot in the neck. His wife is shot dead. Trace wakes up to discover that not only has he lost his wife, but he now faces a future as a wheelchair-bound quadriplegic.
He is implanted with a computer chip called Stem designed by famous tech innovator Eron Keen – any similarity with Elon Musk must be coincidental – which will let him walk again. Stem turns out to be an artificial intelligence (AI) and can “talk” to him in a way no one else can hear. It can even take over control of his body. You can guess the rest of the story.
The reality of being a cyborg in 2019 is much less dramatic – but still incredible. In 2012, as part of a research programme led by Jennifer Collinger, a biomedical engineer at the University of Pittsburgh, and funded by the US government’s Defense Advanced Research Projects Agency (Darpa), Jan Scheuermann became one of a tiny handful of people to be implanted with a brain-computer interface. The 53-year-old woman, a quadriplegic due to the effects of a degenerative disorder, has two cables attached to box-like sockets in her head, which connect to what looks like a video game console.
Scheuermann can use this brain-computer interface to control a robotic arm with her thoughts, well enough to feed herself chocolate. Three years later she successfully flew a fighter aircraft in a computer simulator.
Darpa has been funding research into these interfaces since the 1970s, and now wants to go one step closer to the kind of world glimpsed in Upgrade. The goal of the Next-Generation Nonsurgical Neurotechnology (N3) programme launched earlier this year is to remove the need for electrodes, cables and brain surgery.
Al Emondi, who manages the programme, has given scientists from six of the USA’s leading research institutes the task of developing a piece of hardware capable of reading your thoughts from the outside of your head and small enough to be embedded into a baseball cap or headrest. In an approach that has been compared to telepathy – or the creation of “a true brain-computer interface”, according to Emondi – the device has to be bi-directional, able to transmit information back to the brain in a form that the brain will understand.
Emondi has given the scientists only four years to take the new technology from the laboratory to the point where it can be tested on humans. Even Elon Musk's Neuralink, a plan for an Upgrade-style brain-computer interface, still requires risky surgery to embed a chip in the brain, though it does replace cables with a form of wireless communication.
“The ability to really change the world doesn't happen often in a career,” says Emondi. “If we can build a neural interface that’s not invasive, we will have opened up the door to a whole new ecosystem that doesn’t exist right now.”
“The most common applications are to help people who have lost the ability to move their arms and quadriplegics, paraplegics,” says Jacob Robinson, an electrical and computer engineer at Rice University, Houston, Texas, and the principal researcher of one of the teams. “Imagine then, if we can have the same kind of ability to communicate with our machines but without surgery, then we open up this technology to a broad user base, people who are otherwise able-bodied who just want faster ways to communicate with their devices.”
Some other researchers think our fascination with brain-computer interfaces is about something more profound. “The only way that humans have evolved to interact with the world is through our bodies, our muscles and our senses, and we’re pretty good at it,” says Michael Wolmetz, a human and machine intelligence research lead at Johns Hopkins Applied Physics Laboratory in Laurel, Maryland. “But it’s also a fundamental limitation on our ability to interact with the world. And the only way to get outside of that evolutionary constraint is to directly interface with the brain.”
Despite its slightly unnerving strapline of "creating breakthrough technologies and capabilities for national security", Darpa has a history of pioneering technologies that shape the world we civilians live in. The internet, GPS, virtual assistants like Apple's Siri and now AI have all been sped up thanks to the dollars the agency has ploughed into these areas. Its funding of research into brain-computer interfaces suggests they could be a similarly game-changing technology. But it is not alone.
Musk’s Neuralink is just one of a number of projects attracted by the potential of brain-computer interfaces. Major technology firms including Intel are also working in this area.
And there are great rewards for those who manage to crack it – the market in neurological technology is expected to be worth $13.3bn (£10.95bn) in 2022.
Brain-computer interfaces are possible today only because in the 1800s scientists tried to understand the electrical activity that had been discovered in the brains of animals. During the 1920s, Hans Berger developed the electroencephalograph (EEG) to detect and record electrical activity from the surface of the human skull. Fifty years later, computer scientist Jacques Vidal's research at the University of California, Los Angeles (UCLA) led him to coin the term "brain-computer interface".
Scientists then had to wait for advances in computing power, artificial intelligence and nanotechnology before their visions could be realised. In 2004, a patient was implanted with the first advanced brain-computer interface after a stabbing left him paralysed from the neck down. It allowed him to play ping pong on a computer just by thinking about it.
Despite such successes, problems remain. “The quality of the information that you can transmit is limited by the number of channels,” says Robinson. “The interfaces require cutting a hole in the skull to put the electrode directly in contact with the brain. Your device might only operate for a limited amount of time before your body rejects it; or if the devices fail, it’s hard to get them out.”
To achieve the goal of an interface that works without brain surgery, Emondi's teams are exploring combinations of techniques such as ultrasound, magnetic fields, electric fields and light to read our thoughts and/or write back. One problem is how to tell useful neural activity from the cacophony of other noise the brain emits. The device also has to pick up those signals through the skull and the scalp.
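To get a feel for that signal-separation problem, here is a deliberately simplified sketch, not any N3 team's actual method: a synthetic 10 Hz "brain rhythm" is buried in heavy broadband noise, and a crude FFT band-pass filter pulls it back out. Every number in it is an illustrative assumption.

```python
import numpy as np

# Illustrative only: a synthetic 10 Hz "alpha rhythm" buried in noise,
# recovered with a crude FFT band-pass filter. Real neural decoding is
# far harder, but the separation problem is the same in spirit.
rng = np.random.default_rng(0)
fs = 250                     # assumed sampling rate in Hz, typical for EEG
t = np.arange(0, 4, 1 / fs)  # 4 seconds of samples

alpha = np.sin(2 * np.pi * 10 * t)                 # the "useful" signal
noisy = alpha + 2.0 * rng.standard_normal(t.size)  # plus heavy broadband noise

# Keep only the 8-12 Hz band, where the signal of interest lives.
spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum[(freqs < 8) | (freqs > 12)] = 0
recovered = np.fft.irfft(spectrum, n=t.size)

def similarity(a, b):
    """Correlation between two signals: 1.0 means a perfect match."""
    return float(np.corrcoef(a, b)[0, 1])

print(f"raw: {similarity(noisy, alpha):.2f}, "
      f"filtered: {similarity(recovered, alpha):.2f}")
```

Filtering raises the correlation with the hidden rhythm dramatically; the hard part in a real interface is that the interesting activity does not sit conveniently in one known frequency band, and the skull and scalp smear it further.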
"When you consider the problem of imaging through a scattering medium, millimetres in the skull is the equivalent of tens of metres in the ocean and kilometres in the atmosphere in terms of the clutter you have to face," says David Blodgett, principal investigator for the team from the Johns Hopkins University Applied Physics Laboratory.
“But we still believe that we can get very useful information,” says Emondi.
Some teams are looking at what Emondi calls "minutely invasive surgery". "You can still put something in the body, but you can't do it through any surgical means," he says. This means you have to eat something, inject it or squirt it up your nose. One team is looking at nanoparticles that act as "nanotransducers" when they reach their destination in the brain. These are very small particles, the width of a human hair, that can transform external magnetic energy into an electric signal to the brain and vice versa. Another is looking at using viruses to inject DNA into cells to alter them to do a similar job.
If these techniques work, then the performance of a minutely invasive interface should be able to match that of a chip surgically implanted into the body.
Then there is the challenge of getting the information from the device to the computer and delivering a response in a split second.
"If you were using a mouse with a computer, and you click it, and then you have to wait a second for it to do something, that technology would never get off the ground," says Emondi. "So, we've got to do something that's going to be superfast."
The interfaces need to have “high resolution” and enough “bandwidth”, or channels of communication, to fly a real drone rather than move a robotic arm.
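As a rough illustration of why channel count and bandwidth matter, the usable information rate of an interface scales with how many channels it has, how often each can be read, and how precisely. Every figure in this back-of-envelope sketch is an assumption for illustration, not a number from the N3 programme.

```python
# Back-of-envelope sketch: an interface's usable information rate grows
# with its channel count, update rate and per-update precision.
# All numbers below are illustrative assumptions, not N3 specifications.
def bits_per_second(channels, updates_per_sec, bits_per_update):
    return channels * updates_per_sec * bits_per_update

# A robotic-arm cursor task might manage with a modest implant-scale array...
arm = bits_per_second(channels=96, updates_per_sec=10, bits_per_update=1)

# ...while flying a real drone with many degrees of freedom demands far more.
drone = bits_per_second(channels=1000, updates_per_sec=50, bits_per_update=2)

print(arm, drone)  # 960 100000
```

On these assumed figures the drone task needs roughly a hundred times the throughput of the arm task, which is why non-invasive interfaces that read only a few noisy channels have so far been limited to simple control.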
But even if we can do it, how exactly do we communicate? Will we be communicating in words or in pictures? Will we be able to talk with a friend or pay bills online? How much will this be unique to each individual? No one really knows the answers to such questions because the rules haven’t been written yet.
“All new interfaces take some practice to get used to,” says Patrick Ganzer, co-investigator on the project at Battelle. “It’s hard to say how easy this new brain-computer interface will be to use. We don’t want users to have to learn hundreds of rules. One attractive option is to have outputs from the user’s brain-computer interface to communicate with a semi-autonomous device. The user will not need to control every single action but simply set a ‘process in motion’ in the computer system.”
Emondi goes further than this: “As the AI becomes better, the systems we are interoperating with are going to become more autonomous. Depending on the task, we may just have to say, ‘I want that ball’ and the robot goes and gets it itself.”
The film Upgrade may have hinted at a problem, however: who exactly is in control?
But there are some clues. “To date, most brain-computer interfaces have extracted detailed movement or muscle-related information from the brain activity even if the user is thinking more broadly about their goal,” says Jennifer Collinger. “We can detect in the brain activity which direction they want to move an object and when they want to close their hand and the resulting movement is a direct path to the object that enables them to pick it up. The user does not have to think ‘right’, ‘forward’, ‘down’.”
“The amount of mental effort required to operate a BCI varies between participants but has typically been greater for non-invasive interfaces. It remains to be seen whether any technologies that come out of N3 will allow the user to multi-task.”
There is an even more fundamental question than this. No able-bodied person has yet chosen to have an interface implanted in order to play a video game like Fortnite or shop online – and no one knows how people would behave towards an interface, nor whether that behaviour would change if the chip sat in a baseball cap.
The ethical dilemmas are tremendous. “The benefits coming out of that technology have to outweigh the risks,” says Emondi. “But if you’re not trying to regain some function that you’ve lost then that’s different: that’s why non-invasive approaches are so interesting.
“But just because it’s not invasive technology doesn’t mean that you aren’t causing harm to an individual’s neural interface – microwaves are non-invasive, but they wouldn’t be a good thing,” he adds. “So, there are limits. With ultrasound, you have to work within certain pressure levels. If it’s electric fields, you have to be within certain power levels.”
The development of powerful brain-computer interfaces may even help humans survive the hypothetical technological singularity, when artificial intelligence surpasses human intelligence and is able to replicate itself. Humans could use technology to upgrade themselves to compete with these new rivals, or even merge with an AI, something Elon Musk has made explicit in his sales pitch for Neuralink.
“Our artificial intelligence systems are getting better and better,” says Wolmetz. “And there is a question of at what point humans become the weakest link in the systems that we use. In order to be able to keep up with the pace of innovation in artificial intelligence and machine learning, we may very well need to directly interface with these systems.”
In the end, it may not make any difference. At the end of the film Upgrade, Stem takes full control of Grey's mind and body. The mechanic's consciousness is left in an idyllic dream state in which he isn't paralysed and his wife is alive.