A United Nations panel is warning that a technology Elon Musk champions could lead to human rights abuses. Brain-computer interfaces, which translate brain activity into commands that can control devices such as computers or robotic prosthetic limbs, promise to help people who are paralyzed move a cursor or type on a keyboard with their thoughts. Musk, a technology billionaire whom President-elect Donald Trump has tasked with co-leading a "Department of Government Efficiency," is the founder of Neuralink, a brain-computer interface company.
If a brain-computer interface is used inappropriately by the military, it could lead to human rights abuses, according to a report by an advisory committee to the United Nations Human Rights Council. How so? The technology “could effectively provide cognitive enhancement for military personnel, effectively merging human and machine intelligence,” the report says.

The report, prepared for a meeting of the UN committee in Geneva earlier this week, examines the human rights implications of new and emerging military technologies. Brain-computer interfaces could be helpful on the battlefield, but the report also raises concerns about privacy, free and informed consent, and long-term harm to combatants’ physical and mental health. Coercive use, which could “severely undermine the dignity and autonomy of soldiers,” should be limited, it says.

What BCI companies are saying: The benefits of brain-computer interfaces far outweigh the risks of their use in war, Andreas Forsland, CEO of the California-based Cognixion Corporation, told Carmen. More than 50 million Americans with mental health conditions could benefit from the technology, he said, as could the more than 2 million active-duty U.S. military members, who face a higher risk of spinal cord injuries and paralysis than the general population.
“The ability to remotely monitor warfighter health and cognitive decision-making to reduce errors and casualties is an important consideration,” Forsland said. Tom Oxley, the CEO and founder of Synchron, said his company has no intention or interest in pursuing military applications of brain-computer interfaces.
“Our focus is developing a medical device for individuals with severe paralysis by enabling them to reconnect with their world through digital communication,” he told Carmen.

What else? The report warns that AI tools for designing living organisms could be used maliciously to create blueprints for deadly or treatment-resistant pathogens. “It is paradoxical that while greater access to biotechnology on a population level has many advantages, it may also increase the likelihood and frequency of biosecurity threats due to either accidental or malicious use,” the report says.

What's next: Using AI during warfare poses significant risks, the report concludes, and it calls on governments to mitigate those risks by regulating the technology. The report doesn’t say whether nations use or plan to use brain-computer interfaces in the military. If they do, it says, each nation should ensure that the use doesn’t violate human rights.

Neuralink didn’t respond to a request for comment.