The idea of a device that can directly link your brain waves to an external machine may sound futuristic, but such devices already exist. These so-called brain-computer interfaces (BCIs), or mind-machine interfaces (MMIs), can allow disabled people to control a prosthetic (such as an arm) merely by directing it with their thoughts.
Someday, such technology may help the elderly command their assistive exoskeletons. Researchers are also looking into how these devices might help factory workers control robots with their minds, or drivers manage their self-driving cars. Entrepreneurs like Neuralink’s Elon Musk are pushing to develop tiny, implantable BCIs in the next few years to augment natural human intelligence, so that humans aren’t completely overtaken by the rapid development of artificially intelligent machines.
It may be some time before brain-computer interfaces and other neurotechnologies become widely integrated into society. But an international and interdisciplinary group of scientists, clinicians, ethicists and AI experts is cautioning in a recent piece in Nature that neurotechnology’s impacts, best practices and ethics should be examined and discussed in greater depth to prevent future problems from arising. They point out that while these technologies may have great benefits, without careful consideration and oversight they could also have profoundly negative repercussions.
“Such advances could revolutionize the treatment of many conditions, from brain injury and paralysis to epilepsy and schizophrenia, and transform the human experience for the better,” wrote the group of experts. “But the technology could also exacerbate social inequalities and offer corporations, hackers, governments or anyone else new ways to exploit and manipulate people. And it could profoundly alter some core human characteristics: private mental life, individual agency and an understanding of individuals as entities bound by their bodies.”
Specifically, they point to upcoming developments in brain-computer interfaces spearheaded by companies like Neuralink and Kernel: wireless devices that will directly link humans to “powerful computational systems,” with the ability not only to “read” human brain activity but also to “write” neural data into the brain. The group foresees technological change happening much faster than governments or societies can adapt, leading to uncharted and potentially perilous territory where users’ data might be exploited for financial gain, and where users themselves might even lose their sense of self.
In particular, the group identified four major areas in the emerging neurotechnology field that require “immediate action”: protecting privacy and establishing means of giving or refusing consent; protecting individual agency and identity in the face of neurotechnologically enabled collectivity; dealing with the social effects of human augmentation; and combating algorithmic bias.
It’s no surprise that privacy is at the top of the list. Giant corporations and advertisers already have access to an unimaginably vast trove of personal information, which, with the help of algorithms, lets them target ads with an already creepy accuracy. Neuromarketing is the next step: with BCIs reading (or even overwriting) our biological or emotional states, this even finer-grained neural data could then be used by companies for their benefit.
“We believe that citizens should have the ability — and right — to keep their neural data private,” wrote the group. “We propose that the sale, commercial transfer and use of neural data be strictly regulated.” Centralized processing of neural data should also be restricted. Alternative computational methods like “federated learning” (i.e., mass, decentralized machine learning) and blockchain-based methods could be used to transparently track data so that the privacy of users is better protected.
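To make the federated-learning idea concrete, here is a minimal sketch of federated averaging, the core pattern the term usually refers to: each participant trains a model on its own private data, and only the model weights (never the raw data) are sent to a coordinator and averaged. The toy linear model, data, and all names below are illustrative assumptions, not part of any real BCI or neural-data system.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, steps=20):
    """One client's training pass on its own private data (linear regression)."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

# Three clients, each holding private data generated from the same
# underlying model w* = [2, -1] (a stand-in for sensitive user data).
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.05, size=50)
    clients.append((X, y))

# Federated rounds: broadcast the global weights, let each client train
# locally, then average the returned weights. Only weights cross the wire.
global_w = np.zeros(2)
for _ in range(10):
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)

print(global_w)  # converges close to the true weights without pooling raw data
```

The privacy argument is structural: the coordinator never sees any client’s raw records, only aggregated weight updates. (Real deployments layer further protections, such as secure aggregation or differential privacy, on top of this basic scheme.)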
Besides privacy concerns, our very identities and the nature of our self-awareness may be altered in unquantifiable ways by neurotechnology. These effects may be further amplified if several brains are wired to work in unison, leading to a potential suppression of individual agency where resistance becomes futile.
“Neurotechnologies could clearly disrupt people’s sense of identity and agency, and shake core assumptions about the nature of the self and personal responsibility — legal or moral,” wrote the authors. “People could end up behaving in ways that they struggle to claim as their own if machine learning and brain-interfacing devices enable faster translation between an intention and an action.”
To tackle this, the group recommends enshrining the protection of human “neurorights” to ensure the integrity of mental and bodily identity, and the freedom to choose without neural manipulation.
“Augmentation Arms Race”
The experts also anticipate a possible “augmentation arms race,” with governments creating “super-soldiers” or ordinary citizens augmenting themselves to gain new abilities or regain lost ones.
“The pressure to adopt enhancing neurotechnologies, such as those that allow people to radically expand their endurance or sensory or mental capacities, is likely to change societal norms, raise issues of equitable access and generate new forms of discrimination,” explained the group. In addition to the devices themselves, the authors emphasize that the algorithms underlying the systems must be corrected for any biases that may perpetuate further social inequality.
Ultimately, the group is calling for the international scientific community and governments to come together and discuss what “responsible neuroengineering” will look like: what’s permitted, what’s not and why? How are users and their data and identity protected? How can discrimination based on augmentation be prevented? As these ethical boundaries are debated and drawn, how might companies, institutions and society inculcate these values into developers, engineers and researchers in a way that benefits humanity? There’s no quick and easy answer, but it would behoove us to at least ask these questions, before forging ahead.
Image: Alex Iby