Navigating Neurotechnological Regulations

Researchers address the challenges regulators face in protecting the public amid the rapid growth of neurotechnologies.

They can “decode and alter our perception, behavior, emotion, cognition, and memory—arguably, the very core of what it means to be human.”

That was the warning the United Nations Educational, Scientific, and Cultural Organization issued in a 2023 report on neurotechnologies, devices that communicate with neural networks in the human brain. And after the company Neuralink reportedly announced that it had implanted its computer chip in its first-ever human patient, the public is now calling for the regulation of neurotechnologies.

Although the public’s attention has only recently shifted to regulating neurotechnologies, researchers Marta Sosa Navarro and Salvador Dura-Bernal have long wrestled with this challenge. In a recent article, the two discuss the array of threats neurotechnologies could pose in the future, outline the practical challenges in addressing these threats, and summarize the regulatory state of neurotechnologies in the United States and abroad.

Neurotechnologies “read and modify the brain” through a process called neurostimulation, the delivery of electrical impulses through the skull. Neurostimulation can cause physical damage, such as burns to brain tissue from excessive electrical intensity. Beyond these physical dangers, Navarro and Dura-Bernal raise a broad array of human rights concerns, including risks to data privacy and the possibility that abusive governments could mentally alter vulnerable populations, such as prisoners.

Navarro and Dura-Bernal acknowledge that regulators face significant difficulties implementing solutions to these concerns.

For instance, they cite the rapid pace of innovation in this field as one of the key difficulties regulators face. Yet they further note that regulatory action is critical, as a growing number of private companies with direct-to-consumer business models seek to sell products to the public that are not yet properly regulated.

Navarro and Dura-Bernal also note that the challenge of rapid innovation is compounded by the philosophical questions regulators must confront when regulating neurotechnologies, including “what is thought?”

They detail how various scholars and philosophers differ significantly in defining thought, and how these differences can result in regulations that may not address the issues neurotechnologies raise.

For example, if “thought” refers only to brain activity, that definition is likely insufficient to protect against abuses of neurotechnologies, Navarro and Dura-Bernal explain. Regulations that prevent a device from interfering with brain activity would not necessarily prevent a device from altering brain structure, such as by destroying certain connections in brain tissue to make someone forget a memory. Although this destruction might not affect an individual’s current brain activity, it may alter future thoughts by impeding memory recall. Consequently, Navarro and Dura-Bernal argue that a wider definition of thought, one that includes both brain activity and brain structure, would better protect individuals.

Another challenge that Navarro and Dura-Bernal identify is how regulators should classify neurotechnology, a choice that determines the regulations to which neurotechnologies are subject. In the United States, neurotechnologies may be considered either a “wearable technology” or a “medical device,” they explain.

Medical devices are typically subject to stricter safety and privacy regulations than wearable technologies, but a product’s classification depends on its intended use, the claims its manufacturer makes, and the risks it poses.

Navarro and Dura-Bernal also note that the U.S. Food and Drug Administration (FDA) has typically treated neurotechnologies as wearable technologies, but new draft guidance from FDA seems increasingly mindful of the pitfalls of neurotechnologies and the need for more stringent requirements.

In the European Union, however, regulators have treated neurotechnologies as medical devices since April 2017, including them even when they have no “intended medical purpose.” The European Union also imposes obligations on manufacturers, such as conforming to specified regulatory procedures before placing a product on the market. It further calls on member states to implement safety procedures to monitor neurotechnologies after they enter the market and to define the circumstances in which regulators may withdraw, recall, or limit their use.

Despite these steps, some scholars have hailed Chile as the pioneer in neurotechnological legislation. In December 2020, Chile amended Article 19 of its constitution to include “the right to neuroprotection.” The nation is considering further legislative action that would implement Article 19 and protect the right to mental integrity. Navarro and Dura-Bernal, along with other scholars, however, criticize these bills as too ambiguous, underscoring the difficulties regulators may face in defining the contours of protection from neurotechnologies.

Still, these legislative actions have prompted broader global recognition of the need for regulation. In response, the European Union and the United States created the “Trade and Technology Council,” which aims to address many of the challenges of maintaining human rights and democratic freedoms amid the growth of this emerging technology.

Yet even with this new council, it remains to be seen whether regulators can keep pace with the rapid expansion of neurotechnology and prevent people from ending up like some of its animal test subjects.