Neurorights: safeguarding human autonomy and dignity in the age of neurotechnology

by Rhiannon Procktor | Oct 3, 2023


About Rhiannon Procktor

Rhiannon is a second-year Law with French Law student at Exeter College, University of Oxford. She is particularly interested in the ways the law can and should respond to developing technologies in the fields of neuroscience and bioethics, at both national and international levels.

A ground-breaking recent publication has detailed the first use of fMRI – in conjunction with a novel brain-computer interface – to “decode language”. This has enabled the reconstruction of human neural activity to “generate intelligible word sequences that recover the meaning of perceived speech.” The technology has the potential to develop into what a layperson might describe as mind reading. The dual nature of such technology’s applications is promising, yet at times dystopian. For example, it could enable individuals with neurodegenerative conditions to communicate. Conversely, its surveillance potential, through the ability to access neural data, may open the door to unprecedented human rights abuses.

The concept of neurorights is relatively new, yet academic and political discussion is increasing rapidly. Notably, in 2021 the Chilean Government enacted a landmark constitutional amendment to protect citizens’ mental integrity and privacy in the face of developing neurotechnologies. Its efforts have been hailed as a promising example by the NeuroRights Foundation for taking the first steps in conceptualising a “human rights framework to inform neurotechnology.”

The foundation itself advocates for five basic ‘neurorights’ to be legally protected and enforceable internationally, which are identified and defined as follows:

  1. Mental Privacy – data obtained from measuring neural activity should be kept private. If such data has been stored, an individual should have the right to request its deletion, and the sale and commercial transfer of neural data should be strictly regulated.
  2. Personal Identity – boundaries must be developed to prevent neurotechnology from disrupting the sense of self, i.e. blurring the line between an individual’s consciousness and external technological inputs.
  3. Free Will – individuals should retain ultimate control over their decision-making and be free from the threat of unknown manipulation by external neurotechnology.
  4. Fair Access to Mental Augmentation – guidelines should be established both internationally and domestically to regulate the use of mental enhancement neurotechnologies. These guidelines should be rooted in the principle of justice and guarantee equality of access.
  5. Protection from Bias – countermeasures to combat bias should be the norm for algorithms in neurotechnology, and algorithm design should include input from user groups so that bias is addressed from the outset.

These proposed neurorights have significant implications. Would they be considered a “new set” of human rights, afforded the same level of fundamentality as the rights to life, access to justice, or education? Deliberations of this nature raise the question of whether it is appropriate to introduce new rights, or whether we should instead explore avenues for adapting existing ones. Either way, the creation of such a framework would necessarily be influenced by different considerations, such as autonomy, liberty, and the role of the state.

These are multifaceted considerations which do not readily lend themselves to widespread fundamental agreement. They are likely to involve extensive discussion of various issues, such as the extent of the powers exercised by public and private bodies, the careful balancing act between societal progress and individual freedoms, and comparative ethics in military and medical applications. That is certainly not to say that this approach is futile; rather, it highlights the nuanced cultural and ethical considerations that a truly global framework would have to confront. Nevertheless, the ‘rights’ advocated for by the foundation are indicative of the need to consider how society will adapt in the face of rapidly developing technology.

The law must confront this new frontier – both its immense opportunities and dystopian challenges – to develop a framework which is able to embrace the transformative potential of neurotechnology whilst protecting human dignity and autonomy. This challenge demands urgent attention from the global human rights community.
