New neurotechnology is blurring the lines around mental privacy – but are new human rights the answer?

Neurotechnologies – devices that interact directly with the brain or nervous system – were once dismissed as the stuff of science fiction. Not anymore.

Several companies are trying to develop brain-computer interfaces, or BCIs[1], in hopes of helping patients with severe paralysis or other neurological disorders. Entrepreneur Elon Musk’s company Neuralink, for example, recently received Food and Drug Administration approval to begin human testing[2] of a tiny brain implant[3] that can communicate with computers. There are also less invasive neurotechnologies, like EEG headsets[4] that sense electrical activity inside the wearer’s brain, with applications[5] ranging from entertainment and wellness to education and the workplace.

Neurotechnology research and patents have soared at least twentyfold over the past two decades, according to a United Nations report[6], and devices are getting more powerful. Newer BCIs, for example, have the potential to collect brain and nervous system data[7] more directly, with higher resolution, in greater amounts, and in more pervasive ways.

However, these improvements have also raised concerns about mental privacy and human autonomy – questions I think about in my research on the ethical and social implications of brain science and neural engineering[8]. Who owns the generated data, and who should get access? Could this type of device threaten individuals’ ability to make independent decisions?

In July 2023, the U.N. agency for science and culture held a conference on the ethics of neurotechnology[9], calling for a framework to protect human rights. Some critics have even argued that societies should recognize a new category of human rights, “neurorights[10].” In 2021, Chile became the first country[11] whose constitution addresses concerns about neurotechnology.

Advances in neurotechnology do raise important privacy concerns. However, I believe these debates can overlook more fundamental threats to privacy.

A glimpse inside

Concerns about neurotechnology and privacy focus on the idea that an observer can “read” a person’s thoughts and feelings just from recordings of their brain activity.

It is true that some neurotechnologies can record brain activity with great specificity: developments in high-density electrode arrays[12], for example, allow for high-resolution recording from multiple parts of the brain.

Paradromics, an Austin-based company, is developing a brain-computer interface to aid disabled and nonverbal patients with communication. Julia Robinson for The Washington Post via Getty Images[13]

Researchers can make inferences about mental phenomena and interpret behavior based on this kind of information. However, “reading” the recorded brain activity is not straightforward. The data have already passed through filters and algorithms before a human ever sees the output.

Given these complexities, my colleague Daniel Susser[14] and I wrote a recent article in the American Journal of Bioethics – Neuroscience[15] asking whether some worries around mental privacy might be misplaced.

While neurotechnologies do raise significant privacy concerns, we argue that the risks are similar to those for more familiar data-collection technologies, such as everyday online surveillance[16]: the kind most people experience through internet browsers and advertising, or wearable devices. Even browser histories on personal computers can reveal highly sensitive information.

It is also worth remembering that a key aspect of being human has always been inferring other people’s behaviors, thoughts and feelings. Brain activity alone does not tell the full story; revealing this kind of information also requires other behavioral or physiological measures, as well as social context. A certain surge in brain activity might indicate either fear or excitement, for example.

However, that is not to say there’s no cause for concern. Researchers are exploring new directions in which multiple sensors – such as headbands, wrist sensors and room sensors – can be used to capture many kinds of behavioral and environmental data. Artificial intelligence could then combine that data into more powerful interpretations[17].

Think for yourself?

Another thought-provoking debate around neurotechnology deals with cognitive liberty. According to the Center for Cognitive Liberty & Ethics[18], founded in 1999, the term refers to “the right of each individual to think independently and autonomously, to use the full power of his or her mind, and to engage in multiple modes of thought.”

More recently, other researchers have resurfaced the idea, such as in legal scholar Nita Farahany’s[19] book “The Battle for Your Brain[20].” Proponents of cognitive liberty argue broadly for the need to protect individuals from having their mental processes manipulated or monitored without their consent. They argue that greater regulation of neurotechnology may be required to protect individuals’ freedom to determine their own inner thoughts and to control their own mental functions.

Seung Wan Kang, founder and CEO of iMediSync Inc., displays the company’s iSyncWave, which allows people to measure their brainwaves at home, at CES 2023 in Las Vegas. Ethan Miller/Getty Images[21]

These are important freedoms, and specific features of novel BCIs and nonmedical neurotechnology applications certainly prompt pressing questions. Yet I would argue that the way cognitive freedom is discussed in these debates sees each individual person as an isolated, independent agent, neglecting the relational aspects[22] of who we are and how we think.

Thoughts do not simply spring out of nothing in someone’s head. For example, part of my mental process as I write this article is recollecting and reflecting on research from colleagues. I’m also reflecting on my own experiences: the many ways that who I am today is the combination of my upbringing, the society I grew up in, the schools I attended. Even the ads my web browser pushes on me can shape my thoughts.

How much are our thoughts uniquely ours? How much are my mental processes already being manipulated by other influences? And keeping that in mind, how should societies protect privacy and freedom?

I believe that acknowledging the extent to which our thoughts are already shaped and monitored by many different forces can help set priorities as neurotechnologies and AI become more common. Looking beyond novel technology to strengthen current privacy laws may give a more holistic view of the many threats to privacy, and what freedoms need defending.

References

  1. ^ brain-computer interfaces, or BCIs (theconversation.com)
  2. ^ to begin human testing (www.usatoday.com)
  3. ^ tiny brain implant (theconversation.com)
  4. ^ like EEG headsets (doi.org)
  5. ^ covering a wide range of applications (unesdoc.unesco.org)
  6. ^ according to a United Nations report (unesdoc.unesco.org)
  7. ^ collect brain and nervous system data (theconversation.com)
  8. ^ ethical and social implications of brain science and neural engineering (rockethics.psu.edu)
  9. ^ conference on the ethics of neurotechnology (www.unesco.org)
  10. ^ neurorights (neurorightsfoundation.org)
  11. ^ the first country (doi.org)
  12. ^ high-density electrode arrays (doi.org)
  13. ^ Julia Robinson for The Washington Post via Getty Images (www.gettyimages.com)
  14. ^ Daniel Susser (infosci.cornell.edu)
  15. ^ American Journal of Bioethics – Neuroscience (doi.org)
  16. ^ online surveillance (www.businessnewsdaily.com)
  17. ^ more powerful interpretations (braininitiative.nih.gov)
  18. ^ Center for Cognitive Liberty & Ethics (web.archive.org)
  19. ^ Nita Farahany’s (law.duke.edu)
  20. ^ The Battle for Your Brain (us.macmillan.com)
  21. ^ Ethan Miller/Getty Images (www.gettyimages.com)
  22. ^ neglecting the relational aspects (doi.org)

Author: Laura Y. Cabrera, Associate Professor of Neuroethics, Penn State

Read more https://theconversation.com/new-neurotechnology-is-blurring-the-lines-around-mental-privacy-but-are-new-human-rights-the-answer-205446

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more