Dr. Robyn Bluhm, Associate Professor in the Department of Philosophy and Lyman Briggs College, and Dr. Laura Cabrera, Assistant Professor in the Center for Ethics and Humanities in the Life Sciences and the Department of Translational Neuroscience, are co-authors of an article in the Spring 2020 issue of IJFAB: International Journal of Feminist Approaches to Bioethics.
Their article, “Deep Brain Stimulation and Relational Agency: Negotiating Relationships,” appears in a special section on feminist neurotechnologies. From the article’s introduction:
In this commentary, we consider three aspects of [Timothy] Brown’s discussion of DBS and relational agency: (1) the importance of thinking critically about what it means to have a relationship with a DBS device; (2) how the development of “closed loop” implants might change the kinds of relationships that are possible; and (3) the need to consider how an individual’s relationship with their device is shaped by their relationship with others in their lives. We see ourselves as building on, or offering suggestions for further developing, Brown’s important paper.
Drs. Bluhm and Cabrera are co-investigators on the project “Is the Treatment Perceived to be Worse than the Disease?: Ethical Concerns and Attitudes towards Psychiatric Electroceutical Interventions,” funded by the NIH BRAIN Initiative. Deep brain stimulation (DBS) is one of four types of psychiatric electroceutical interventions (PEIs) included in the scope of the project.
The full text is available online via University of Toronto Press (MSU Library or other institutional access may be required to view this article).
Center Assistant Professor Dr. Laura Cabrera and co-authors Charles Sadle and Dr. Erin Purcell have an article in the August 2019 issue of Nature Biomedical Engineering. In “Neuroethical considerations of high-density electrode arrays,” the authors state that “the development of implantable electrode arrays that broadly and seamlessly integrate with brain tissue will require innovation that responsibly considers clinically relevant neuroethical concerns.”
The full text is available online via Nature (MSU Library or other institutional access may be required to view this article).
The search for a brain device capable of recording from thousands of neurons has been a primary goal of the government-sponsored BRAIN Initiative. Success would require developing flexible electrode materials, miniaturized electronics, and fully wireless interaction. Yet this past summer it was the corporately funded Facebook and Elon Musk’s Neuralink that stepped forward with announcements about their respective investments in technology to access and read the human brain.
Image description: A black and white graphic of a person’s head with an electric plug extending out of the brain and back of the head. Image source: Gordon Johnson from Pixabay
Elon Musk, the eccentric technology entrepreneur and CEO of Tesla and SpaceX, made a big announcement at the California Academy of Sciences. This time it was not about commercial space travel or plans to revolutionize city driving. Instead, Musk presented advances on a product under development at his company Neuralink: a sophisticated neural implant that aims to record the activity of thousands of neurons in the brain and write signals back into the brain to provide sensory feedback. Musk mentioned that this technology could be available to humans as early as next year.
Mark Zuckerberg’s Facebook is also funding brain research to develop a non-invasive wearable device that would allow people to type by simply imagining that they are talking. The company plans to demonstrate a prototype system by the end of the year.
These two corporate announcements raise important questions. Should we be concerned about the introduction of brain devices that can read thousands of neurons and then send signals back to our brains? The initial goal for both products is medical: to help paralyzed individuals use their thoughts to control a computer or smartphone, or, in Facebook’s case, to help those with disabling speech impairments. However, these products are also considered to be of interest to healthy individuals who might wish to “interact with today’s VR systems and tomorrow’s AR glasses.” Musk shared his vision of enabling humans to “merge” with artificial intelligence (AI), enhancing them to reach superhuman levels of intelligence.
Time will tell whether these grand visions, which currently veer into science fiction, will be matched by scientific progress. If the products ultimately deliver on their promise, they could change the lives of those affected by paralysis and other physical disabilities. Yet if embraced by healthy individuals, such technologies could radically transform what it means to be human. There are, of course, sound reasons to remain skeptical. First, there are safety issues to consider when implanting electrodes in the brain, including damage to the vasculature surrounding the implant as well as tissue responses around the device. And that is what is known about inserting brain-computer interfaces with only a couple of electrode channels; consider what might happen with thousands of electrodes. There remain simply too many unknowns to endorse this intervention for human use in the next year or so. There are also salient issues regarding the collection, storage, and use of brain data, including concerns about privacy and ownership.
Image description: a black and grey illustration of a brain in two halves, one resembling a computer motherboard, the other containing abstract swirls and circles. Image source: Seanbatty from Pixabay
Beyond these concerns, we have to think about what happens when such developments are spearheaded by private companies. Privately funded development is at odds with the slow, careful approach to innovation that most medical advances rely upon, where human research subject regulations and safety measures are clear. It is the “move fast and break things” pace that energizes start-up companies and Silicon Valley entrepreneurs. The big swings at the heart of these entrepreneurial tech companies also bring considerable risks, and when it comes to sophisticated brain interfaces, the stakes are quite high. These products bring to mind scenarios from Black Mirror, a program that channels a host of modern anxieties about technology. On one hand, the possibility of a brain implant that allows hands-free device interaction seems exciting; on the other, consider the level of information we would then be giving to these companies. It is one thing to track how individuals react to a social media post by whether they “like” it or by how many times it is shared. It is another thing altogether to capture which parts of the brain are being activated without our having clicked anything. Can these companies be trusted with a direct window into our thoughts, especially given their questionable track record on transparency and accountability? Consider how long it took Facebook to start addressing the use of customers’ personal information. It remains unclear how much financial support Facebook is providing to its academic partners, or whether volunteers are aware of Facebook’s involvement in funding the research.
The U.S. Food and Drug Administration, as well as the academic partners to these enterprises, may act as a moderating force on the tech industry, yet recent examples suggest that such checks and balances often fail. Thus, when we hear about companies such as Facebook and Neuralink trying to access the thoughts in our brains, we need to hold on to a healthy skepticism and continue to pose important, challenging questions.
Laura Cabrera, PhD, is an Assistant Professor in the Center for Ethics and Humanities in the Life Sciences and the Department of Translational Neuroscience at Michigan State University.
Join the discussion! Your comments and responses to this commentary are welcomed. The author will respond to all comments made by Thursday, September 26, 2019. With your participation, we hope to create discussions rich with insights from diverse perspectives.
Visit The Conversation to read “It’s not my fault, my brain implant made me do it,” a collaborative article from Center Assistant Professor Dr. Laura Cabrera and College of Law Associate Professor Dr. Jennifer Carter-Johnson. They combine their neuroethics and legal expertise to address questions such as: “Where does responsibility lie if a person acts under the influence of their brain implant?” The article was also published in Scientific American.
In November 2017, Drs. Cabrera and Carter-Johnson participated in a Brews and Views event of the same name, “It’s not my fault: my brain implant made me do it.” Brews and Views events, moderated discussions addressing the most fascinating and provocative areas of bioscience and engineering, are a collaboration between the Institute for Quantitative Health Science and Engineering and the Center for Ethics and Humanities in the Life Sciences at Michigan State University.