Authors Aakash A. Dave and Dr. Laura Cabrera, Assistant Professor in the Center for Ethics, have an article in the December issue of the Journal of Cognitive Enhancement. The article, “Osteopathic Medical Students’ Attitudes Towards Different Modalities of Neuroenhancement: a Pilot Study,” was available online first in January of this year.
Abstract: The advancement of society has coincided with the development and use of technologies intended to improve cognitive function, which are collectively known as neuroenhancers. While several studies have assessed public perception towards the moral acceptability of pharmacological and device-based cognitive enhancers, just a few have compared perceptions across different modalities of cognitive enhancers. In this pilot study, 154 osteopathic medical students were asked to read one of six possible vignettes describing a certain type of improvement—therapy or above the norm—brought about by using one of three modalities—neurodevice, pill, or herbal supplement. Subjects answered questions that were designed to reveal their attitudes towards the given scenario. Our participants suggested that improvement using neurodevices and herbal supplements is more acceptable than when pills are used. We also found that acceptable attitudes towards cognitive enhancement were subserved by reasons such as “positive outcome from use” and “it’s safe” and unacceptable attitudes by reasons such as “safety concerns” and “no need.” Furthermore, a majority of participants would prefer to consult with a physician regarding the use of cognitive enhancers prior to accessing them. These results provide novel insights into pressing neuroethical issues and warrant further studying.
The full text is available online via Springer Link (MSU Library or other institutional access may be required to view this article).
Dr. Robyn Bluhm, Associate Professor in the Department of Philosophy and Lyman Briggs College, and Dr. Laura Cabrera, Assistant Professor in the Center for Ethics and Humanities in the Life Sciences and the Department of Translational Neuroscience, are co-authors of an article in the Spring 2020 issue of IJFAB: International Journal of Feminist Approaches to Bioethics.
Their article, “Deep Brain Stimulation and Relational Agency: Negotiating Relationships,” appears in a special section on feminist neurotechnologies. From the article’s introduction:
In this commentary, we consider three aspects of [Timothy] Brown’s discussion of DBS and relational agency: (1) the importance of thinking critically about what it means to have a relationship with a DBS device; (2) how the development of “closed loop” implants might change the kinds of relationships that are possible; and (3) the need to consider how an individual’s relationship with their device is shaped by their relationship with others in their lives. We see ourselves as building on, or offering suggestions for further developing, Brown’s important paper.
Drs. Bluhm and Cabrera are co-investigators on the project “Is the Treatment Perceived to be Worse than the Disease?: Ethical Concerns and Attitudes towards Psychiatric Electroceutical Interventions,” funded by the NIH BRAIN Initiative. Deep brain stimulation (DBS) is one of four types of psychiatric electroceutical interventions (PEIs) included in the scope of the project.
The full text is available online via University of Toronto Press (MSU Library or other institutional access may be required to view this article).
The search for a brain device capable of capturing recordings from thousands of neurons has been a primary goal of the government-sponsored BRAIN Initiative. Succeeding would require developing flexible materials for the electrodes, miniaturizing the electronics, and enabling fully wireless interaction. Yet this past summer, it was the corporately funded Facebook and Elon Musk’s Neuralink that stepped forward with announcements regarding their respective technological investments in accessing and reading the human brain.
Elon Musk, the eccentric technology entrepreneur and CEO of Tesla and SpaceX, made a big announcement at the California Academy of Sciences. This time it was not about commercial space travel or plans to revolutionize city driving. Instead, Musk presented advances on a product under development at his company Neuralink. The product features a sophisticated neural implant that aims to record the activity of thousands of neurons in the brain and write signals back into the brain to provide sensory feedback. Musk mentioned that this technology would be available to humans as early as next year.
Mark Zuckerberg’s Facebook is also funding brain research to develop a non-invasive wearable device that would allow people to type by simply imagining that they are talking. The company plans to demonstrate a prototype system by the end of the year.
These two corporate announcements raise important questions. Should we be concerned about the introduction of brain devices that have the capacity to read thousands of neurons and then send signals to our brains? The initial goal for both products is medical, to help paralyzed individuals use their thoughts to control a computer or smartphone, or in the case of Facebook to help those with disabling speech impairments. However, these products also are considered to be of interest to healthy individuals who might wish to “interact with today’s VR systems and tomorrow’s AR glasses.” Musk shared his vision to enable humans to “merge” with Artificial Intelligence (AI), enhancing them to reach superhuman intelligence levels.
Time will tell whether these grand visions, which currently veer into science fiction, will be matched by scientific progress. However, if they ultimately deliver on their promise, these products could change the lives of those affected by paralysis and other physical disabilities. Yet, if embraced by healthy individuals, such technologies could radically transform what it means to be human. There are, of course, sound reasons to remain skeptical that these visions will be realized. First, there are safety issues to consider when implanting electrodes in the brain, including damage to the vasculature surrounding the implant as well as the tissue response surrounding the device. And that is what is currently known about inserting brain-computer interfaces with only a couple of electrode channels; consider what might happen with thousands of electrodes. There remain simply too many unknowns to endorse this intervention for human use in the next year or so. There are also salient issues regarding brain data collection, storage, and use, including concerns connected to privacy and ownership.
Beyond these concerns, we have to think about what happens when such developments are spearheaded by private companies. Privately funded development is at odds with the slow, careful approach to innovation that most medical developments rely upon, where human research subject regulations and safety measures are clear. It is the “move fast and break things” pace that energizes start-up companies and Silicon Valley entrepreneurs. The big swings at the heart of these entrepreneurial tech companies also bring considerable risks. When it comes to sophisticated brain interfaces, the stakes are quite high. These products bring to mind scenarios from Black Mirror, a program that prompts a host of modern anxieties about technology. The possibility of having a brain implant that allows hands-free device interaction seems exciting, but consider the level of information we would then be giving to these companies. It is one thing to track how individuals react to a social media post by clicking whether they “like” it or not, or by how many times it has been shared. It is another thing altogether to capture which parts of the brain are being activated without us having clicked anything. Can those companies be trusted with a direct window to our thoughts, especially when they have a questionable track record when it comes to transparency and accountability? Consider how long it took for Facebook to start addressing the use of customers’ personal information. It remains unclear just how much financial support Facebook is providing to its academic partners, or whether volunteers are aware of Facebook’s involvement in the research it funds.
The U.S. Food and Drug Administration as well as academic partners to these enterprises may act as a moderating force on the tech industry, yet recent examples suggest that those kinds of checks and balances oftentimes fail. Thus, when we hear about developments by companies such as Facebook and Neuralink trying to access the thoughts in our brains, we need to hold on to a healthy skepticism and continue to pose important challenging questions.
Laura Cabrera, PhD, is an Assistant Professor in the Center for Ethics and Humanities in the Life Sciences and the Department of Translational Neuroscience at Michigan State University.
Join the discussion! Your comments and responses to this commentary are welcomed. The author will respond to all comments made by Thursday, September 26, 2019. With your participation, we hope to create discussions rich with insights from diverse perspectives.
A team led by Center Assistant Professor Dr. Laura Y. Cabrera will examine the ethical concerns, beliefs, and attitudes of psychiatrists, patients, and healthy members of the public, including caregivers, regarding the development and use of psychiatric electroceutical interventions (PEIs).
The U.S. National Institutes of Health BRAIN Initiative has awarded a four-year, $1,414,478 grant to the Michigan State University team, which also includes Professor Aaron M. McCright (Sociology), Associate Professor Robyn Bluhm (Philosophy and Lyman Briggs College), and Associate Professor Eric Achtyes (Director of the College of Human Medicine Division of Psychiatry and Behavioral Medicine).
Using electrical stimuli to treat psychiatric conditions, PEIs offer great promise in addressing the profound suffering related to such disorders. While PEIs have been available in various forms for years, divergent perceptions among medical professionals, patients, and the broader public have impeded their wider adoption in practice. Key stakeholders’ concerns, beliefs, and attitudes also might affect the future adoption of novel, more invasive PEIs. As new PEIs emerge in the neurotechnology landscape, it is urgent to understand such concerns and related social policy choices.
“This grant could not come at a better time, and we are grateful to the National Institutes of Health for recognizing the importance of this issue and supporting our proposal,” said Dr. Achtyes, who has seen firsthand the benefits of such treatments.
Dr. Cabrera, whose research focuses on neuroethics and is leading the effort as the Principal Investigator, said, “I am delighted for this exciting opportunity to lead our team of experts and work together towards the sustained ethical development and translation of this type of psychiatric treatment.”
The significance of this work lies in anticipating potential future policy challenges in ways that will both effectively safeguard sustained ethical PEI development and translation, and benefit individuals affected by mental health disorders.
“One strength of our project is that we have experts from philosophy, neuroethics, psychiatry, and sociology working closely together. So, the insights we generate will likely transcend typical disciplinary boundaries and hopefully will be more meaningful to key stakeholders,” said Dr. McCright.
Center Assistant Professor Dr. Laura Cabrera was an invited expert panelist at the Organisation for Economic Co-operation and Development (OECD) workshop “Minding Neurotechnology: Delivering Responsible Innovation for Health and Well Being,” held September 6-7 in Shanghai, China.
The workshop focused on exploring some of the unique ethical, legal, and policy challenges raised by health-related applications of brain science and its integration into cutting-edge neurotechnologies. Dr. Cabrera’s session, “Identifying gaps in neurotechnology governance: potential roles of the market and the public sector to ensure ‘technology robustness’,” focused on raising potential governance issues associated with emerging neurotechnologies that deserve shared consideration, given their public attention as well as their potential economic and social implications.
Center Assistant Professor Dr. Laura Cabrera is one of two International Neuroethics Society representatives who are part of the IEEE Brain group. The IEEE Brain group is part of the IEEE (Institute of Electrical and Electronics Engineers), the world’s largest technical professional organization for the advancement of technology.
On August 31st, Dr. Cabrera joined the IEEE BRAIN Neuroethics subcommittee kick-off meeting at Georgetown University in Washington, D.C. This subcommittee includes philosophers, engineers, and neuroethicists, and it is tasked with putting together a document looking at ethical considerations for neurotechnologies.
What are the key ethical concerns surrounding the use of psychiatric deep brain stimulation (DBS)? Are those concerns shared broadly across all applications of DBS, or are they specific to the intended use of the intervention? Dr. Cabrera will discuss results from a recent study conducted by a multidisciplinary research team that examined ethical issues discussed in both the scientific and ethics literature on psychiatric DBS. Dr. Cabrera will make the case that understanding the ethics of DBS for psychiatric interventions provides important insight into the way ethical concerns for a single technology might vary depending on its intended use.
Join us for Dr. Cabrera’s lecture on Wednesday, February 15, 2017 from noon until 1 p.m., in person or online.
Dr. Cabrera is an Assistant Professor of Neuroethics at the Center for Ethics and Humanities in the Life Sciences. She is also a Faculty Affiliate at the National Core for Neuroethics at University of British Columbia. Her research focuses on the exploration of attitudes, perceptions and values of the general public toward neurotechnologies, as well as the normative implications of using neurotechnologies for medical and non-medical purposes. She received a BSc in Electrical and Communication Engineering from the Instituto Tecnológico de Estudios Superiores de Monterrey (ITESM) in Mexico City, an MA in Applied Ethics from Linköping University in Sweden, and a PhD in Applied Ethics from Charles Sturt University in Australia. Her career goal is to pursue interdisciplinary neuroethics scholarship, provide active leadership, and train and mentor future leaders in the field.
In person: This lecture will take place in C102 East Fee Hall on MSU’s East Lansing campus. Feel free to bring your lunch! Beverages and light snacks will be provided.