Human Microchip Implantation: A Bridge Too Far?


This post is a part of our Bioethics in the News series

By Sabrina Ford, PhD

Technology. It invades every corner of our lives, and for the most part improves our quality of life. We went from typing on a flat panel with a little TV screen attached to smartphones that let users share data stored in the cloud. A CT/X-ray image of a C2 spinal fracture (aka Hangman’s Fracture), taken in the middle of the night at a small Midwestern rural hospital, can be sent to a West Coast spine surgeon, and within minutes an expert opinion is returned to that rural hospital. Technology is convenient, pervasive, and unavoidable.

In the past 15 years, discussion and related controversy have taken place about a Radio Frequency Identification Device (RFID), or microchip, that can be implanted in human bodies. Such a chip would contain, store, and update data about us. Might such an implant be a benefit or a risk? Some investors are betting on its appeal: the healthcare microchip market is projected to grow at a compound annual growth rate (CAGR) of 22% from 2020 to 2027, reaching a value of over $6.4 million by 2027. RFID microchips (herein referred to as microchips) are already used for many things, from your credit/debit card to the efficient logistics that move your Amazon package.
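For readers curious about the arithmetic behind such projections, a CAGR is simply a geometric growth rate. The following is a minimal Python sketch; the starting market value used below is an illustrative assumption, not a figure from any report.

```python
def project(initial, rate, years):
    """Value after compounding `initial` at `rate` per year for `years` years."""
    return initial * (1 + rate) ** years

def cagr(initial, final, years):
    """Recover the compound annual growth rate from start and end values."""
    return (final / initial) ** (1 / years) - 1

# Illustrative only: a hypothetical 2020 base value growing at 22% per year
base_2020 = 1.6  # assumed, in the same units as the projected 2027 value
value_2027 = project(base_2020, 0.22, 7)  # roughly 6.4 at a 22% CAGR
```

At 22% per year, any starting value roughly quadruples over seven years (1.22^7 ≈ 4.02), which is all the headline figure encodes.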

Image description: An illustrated cyborg eye with electronic circuits. Image source: jemastock/Vecteezy.

Getting to the bridge

Implanted microchips are a terrifying idea to some of us—sufficiently frightening to conjure images of robots and androids—the stuff of science fiction. For some of us, implanting something foreign in our bodies for the management of big data and for convenience is disturbing. Another concern is the potential breach of privacy and the surveillance of our daily lives. If the chip contains medical, personal, and social information, along with GPS data, could we lose all autonomy? Do we maintain our autonomy if, with sufficient information, we consent to the decision? What might all that information be used, or misused, for? What if our employer, insurer, or a government entity decides to check on us?

These questions raise further concerns about autonomy. Enough employers were considering compulsory microchips for their employees that in 2020, Michigan and several other states introduced and passed bills designed to prevent employers from forcing employees to accept microchip implants. This pre-emptive strike targeted a growing technology that could be used to track safety, productivity, and movement. As with many things in the United States, some vulnerable employees with microchips might be targeted, either unintentionally or intentionally, putting them at further economic and social disadvantage.

Some have already crossed the bridge

It is estimated that approximately 10,000 people in the world currently have implanted microchips. Perhaps that doesn’t sound like many, but if investors are placing their bets correctly, the technology is on its way to widespread adoption. A large number of those “cyborgs” reside in Sweden and employ the technology not for health care reasons, but to unlock their car doors, buy a coffee, or swipe into the gym. That rate of chip adoption makes sense in a society like Sweden, the second most cashless society (after Canada) in the world.

Image description: A photo of an RFID implant held between two fingers. Image source: Dan Lane/Flickr Creative Commons.

Many argue that implanted RFID microchips can increase cybersecurity. Not being able to log into your computer without first swiping into the building and into your office might offer a comforting level of protection against physical hacking in the workplace. In addition, many in healthcare delivery believe that medical mistakes would be greatly reduced and quality of care increased if our medical charts were loaded on microchips: monitoring disease states like heart disease and diabetes, improving management of medications, and reducing surgical mistakes. If first responders or doctors had real-time access to accurate medical information via microchips, lives could be saved in medical emergencies. The HITECH Act—the Health Information Technology for Economic and Clinical Health Act—calls for the interoperability of electronic health information for the privacy and safety of the patient. As it is now, it makes little sense for an individual to have different electronic health records in a number of physician offices. If our world were efficiently hyperconnected, one can argue that everyday life could be improved and streamlined.

A bridge too far

But would it be? We are covered, watched, followed, and violated through our digital footprint on a daily basis. Perhaps not with microchips, but pause to consider your actions today. You take your morning walk as public cameras capture your movement down the block and into the convenience store for a cup of coffee, where you use your debit card or smartphone to pay, and that transaction is caught on the store camera. You then check your fitness wearable for heart rate, steps, route, and all that other good stuff. Later, you swipe in and out of the building as you stop into your office for a few hours, pass in and out of several doors, and log on to your computer—accessing various applications in the cloud—all the while answering your email and checking your calendar. Later in the day, you visit your doctor, either in person or via telemedicine, and she enters your ailments, diagnostic tests ordered, and electronic prescriptions into the electronic health record. As you wind down for the evening, you make your market list in your favorite grocery store app, use your smart television to access your favorite shows, and open your books in a reading app. All of this is accomplished in the cloud, on the “grid,” in huge databases. Is this trek through the digital world so different from a microchip that holds your digital footprint? You’ve left a day’s breadcrumb trail across almost every aspect of your life, and not even as consciously as Hansel and Gretel. As for implants in general, Americans clearly accept them, as witnessed by artificial joints, IUDs, cochlear implants—and don’t forget about those implants for hair and breasts.

Over the bridge

The described dilemma is that implanting a chip has the potential to be a violation of rights, yet the chip might equally offer safety and convenience. The implantable microchip is not fully developed and has a long way to go, but the technology is on its way. Microchips today are not sufficiently powerful to collect and communicate big data or to follow us all over the world the way our smartphones do. As with most technologies, the tipping point for implantable chips will come when they become so very useful that they’re simply hard to refuse.


Sabrina Ford, PhD, is an Associate Professor in the Department of Obstetrics, Gynecology and Reproductive Biology and the Institute for Health Policy in the Michigan State University College of Human Medicine. Dr. Ford is also adjunct faculty with the Center for Ethics and Humanities in the Life Sciences.

Join the discussion! Your comments and responses to this commentary are welcomed. The author will respond to all comments made by Tuesday, March 16, 2021. With your participation, we hope to create discussions rich with insights from diverse perspectives.


More Bioethics in the News from Dr. Ford: COVID-19 Vaccine: “Not throwing away my shot” | Contemplating Fentanyl’s Double Duty


Should we improve our memory with direct brain stimulation?

This post is a part of our Bioethics in the News series

By Laura Cabrera, PhD

Should we be worried about the use of direct brain stimulation to improve memory? Well, it depends. If we think of people with treatment refractory memory conditions, or those situations where drugs are not helping the patient, such an approach might seem like the next sensible step. There is reason, however, to remain skeptical that this strategy should be used to improve the memories of people who function within a normal memory spectrum.

Image description: The illustration “Light Bulb” by Alvaro Tapia is a colorful abstract depiction of the human head/brain as a light bulb. Image source: Alvaro Tapia/Flickr Creative Commons

The quest to improve memory is hardly new. Throughout time people have engaged in ways to improve their memories, such as eating particular foods, employing mnemonic strategies, or taking certain drugs, but the quest does not end there. A recent New York Times article discussed findings from a direct brain stimulation study (Ezzyat et al., 2018) on the possibility of using brain stimulation to rescue functional networks and improve memory. In that study, 25 patients undergoing intracranial monitoring as part of clinical treatment for drug-resistant epilepsy were additionally recruited with the aim of assessing the effect of temporal cortex electrical stimulation on memory-related function.

The prospect of using brain stimulation to improve memory, initially introduced in the 1950s (Bickford et al., 1958), re-emerged in 2008 when a study using continuous hypothalamic deep brain stimulation (aka open-loop DBS) to treat a patient with morbid obesity revealed an increased recollection capacity in that same patient (Hamani et al., 2008). Subsequent studies have attempted to show that direct brain stimulation is useful for memory improvement. However, the data on open-loop deep brain stimulation remain inconclusive.

The approach by Ezzyat and colleagues, wherein neural activity is monitored and decoded during a memory task, suggests an improvement over open-loop approaches. In this treatment modality, stimulation is delivered in response to specific neural activity: the system detects those times when the brain is unlikely to encode successfully and rescues network activity to potentially improve overall performance.

In that study stimulation was triggered to respond exclusively to those patterns of neural activity associated with poor encoding, effectively rescuing episodes of poor memory and showing a 15% improvement in subsequent recall. Indeed, those results might sound promising, but this type of memory intervention raises a number of ethical issues.
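The closed-loop logic described above (stimulate only when decoded neural activity predicts poor encoding) can be sketched in a few lines of Python. This is a toy illustration, not the study’s actual decoder; the threshold rule and features are invented for the example.

```python
# Toy sketch of closed-loop stimulation: a decoder watches neural features
# during each encoding attempt and triggers stimulation only when poor
# encoding is predicted.

def predicts_poor_encoding(features, threshold=0.5):
    """Stand-in for a trained classifier; here, a simple mean-threshold rule."""
    return sum(features) / len(features) < threshold

def closed_loop_session(trials):
    """Return the indices of trials on which stimulation would be delivered."""
    stimulated = []
    for i, features in enumerate(trials):
        if predicts_poor_encoding(features):
            stimulated.append(i)  # deliver stimulation for this encoding attempt
    return stimulated
```

Open-loop DBS, by contrast, would stimulate continuously on every trial regardless of the decoded brain state, which is why the closed-loop design is considered the more targeted intervention.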

Image description: Computer memory components are shown inside a model of a human skull. Image source: © Michel Royon/Wikimedia Commons

In a very direct fashion memory is related to the core of who we are. It allows us to build an interpretation of ourselves and our environments, and in so doing gives us orientation in time as well as in our moral life. As surrealist Luis Buñuel put it, “Life without memory is no life at all … Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing …” Equally, memory plays a crucial role in cognition, learning, and performance, so it is no surprise that many people feel particularly drawn to memory improvement strategies. Yet there are salient concerns when directly meddling with the human brain, including the risks associated with deep electrode insertion, such as infection, hemorrhage, seizure, and hardware complications. One might reasonably question whether a 15% memory improvement is worth such high-stakes risks.

Another concern is the potential for undesirable – but as yet undetermined – side effects. Those uncertainties are why it seems unlikely that such an approach will be used in healthy individuals or for mild cases of memory dysfunction. Still, closed-loop deep brain stimulation has alternative utility: it can be used to improve understanding of the specific brain targets most centrally related to certain memory functions, and that information can then inform less invasive interventions, such as transcranial magnetic stimulation (TMS).

The sorts of studies conducted by Ezzyat’s team and others overlook the fact that memories are not just physically located within the cranial cavity. We have external technologies such as photographs, videos, and agendas to help us remember, and so one might reasonably ask whether we really need invasive brain implants to achieve the same ends. The brain’s plasticity is equally overlooked: it is erroneous to assume that the same brain targets will bring equivalent outcomes for healthy individuals and for those with memory impairments. Moreover, the identified interventions improve memory encoding, but do not help with the many errors to which memory is perplexingly prone, such as misattribution, suggestibility, and bias. For healthy individuals, addressing those common memory errors could be more helpful than improving encoding with brain stimulation.

In addition, certain types of memory enhancement could bring new perspectives on one’s life, and even affect the ability to understand the past and imagine the future. In fact if we truly were to remember everything we encounter in our lives we might well be overburdened with memories, unable to focus on current experiences and afflicted by persistent memories of those things that we deem unimportant.

Open-loop neural implants already bring a different configuration of human agency and moral responsibility. Closed-loop implants with their ability to both stimulate and continuously monitor neural patterns bring further issues for consideration, such as neurosecurity (e.g. brain hacking) and mental privacy. Improved connectivity of this type of implant further enables the potential for malicious interference by criminals. Concerns about mental privacy figure prominently in other neurotechnologies, which, similar to brain implants, have the ability to access neural data correlated with intentions, thoughts, and behaviors. This enhanced proximity encroaches on the core of who we are as individuals, providing access to mental life that in the past was accessible only to oneself.

Finally, the media hype is itself problematic. The New York Times article described the 15% improvement observed in the Ezzyat study as a noticeable memory boost. This sort of inflated media coverage does a disservice to the good intentions and professional rigor of scientists and engineers, and misleads readers into being either overly optimistic or overly worried about the reported developments.

With these many considerations in mind, it is clear that direct brain stimulation will replace neither pharmaceuticals nor less invasive memory improvement options anytime soon. Those who crave memory improvement through memory intervention technologies might best be mindful of the aforementioned ethical and social considerations.

Laura Cabrera, PhD, is an Assistant Professor in the Center for Ethics and Humanities in the Life Sciences and the Department of Translational Science & Molecular Medicine at Michigan State University.

Join the discussion! Your comments and responses to this commentary are welcomed. The author will respond to all comments made by Thursday, May 10, 2018. With your participation, we hope to create discussions rich with insights from diverse perspectives.


More Bioethics in the News from Dr. Cabrera: Can brain scans spot criminal intent? | Forgetting about fear: A neuroethics perspective


A Non-Standard Practice of Medicine



In the mid-1950s, physician James Burt began modifying episiotomy repair; two decades later, he offered ‘love surgery’ as an elective. In early 1989, shortly after several women accused him on national television of performing an experimental surgery on them without their consent, Burt relinquished his medical license. The popular media mostly portrayed Burt as practicing outside the norms of medical practice, allowed to do so by his peers. But this narrative fails to consider questions about routine medical innovation that the Burt story brings forth. Historians (and bioethicists) have, for the most part, focused on infamous – think Tuskegee – unethical medical research. But what can the development of ‘love surgery’ tell us about normative surgical development, routine medical innovation, and informed consent for routine procedures since the 1950s?

Join us for Sarah B. Rodriguez’s lecture on Wednesday, March 18, 2015 from noon till 1 pm, in person or online.

Sarah B. Rodriguez, PhD, is a lecturer in the Medical Humanities and Bioethics Program in the Feinberg School of Medicine and in the Global Health Studies Program in the Weinberg College of Arts and Sciences at Northwestern University. Her area of research is in women’s reproductive and sexual health since the early twentieth century. Her first book, Female Circumcision and Clitoridectomy: A History of a Medical Treatment, was published in the fall of 2014.

In person: This lecture will take place in C102 East Fee Hall on MSU’s East Lansing campus. Feel free to bring your lunch! Beverages and light snacks will be provided.

Online: Here are some instructions for your first time joining the webinar, or if you have attended or viewed them before, go to the meeting!

Can’t make it? All webinars are recorded! View our archive of recorded lectures (over 30 lectures and counting!).