Microaggressions have a big impact on disabled users online

Originally published by the Cornell Ann S. Bowers College of Computing and Information Science

By Patricia Waldron

In person, people with disabilities often experience microaggressions – comments or subtle insults based on stereotypes – but similar interactions, as well as new types of microaggressions, play out online as well.

A new study by researchers at Cornell and Michigan State University finds that those constant online slights add up. Interviews revealed that microaggressions affect the self-esteem of people with disabilities and change how they use social media. Ideally, digital tools will one day reduce the burden for marginalized users, but because microaggressions are subtle, they can be hard for algorithms to detect.

“This paper brings a new perspective on how social interactions shape what equitable access means online and in the digital world,” said Sharon Heung, a doctoral student in information science. Heung presented the study, “Nothing Micro About It: Examining Ableist Microaggressions on Social Media,” Oct. 26 at ASSETS 2022, the Association for Computing Machinery SIGACCESS Conference on Computers and Accessibility.

Previously, little was known about online microaggressions. “If you look at the discourse around harms emanating from social media use by communities that are vulnerable, there is almost no work that focuses on people with disabilities,” said co-author Aditya Vashistha, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science. “It’s surprising because about one in seven people in the world has a disability.”

When microaggressions occur in live settings, they are often ephemeral, with few bystanders. In contrast, “when they happen on social media platforms, it’s happening in front of a large audience – the scale is completely different,” said Vashistha, “and then they live on, for people to see forever.”

Additionally, social media platforms can amplify microaggressions, potentially spreading misinformation. “Online microaggressions have the ability to shape the understandings of disability for a lot of people who are not even involved in the situation,” said co-author Megh Marathe, assistant professor of media, information, bioethics, and social justice at Michigan State. “We’re very concerned about how it’s shaping the way the broader audience thinks about disability and disabled people.”

Heung and co-author Mahika Phutane, a doctoral student in computer science, interviewed 20 volunteers who self-identified as having various disabilities, and who were active on social media platforms. They asked the participants to describe subtle discrimination and microaggressions they had experienced, and the impact on their lives.

Patronizing comments like "You're so inspiring" were the most common, along with infantilizing posts, like "Oh, you live by yourself?" People also asked inappropriate questions about users' personal lives and made assumptions about what the person could do or wear based on their disability. Some users were told they were lying about their disability, or that they didn't have one, especially if that disability was invisible, such as a mental health condition.

The researchers categorized the responses into 12 types of microaggressions. Most fit into categories previously recognized in offline interactions, but two were unique to social media. The first was "ghosting," or ignored posts. The second was ways the platforms themselves were inaccessible to people with various disabilities. For example, some users said they felt unwelcome when people did not add alt text to photos or used text colors they couldn't discern. One person with dwarfism said her posts were continually removed because she kept getting flagged as a minor.

After experiencing a microaggression, users had to make the tough choice of how to respond. Regardless of whether they ignored the comment, reported it, or used the opportunity to educate the other person, participants said it took an emotional toll and damaged their self-esteem. Many took breaks from social media or limited the information they shared online to protect themselves.

“Addressing this problem is really hard,” said Phutane. “Social media is driven to promote engagement, right? If they educate the perpetrator, then that original post will just get more and more promoted.”

Most social media platforms already have moderation tools, but reporting systems are sometimes flawed, lack transparency, and can misidentify harassment. The participants proposed that platforms automatically detect and delete microaggressions, or that a bot pop up with information about disabilities.

However, microaggressions can be hard for automated systems to detect. Unlike hate speech, where algorithms can search for specific words, microaggressions are more nuanced and context-dependent.

Once the scope and types of microaggressions experienced by people from various marginalized groups are better understood, the researchers think tools can be developed to limit the burden of dealing with them. These issues are important to address, especially with the potential expansion of virtual reality and the “metaverse.”

“We need to be especially vigilant and conscious of how these real-world interactions get transferred over to online settings,” said co-author Shiri Azenkot, associate professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech and Cornell Bowers CIS. “It’s not just social media interactions – we’re also going to see more interactions in virtual spaces.”

This work was partially supported by the National Science Foundation Graduate Research Fellowship and the University of California President’s Postdoctoral Fellowship.

Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

Welcoming two new faculty to the Center for Bioethics and Social Justice

As 2021 comes to an end, we are excited to introduce two faculty members who will be joining the Center in 2022. Please join us in welcoming them to Michigan State University.

Jennifer McCurdy, PhD, will join the Center in January. Dr. McCurdy was most recently a Multicultural Postdoctoral Fellow in the Philosophy Department at the University of Alaska Anchorage. She is a critical social bioethicist whose work focuses on understanding and eliminating racial and colonial injustices in contemporary health settings and communities. Currently, Dr. McCurdy is working on a scoping review of Indigenous values in the bioethics literature, and she is co-leading a series of Hastings Center special reports on racism and bioethics.

Dr. McCurdy received her PhD in religious studies with emphasis in ethics, colonialism, and critical religious studies from the University of Denver and Iliff School of Theology in 2019. She also holds a Master of Humanities with emphasis in philosophy and bioethics, a Bachelor of Science in nursing, and an HEC-C (Healthcare Ethics Consultant-Certified).

Megh Marathe, PhD, will join the Center next fall with a joint appointment in the Department of Media and Information in the College of Communication Arts and Sciences. Dr. Marathe is currently a President’s Postdoctoral Fellow in the Department of Informatics at the University of California, Irvine. They received their PhD in information from the University of Michigan in 2021.

Dr. Marathe’s research seeks to foster dialogue between expert knowledge and lived experience in the domain of health. Their recent work showed that for both doctors and patients, the boundary between pathologic and normal events is fluid, dynamic, and porous in epilepsy and other episodic conditions. Calling an event a seizure affects the patient’s financial stability, social participation, and life aspirations, and hence, both patients and providers take an expedient approach to diagnosing seizures.

Dr. Marathe’s work advances the fields of information studies, disability studies, and science and technology studies, and generates practical implications for inclusive healthcare in the era of technologized medicine. They are actively seeking collaborators for new projects that: 1) support patients with childhood-onset cancer or epilepsy in the transition to adult care, and 2) examine how neural implants affect medical practice and patient experience. Visit Dr. Marathe’s website to learn more about their work.