Beyond Bias: How Hidden Racism in Medical Devices Legitimizes Oppression

by Ziggy

I spend a lot of time thinking about whether the much-lauded machine learning algorithms I tweak and create in my bioengineering classes will actually produce something that truly serves Black and Brown communities, or simply reproduce heinous stigmas from white supremacy and colonialism.

Having dark skin, I’ve had my fair share of soap dispenser fails. Soap dispensers are typically activated by light bouncing off our hands. Darker skin absorbs more light than lighter skin, so less light bounces off dark-skinned hands to be read by the sensor. This is a clear indication that these dispensers are not being tested on or designed for people with dark skin. If they were, the ‘gain’, which is the amount of reflected light needed to activate the sensor, would be adjusted accordingly.
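To make the mechanism concrete, here is a minimal sketch of the failure in Python. The reflectance values and thresholds are invented for illustration, not measured from any real dispenser:

```python
# Minimal sketch of a reflectance-triggered soap dispenser.
# All numbers are illustrative assumptions, not real sensor data.

LIGHT_SKIN_REFLECTANCE = 0.60  # fraction of emitted light bounced back
DARK_SKIN_REFLECTANCE = 0.25   # darker skin absorbs more, reflects less
EMITTED_LIGHT = 100.0          # arbitrary sensor units

def sensor_reading(reflectance: float) -> float:
    """Amount of light returned to the photodetector."""
    return EMITTED_LIGHT * reflectance

def dispenser_fires(reading: float, gain_threshold: float) -> bool:
    """The dispenser activates only if enough light comes back."""
    return reading >= gain_threshold

# A gain threshold tuned only on light-skinned testers:
threshold = 50.0
print(dispenser_fires(sensor_reading(LIGHT_SKIN_REFLECTANCE), threshold))  # True
print(dispenser_fires(sensor_reading(DARK_SKIN_REFLECTANCE), threshold))   # False

# Calibrating the gain across the full range of skin tones fixes this:
inclusive_threshold = 20.0
print(dispenser_fires(sensor_reading(DARK_SKIN_REFLECTANCE), inclusive_threshold))  # True
```

The fix is not exotic; it is simply choosing the gain with the full range of skin tones in the test pool.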

Computational, Physical, and Interpretation Bias

In more recent years, I've encountered nurses raving about the excellent oxygen saturation in my blood. I've then been shocked and disturbed to see the blood oxygen metrics in my medical chart followed by the racialized disclaimer: "if African American." Though it is easy to overlook, this three-word notation on my medical file reveals the worlds of racism embedded in technological and diagnostic design.

Our medical devices are far from benign tools. Researcher Achuta Kadambi outlines three types of bias in medical devices: physical, computational, and interpretation bias. In my medical chart, the race correction reflects an interpretation bias. Physical biases include the failed soap dispensers I’ve experienced; I imagine many readers have as well. Another example is the pulse oximeter, the device that measures blood-oxygen levels and usually clips onto a person’s fingertip to get a reading. Neither the soap dispensers nor the oximeters are ‘calibrated’ properly for people with darker skin. Due to this physical bias, oximeters may be giving me a higher reading than is accurate. There is evidence that the pulse oximeter’s “discriminatory design,” as Dr. Ruha Benjamin puts it, may be contributing to the racist disparities in COVID-19 deaths. Black people, being more likely to have darker skin, get an inflated oxygen saturation reading and, because of this, may not receive proper life-saving care. Another example of a physical bias in medical devices is respiratory masks like N95s not fitting Asian people’s faces properly.
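To see how a small calibration bias becomes a clinical failure, consider this deliberately simplified sketch. The numbers are invented; real oximeters derive SpO2 from a red/infrared absorbance ratio through empirically fitted calibration curves, and the bias varies between devices:

```python
# Illustrative numbers only: not real calibration data.

def displayed_spo2(true_saturation: float, device_bias: float) -> float:
    """What the oximeter shows: true arterial saturation plus device bias."""
    return min(100.0, true_saturation + device_bias)

TREATMENT_THRESHOLD = 92.0  # hypothetical cutoff below which care escalates

true_sao2 = 90.0  # this patient actually needs supplemental oxygen
for bias in (0.0, 3.0):  # a 3-point overestimate stands in for calibration bias
    shown = displayed_spo2(true_sao2, bias)
    flagged = shown < TREATMENT_THRESHOLD
    print(f"bias={bias}: display={shown}, flagged for care={flagged}")
# bias=0.0: display=90.0, flagged for care=True
# bias=3.0: display=93.0, flagged for care=False
```

A few invisible points of overestimation are enough to move a patient from "treat now" to "looks fine," which is exactly the occult harm described above.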

The final example of physical bias that I want to discuss is remote plethysmography, a technology that measures heart rate by analyzing live or recorded video. It is typically programmed to pick up blush-like changes in the skin; however, it does not work as well for people with darker skin. Researchers like Kadambi are finding other ways to extract the signal. Kadambi’s UCLA laboratory is working on analyzing video images with thermal wavelengths instead of visible light, and a team at MIT created a method that reads the tiny changes in head motion that occur when the heart beats.
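For readers curious how the visible-light approach works, here is a bare-bones sketch assuming a NumPy array of RGB video frames of a skin region; production systems add face tracking, detrending, and far more robust signal processing:

```python
import numpy as np

def estimate_heart_rate_bpm(frames: np.ndarray, fps: float) -> float:
    """Estimate pulse from subtle color changes in skin.

    frames: array of shape (num_frames, height, width, 3), RGB video
    of a skin region. The green channel carries the strongest
    blood-volume signal in visible light.
    """
    green = frames[:, :, :, 1].mean(axis=(1, 2))  # mean green value per frame
    green = green - green.mean()                  # remove the DC offset
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
    # Only consider physiologically plausible heart rates (40-180 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0
```

Because everything here rides on light reflected from the skin, the signal-to-noise ratio drops for darker skin, which is exactly why the thermal and motion-based alternatives matter.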

These are great steps toward mitigating bias in medical tech; however, they address only the technical aspects of physical bias. An unaddressed concern is that scientists and engineers may have a surface-level understanding of how our devices are biased, and so can only offer technical fixes that do not easily translate into better practice across the field of biomedical engineering. What is needed is a larger movement in which scientists and engineers understand the root cause of these biases: how they are underpinned by entrenched medical racism.

Computational biases are also very pervasive and include a wide array of machine learning algorithms. Machine learning is a type of artificial intelligence that enables software to improve its accuracy at predicting outcomes without being explicitly programmed to do so. In short, the software is able to learn. An example of computational bias involving machine learning is the algorithms used for detecting skin cancer. These algorithms do not work as well for people with dark skin because the databases used to train the models do not contain information on race or skin type, and the ones that do tend to have very few images of dark skin.
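One way to make this failure visible is to report a model's accuracy stratified by skin type rather than in aggregate. The sketch below uses synthetic labels and made-up error rates purely for illustration:

```python
import numpy as np

def stratified_accuracy(y_true, y_pred, skin_type) -> None:
    """Report accuracy per skin-type group, not just in aggregate."""
    y_true, y_pred, skin_type = map(np.asarray, (y_true, y_pred, skin_type))
    print(f"overall accuracy: {(y_true == y_pred).mean():.2f}")
    for group in np.unique(skin_type):
        mask = skin_type == group
        acc = (y_true[mask] == y_pred[mask]).mean()
        print(f"  skin type {group}: n={mask.sum()}, accuracy={acc:.2f}")

# Toy illustration: 95% of the test set is lighter skin (Fitzpatrick I-III),
# where the hypothetical model does well; the 5% dark-skin slice (IV-VI)
# does far worse, yet the aggregate number still looks impressive.
rng = np.random.default_rng(0)
skin = np.array(["I-III"] * 950 + ["IV-VI"] * 50)
truth = rng.integers(0, 2, size=1000)
pred = truth.copy()
pred[:950] ^= rng.random(950) < 0.10  # ~90% accurate on light skin
pred[950:] ^= rng.random(50) < 0.40   # ~60% accurate on dark skin
stratified_accuracy(truth, pred, skin)
```

Of course, this audit is only possible when skin type is recorded at all, which is precisely what most of these databases lack.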

Our Bodies are Not a Problem

Computational or algorithmic biases can also be carceral. One example is medical hot-spotting, which uses Geographic Information Systems (GIS) technologies and spatial profiling to identify populations considered medically vulnerable. Benjamin notes that medical hot-spotting reproduces classificatory stigma that restricts people’s life chances instead of helping “underserved” populations. These classificatory stigmas come from the use of terms such as “medically vulnerable,” “high utilizers,” and “socially disintegrated.” “Socially disintegrated” has been used to describe people who “tend to not engage in self care, have few family resources, and display dependent personalities.” This focus on self-care and personality by medical hot-spotting and its care teams has distracted from the root causes of higher healthcare costs for marginalized populations and often feeds into the idea that Black patients are especially non-compliant.

Another issue with this categorization, as Krupar and Ehlers point out, is that using the phrase “high utilizers” for the people most in need of healthcare resources has the potential to create more stigma and become like the racist and classist term “welfare queen.” Finally, the term “medically vulnerable” functions as a euphemism which upholds ableist stereotypes by implying weakness or blame. It conceals that people with concurrent healthcare needs are often members of communities targeted by multiple oppressions and/or residents of high-pollution communities with limited access to resources that support health. These unjust conditions, which play a huge role in causing illness and health challenges, go unmentioned and unaccounted for when terms like “medically vulnerable” are used to justify medical surveillance.

Medical hot-spotting was inspired by Compstat, a policing technique that compiles data and uses GIS to find areas with high crime rates. It has been shown that the same zones are targeted by both the policing and the medical hot-spotting techniques. Healthcare adopted hot-spotting from policing in 2009 in Camden, New Jersey, when Dr. Jeffrey Brenner, serving as a citizen member of a police reform commission, repurposed his rejected crime map of Camden for healthcare delivery. It is worth noting that Brenner created this crime map with data from local hospital emergency departments. Krupar and Ehlers also observe that through medical hot-spotting, disabled people’s bodies are treated as problems that need fixing. Coupled with the fact that crime hot-spotting targets the same people, this creates a life under heightened surveillance in which disabled BIPOC must often comply with treatment under the Medical Industrial Complex to avoid the violence of policing. There is an unfortunate irony in this truth, as the care these targeted people receive stems from policing.

Often, conversations around bias in tech and medical devices mention that it is “unintentional,” making it seem like an unavoidable accident. Racism is not an accident. It is quite deliberate and intentional in its goals. Due to their institutional power and privilege, scientists and physicians can be especially resistant to acknowledging their role in upholding colonial stigmas, or even their lack of knowledge in certain areas. These dynamics perpetuate the narrative of “unintentional” bias. In many cases, the intent of racist design is clear. Dr. John Hutchinson’s development of the spirometer in 1846 is one such case. Dr. Hutchinson included a feature in the spirometer to “race correct,” meaning the device and its user assumed that Asian and Black people had a smaller lung capacity. Black and Asian people would have to do considerably worse on their spirometer tests than their white counterparts in order to receive care, which resulted in lower payouts from insurance companies. Dr. Hutchinson was hired by insurance companies to ensure exactly that.
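The arithmetic behind race correction is simple, and that is what makes it so consequential. Here is a sketch with toy numbers; the prediction equation is invented, and the 0.85 scaling factor is representative of historical spirometry practice rather than a figure from Hutchinson's device:

```python
# Illustrative sketch of how a race "correction" works arithmetically.
# The prediction equation and all numbers are toy assumptions.

def predicted_normal_fvc(age: int, height_cm: float, race_corrected: bool) -> float:
    """Toy 'predicted normal' lung capacity in liters."""
    baseline = 0.03 * height_cm - 0.02 * age
    if race_corrected:
        baseline *= 0.85  # a lowered "normal" imposed on Black patients
    return baseline

def qualifies_for_care(measured_fvc: float, predicted: float) -> bool:
    """Flag impairment only below 80% of 'predicted normal'."""
    return measured_fvc < 0.80 * predicted

measured = 3.2  # liters: the identical measurement for two patients
white_norm = predicted_normal_fvc(40, 175, race_corrected=False)
black_norm = predicted_normal_fvc(40, 175, race_corrected=True)
print(qualifies_for_care(measured, white_norm))  # True  -> flagged, receives care
print(qualifies_for_care(measured, black_norm))  # False -> "normal", care denied
```

The same breath, the same lungs, a different verdict: the correction is applied not to the measurement but to the standard the measurement is judged against.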

Looking Onward to Liberatory Design

As deeply entrenched as these biases are in our world, there is still hope and resistance. There are dedicated and holistically informed researchers, such as Ruha Benjamin and Achuta Kadambi, working to build a world without carceral and racist technologies, as well as a broader influx of interest in removing computational biases. Another glimmer of hope is that last year The Lancet, one of the largest medical journals, published an article calling for the abolition of race correction.

It is pivotal that we really grasp that race is socially constructed and yet still has real consequences. We must not create harmful inclusion for BIPOC, which is what the interpretation bias of race correction does. We can rid our world of these constructs of hierarchy that claim Black people are biologically inferior and “innately infirmed.” We are taught that science as we know it today is all about objective truth, which makes the exploitation, hierarchy, and violence that racist scientific constructs create all the more insidious. It makes racism out to be the natural order of life, as opposed to the deliberate doings of white supremacy and colonialism. One example is physicians and scientists attributing the unequal racialized impacts of COVID-19 to genetics instead of to lack of access to quality healthcare and, as mentioned earlier, to biases in medical devices such as the oximeter. Instead, let’s create something fresh, like DIY laboratories where community members can share ideas, and create spaces where the “professional” and the so-called “lay-person” – those of us who are experts because we are the most impacted – are not so far apart.

We need creative destruction, building and dismantling, in our journey to create medical devices that serve the needs of all. In The Tragedy of American Science, Clifford Conner uses the term “Big Science” to refer to the corporatization and militarization of science, in which corporations keep their own scientists to prove that their products are harmless and neuroengineers create neuroenhancements for soldiers, bolstering Western imperialism. This is what leads to these discriminatory designs and violence against People of Color / People of the Global Majority. It is my hope that one day Big Science will be no more and we will, without romanticizing the past, get back to the curiosity of our ancestors.

Ziggy Waller