Recognizing and undoing bias in medical devices
It’s pretty much impossible to avoid bringing bias into anything we humans do. We’re shaped by our own experiences, and while we have enough in common to communicate ideas to each other, everyone views and experiences the world in a different way. Anyone embarking on a project brings their own assumptions along with them.
We might not even be aware of it. Often, we aren’t. But, some might think, that’s one reason we design machines to do certain jobs. They are objective, independent, inanimate objects. What possible bias could they have? Well, so far anyway, those machines don’t think on their own. A person programs them, in essence teaching them everything they know, and that person can’t help but build biases into the program. That’s why facial recognition technology often fails to differentiate between people of color. And it’s why even medical devices end up with racial or gender biases that can have a negative effect on health outcomes for certain groups. But there are ways to fix this.
AI issues
Artificial intelligence makes our lives easier in innumerable ways, and with its ability to perform calculations and filter information much faster than the human mind can, AI has led to some remarkable medical breakthroughs. It’s not perfect, however, and in a lot of cases that’s our fault. As we’ve seen from facial recognition technology, if programmers show the AI lots of images of lighter-skinned people and not many of dark-skinned people, it will be better at recognizing and differentiating between light-skinned people. It’s the same with AI medical devices that scan for skin cancer. If the AI hasn’t seen a lot of different types of skin, it won’t recognize potential problem areas on darker skin. It only knows what it’s been taught.
UCLA professor of electrical engineering and computer science Achuta Kadambi, whose work focuses on computer vision, machine learning, and medical devices, provided an example that might seem innocuous but points to a serious problem.
“Being fairly dark-skinned myself,” Kadambi told Scientific American, he sometimes has trouble with no-touch soap dispensers and faucets because they work by detecting light reflecting off skin. Once at an airport, “I had to ask a lighter-skinned traveler to trigger a faucet for me.”
That’s deeply annoying but not necessarily life-threatening. If a medical device can’t detect cancer on the skin of a dark-skinned person, though, the consequences are much graver.
Kadambi says three types of bias go into medical devices: physical bias (built into the mechanics of a device), computational bias (bias in the software or data that skews results), and interpretation bias (bias on the part of the human reading the data). With AI like this, the bias is largely computational: a person fails to give the program enough data, and because of that lack of data, the program makes errors for the people it has rarely seen.
This is actually fairly easy to solve: show the machine-learning system plenty of images of darker skin so it can account for different skin types in its calculations.
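To see how that plays out, here’s a minimal, purely synthetic sketch (a toy model on made-up numbers, nothing like a real dermatology system): train a classifier mostly on one group, check its accuracy group by group, then retrain with a balanced dataset.

```python
# A minimal, synthetic sketch: a toy classifier trained mostly on one group
# performs worse on the underrepresented group, and per-group evaluation
# exposes the gap. The data and the two "skin tone" groups are invented for
# illustration and have nothing to do with a real dermatology model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Fake 'scans' reduced to 2 features; label 1 means a suspicious lesion."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Training set: lighter-skin examples heavily overrepresented.
X_light, y_light = make_group(5000, shift=0.0)
X_dark, y_dark = make_group(100, shift=1.5)   # different feature distribution

model = LogisticRegression().fit(
    np.vstack([X_light, X_dark]),
    np.concatenate([y_light, y_dark]),
)

def report(m, label):
    # Evaluate on fresh, equally sized samples from each group.
    for name, shift in [("lighter skin", 0.0), ("darker skin", 1.5)]:
        X_test, y_test = make_group(2000, shift)
        print(f"{label} - accuracy on {name}: {m.score(X_test, y_test):.2f}")

report(model, "imbalanced training set")

# The fix described above: add plenty of darker-skin examples and retrain.
# The gap between groups narrows, though a richer model or better features
# would be needed to serve both groups well; the point is the disparity.
X_dark_big, y_dark_big = make_group(5000, shift=1.5)
balanced = LogisticRegression().fit(
    np.vstack([X_light, X_dark_big]),
    np.concatenate([y_light, y_dark_big]),
)
report(balanced, "balanced training set")
```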
Breathing uneasy
Especially important in the COVID era are accurate measurements of blood oxygen levels and lung capacity. With oximeters, which measure blood oxygen, skin color can again skew results. Oximeters obtain their reading by shining light through the skin and measuring how the blood absorbs it. In people with dark skin, the pigment interferes, and the devices can present readings that show blood oxygen levels to be higher than they really are.
Low oxygen levels can be a key indicator of COVID infection, and people with COVID-related hypoxemia often don’t show outward signs of breathing difficulty. So, if an oximeter says their blood oxygen is fine and they feel OK, patients and medical professionals can mistakenly believe there’s no problem until it’s too late. A University of Michigan study in 2020 found that oximeters missed low blood oxygen levels in Black patients nearly three times as often as they did in white patients.
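A deliberately oversimplified, hypothetical illustration shows why that’s so dangerous; the bias offset and the treatment threshold below are invented for the example, not taken from the Michigan study or any particular device.

```python
# Hypothetical illustration only: the upward bias and the treatment threshold
# are invented to show the mechanism, not values from any study or device.
HYPOXEMIA_THRESHOLD = 92   # SpO2 (%) below which a clinician would investigate
ASSUMED_BIAS = 3           # assumed overestimate (percentage points) on darker skin

for true_spo2 in [96, 93, 91, 89, 88]:
    displayed = min(true_spo2 + ASSUMED_BIAS, 100)   # the device reads high
    truly_low = true_spo2 < HYPOXEMIA_THRESHOLD
    looks_fine = displayed >= HYPOXEMIA_THRESHOLD
    flag = "MISSED hypoxemia" if (truly_low and looks_fine) else ""
    print(f"true {true_spo2}% -> displayed {displayed}%  {flag}")
```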
“Given the increased mortality amongst ethnic minority patients during the Covid-19 pandemic, it is possible that the differential accuracy of pulse oximetry is a contributing factor to this health inequality,” a review by the U.K.’s NHS Race and Health Observatory concluded.
Fixing this is a bit more difficult, as it requires a different method of measuring blood oxygen, but recognizing the drawbacks of current medical devices is an important first step. At the very least, doctors can let patients of color know that oximeters alone might not be enough.
In the case of spirometers, which measure lung capacity, the medical device itself is fine, but there’s often interpretation bias. Kadambi attributes this to a reliance on outdated studies indicating that Black and Asian people have lower baseline lung capacities. As a result, measurements from patients of color are often “corrected” in a way that renders them inaccurate.
“For example, before ‘correction’ a Black person’s lung capacity might be measured to be lower than the lung capacity of a white person,” Kadambi wrote in the journal Science. “After ‘correction’ to a smaller baseline lung capacity, treatment plans would prioritize the white person, because it is expected that a Black person should have lower lung capacity, and so their capacity must be much lower than that of a white person before their reduction is considered a priority.”
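A quick hypothetical sketch of that arithmetic (the reference volume and the “correction” factor here are invented for illustration, not real clinical values) shows how the adjustment can flip which patient gets prioritized.

```python
# Hypothetical numbers only: the reference volume and the race "correction"
# factor are invented to illustrate the mechanism Kadambi describes.
REFERENCE_CAPACITY = 5.0                             # liters, one baseline for everyone
RACE_CORRECTION = {"white": 1.00, "Black": 0.85}     # the kind of factor old studies baked in

patients = [
    {"name": "Patient A", "race": "white", "measured": 4.3},
    {"name": "Patient B", "race": "Black", "measured": 4.0},
]

for p in patients:
    # Uncorrected reading: compare everyone to the same baseline.
    plain_deficit = REFERENCE_CAPACITY - p["measured"]
    # "Corrected" reading: shrink the expected capacity for Black patients,
    # which shrinks their apparent deficit.
    corrected_deficit = REFERENCE_CAPACITY * RACE_CORRECTION[p["race"]] - p["measured"]
    print(f'{p["name"]}: uncorrected deficit {plain_deficit:.2f} L, '
          f'"corrected" deficit {corrected_deficit:.2f} L')

# Patient B has the lower measured capacity (4.0 L vs 4.3 L), but after the
# "correction" their deficit looks smaller (0.25 L vs 0.70 L), so Patient A
# would be prioritized: the reversal described in the quote above.
```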
Here, the fix is simple. Acknowledge the old studies are flawed, stop relying on them, and stop “correcting.”
The right fit
Even for medical devices that are decidedly low-tech and made for health professionals, there can be glaring deficiencies that endanger safety. Such is the case with protective masks, which have long been commonplace in certain medical settings and became ubiquitous during the pandemic. You might imagine face masks to be more or less one-size-fits-all, but functionally they are more like one-size-fits-white-men.
Studies on the efficacy of such masks tend to oversample whites and males, so when they conclude masks are sized and shaped well to prevent bacteria and viruses from getting through, they can be neglecting and endangering other groups.
“Adequate viral protection can only be provided by respirators that properly fit the wearer’s facial characteristics,” an Association of Anaesthetists review article from September 2020 said. “Initial fit pass rates vary between 40% and 90% and are especially low in female and in Asian healthcare workers.”
If you’re wearing a mask that doesn’t fit right, you’re more vulnerable to catching something, an obvious problem if you work at a hospital or in any job that involves frequent contact with many different people during a respiratory pandemic.
Getting enough masks to where they were needed might have taken a herculean effort at the outset of the pandemic, but by now there are many companies making them. They certainly have the capacity and scalability to make masks in various shapes and sizes.
Not every problem has an easy fix, but some racial and gender-based biases in medical devices do. First, though, we have to recognize they exist.