Healthcare: When AI Biases Become Dangerous

December 31, 2020 - 9 minute read

Algorithms have performed well when they’re looking at a greyscale radiologic image that carries no trace of skin color. But a growing body of evidence points to the biases baked into the data and assumptions these algorithms operate on. Those biases are magnified when an algorithm is used in healthcare, where a patient’s life and livelihood could be at stake based on its findings.

It turns out that algorithms are often as biased as the people who code them (usually white and male), and it’s no surprise that those biases skew against black people, Latinx populations, women of color, and gender minorities (like transgender patients). A study of Boston-area patients uncovered the depth of harm caused by an algorithm used to estimate kidney function: black patients received healthier-looking scores, despite the black population suffering disproportionately from chronic disease and inferior healthcare compared to white patients.

The Wrong Math

The worst part is that doctors trust, often blindly, algorithms that analyze patient medical records or test results to make life-changing decisions about patient care. Many of these algorithms factor in the patient’s race, but because they are black boxes (meaning a user can’t look inside at the inner workings of the code), a doctor may never know that the artificial intelligence (AI) application they’re using is racist or sexist.

The problem is amplified when algorithms contain racially skewed math. The Boston-area study analyzed the medical records of 57,000 patients in the Mass General Brigham health system who had chronic kidney disease. More than 700 black patients, representing one-third of the black patients analyzed, were assigned a healthier score than their test results warranted. If those patients’ data had been run through the formula used for white patients, they would’ve been placed into a more severe category of kidney disease. For 64 of them, the race-neutral score would’ve qualified them for placement on a kidney-transplant waitlist.

Mallika Mendu, a kidney specialist at Brigham and Women’s Hospital and an assistant professor at Harvard Medical School, says the results were “really staggering” to her. She quickly stopped using the race-based calculation with her patients, noting that there are “already other disparities in access to care and management of the condition. This is not helping.”

In 2019, many health systems discovered that the software they were using to prioritize access to special care for patients with chronic conditions was systematically favoring white patients over black patients. Even though it didn’t explicitly use race in its calculations, other factors, like medical history, poverty, and the number of previous healthcare visits, still disproportionately deprioritized black patients. This is just another example of how wrong math can damage a patient’s health and even hasten death.
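To make the proxy effect concrete, here is a minimal, entirely synthetic Python sketch. Every number, group label, and the 30% “access suppression” factor is hypothetical and chosen only for illustration: two groups have identical medical need, but one group’s observed healthcare utilization is reduced by access barriers, so a “race-blind” program that ranks patients by that proxy still enrolls them at a lower rate.

```python
# Synthetic demo: a risk score that never sees race can still encode racial bias
# when it is trained on a proxy outcome (observed spending) instead of true need.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True medical need is identically distributed in both groups.
need_a = rng.gamma(shape=2.0, scale=1.0, size=n)  # group A
need_b = rng.gamma(shape=2.0, scale=1.0, size=n)  # group B

# Group B faces access barriers, so observed spending understates its need.
spend_a = need_a * 1000
spend_b = need_b * 1000 * 0.7  # hypothetical 30% suppression from reduced access

# A "race-blind" program ranks everyone by spending and enrolls the top 10%.
spend_all = np.concatenate([spend_a, spend_b])
group = np.array(["A"] * n + ["B"] * n)
cutoff = np.quantile(spend_all, 0.90)
enrolled = spend_all >= cutoff

for g in ("A", "B"):
    share = enrolled[group == g].mean()
    print(f"Group {g}: {share:.1%} enrolled despite identical medical need")
```

Running the sketch shows group B enrolled at a much lower rate than group A, even though the program never looked at a race variable.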

The Algorithm Problem

The kidney algorithm isn’t the only one that uses race in its calculations. Race appears in algorithms that screen for cancer, assess lung function, and even evaluate brain injuries. In August, the NFL was sued by a group of retired black players who alleged that the league used an algorithm that assumes white people have higher baseline cognitive function. That algorithm was used to calculate compensation for brain injuries, resulting in lower payouts for black players.

Thankfully, there are people with influence and power who want to do something about this unfair treatment of minorities. Democratic Representative Richard Neal of Massachusetts says the kidney study is a clear example of why the use of race in medical algorithms needs to be reconsidered. He says, “Many clinical algorithms can result in delayed or inaccurate diagnoses for Black and Latinx patients, leading to lower-quality care and worse health outcomes.”

Neal has gone a step further and asked the Centers for Medicare & Medicaid Services and medical societies to investigate the impact that race-based clinical algorithms have on patients. Other congressional representatives have also asked the Department of Health and Human Services to open an investigation into race-based healthcare algorithms.

A Huge Miscalculation

One major issue unveiled by the Boston-area study involves a standard calculation called CKD-EPI, which converts the result of a blood test for creatinine (a waste product) into an estimate of kidney function called eGFR (estimated glomerular filtration rate). A lower score means worse kidney function, which in turn determines how severe the disease is judged to be and what care is needed. The calculation uses the patient’s age and sex, but black patients also had an additional 15.9% added to their score.
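To see what that adjustment does in practice, here is a small Python sketch of the 2009 creatinine-based CKD-EPI equation as it is commonly documented. The coefficients are quoted from the published formula, but the example patient and the eGFR-of-20 transplant-referral threshold are used here purely for illustration; this is not a clinical calculator.

```python
# A sketch of the 2009 CKD-EPI creatinine equation as commonly documented,
# showing how the 1.159 "race correction" multiplier inflates the reported eGFR.
# Illustrative only -- not a clinical tool.
def egfr_ckd_epi_2009(creatinine_mg_dl, age, female, black):
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine."""
    kappa = 0.7 if female else 0.9          # sex-specific constant
    alpha = -0.329 if female else -0.411    # sex-specific exponent
    ratio = creatinine_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race multiplier: same blood test, ~16% "healthier" score
    return egfr

# Hypothetical patient: same age, sex, and creatinine; only the race flag differs.
# Transplant referral is commonly tied to an eGFR at or below 20.
without_multiplier = egfr_ckd_epi_2009(3.4, age=55, female=False, black=False)
with_multiplier = egfr_ckd_epi_2009(3.4, age=55, female=False, black=True)
print(f"eGFR without race multiplier: {without_multiplier:.1f}  (qualifies for waitlist referral)")
print(f"eGFR with race multiplier:    {with_multiplier:.1f}  (does not)")
```

The same blood test result lands on opposite sides of the referral threshold depending only on whether the race flag is set.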

The researchers who created the formula in 2009 added the 15.9% as a “race correction” because black patients in their study data had, on average, higher creatinine levels for the same measured kidney function. But according to Nwamaka Eneanya, an assistant professor at the University of Pennsylvania who also worked on the new Boston study, that doesn’t really explain why the relationship between creatinine and kidney function would differ for black patients. Eneanya notes that simple things that change creatinine levels, like diet, weren’t even taken into account. She added that race is a social category, not a physiological one, so it shouldn’t be used to interpret blood tests.

Eneanya says the race-based eGFR formula needs to be abandoned, and that its effects have already been life-changing for patients. “Any degradation of treatment for these already marginalized groups could have profound results,” she added. Several major U.S. health systems, including Mass General Brigham, the University of Washington, and Vanderbilt, have already dropped the race-based eGFR formula. These institutions are turning their attention to a new way of calculating eGFR that relies on a different blood test, for the protein cystatin C.
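For comparison, the cystatin C route yields a race-free estimate. The sketch below uses the 2012 CKD-EPI cystatin C equation as it is commonly documented; it adjusts only for age and sex, with no race term at all. The example values are hypothetical and this is not a clinical tool.

```python
# A sketch of the 2012 CKD-EPI cystatin C equation as commonly documented.
# Unlike the creatinine-based formula above, it contains no race term.
# Illustrative only -- not a clinical calculator.
def egfr_cystatin_c_2012(cystatin_c_mg_l, age, female):
    """Estimated GFR in mL/min/1.73 m^2 from serum cystatin C."""
    ratio = cystatin_c_mg_l / 0.8
    egfr = 133.0 * min(ratio, 1.0) ** -0.499 * max(ratio, 1.0) ** -1.328 * 0.996 ** age
    if female:
        egfr *= 0.932
    return egfr

# Hypothetical patient: the same estimate is produced regardless of race.
print(f"eGFR from cystatin C: {egfr_cystatin_c_2012(2.3, age=55, female=False):.1f}")
```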

But unless medical societies change their guidelines, many institutions and doctors will continue to use the race-based eGFR calculation. The two main U.S. kidney care organizations, the National Kidney Foundation and the American Society of Nephrology, have formed a joint task force to address the problem, and more than 1,300 people have signed a petition urging change.

Major Changes Moving Forward

The fact of the matter is that the traditional eGFR formula is a small part of a growing problem. The medical community, institutions, and societies need to review black patients’ care plans, revise how they train new doctors, and reframe how they think about race. Any new medical application, especially one likely to be relied upon heavily, needs to be transparent in its calculations and adaptable as new research findings emerge.


If you think you’re not affected by race-based algorithms, think again. Any algorithm that uses race in its calculations is already incorrectly analyzing the problem at hand. Even worse, it could mean the difference between life and death for every patient whose provider trusts the algorithm.
