In August, a group of Black former NFL players filed a lawsuit against the league claiming its system for awarding compensation for brain injuries was racially tilted. The suit blamed the NFL’s use of a health care algorithm that assumes white people have intrinsically higher cognitive function than Black people.
The NFL called the suit “misguided.” But Thursday, four Democratic lawmakers asked the Department of Health and Human Services to investigate that cognitive assessment tool and other health care algorithms that use race as a factor. Their letter warns that including race in such calculations could cause doctors to make decisions that worsen health outcomes for people of color.
The letter was written by Senators Cory Booker (D-New Jersey), Elizabeth Warren (D-Massachusetts), and Ron Wyden (D-Oregon), as well as US Representative Barbara Lee (D-California). They cite examples of algorithms, including a tool that estimates Black and Latinx women have a smaller chance of a successful vaginal birth after a prior cesarean section. Another is a kidney-function score, commonly used to make decisions about transplants and other care, that rates a person’s disease as less severe if they are Black.
“To reduce health disparities among communities of color, we must ensure that medicine and public health organizations take a staunchly anti-racist approach to medical care and reevaluate the ways in which current practices, including the use of race-based algorithms, could be worsening outcomes for people of color,” the letter says. It argues the coronavirus pandemic, which has disproportionately sickened and killed people of color, makes the matter more serious.
The letter was sent to the Agency for Healthcare Research and Quality, a division of HHS that works on research and tools to improve health care and policy. It asks the agency to assess the use of race-based clinical algorithms, any harms they are causing, and how medical standards might be changed.
Nwamaka Eneanya, an assistant professor at the University of Pennsylvania, welcomes the letter. She has studied the kidney tool, eGFR, and says there’s good reason to think it puts Black patients at a disadvantage, compounding existing health disparities. Black patients are effectively required to get sicker than white patients before they can access specialist care, she says. The tool has been superseded by an alternative that doesn’t use race, and abandoned by some major US hospitals, including Massachusetts General and University of California San Francisco.
More broadly, Eneanya says the medical community needs to recognize that including race alongside biological measures like body mass index or blood test results is bad science. “We need to be more responsible and stop putting social constructs into these tools,” she says.
Eneanya is a member of a task force of the American Society of Nephrology and National Kidney Foundation reassessing the use of race in diagnosing kidney disease. She says other research and medical societies should take up similar projects to prevent race from skewing clinical decisions.
Algorithms are becoming more common in health care, in part because they offer the potential for more effective care. But concerns about how they influence who receives care are growing, too. Last year, Booker and Wyden wrote to the Federal Trade Commission and Centers for Medicare and Medicaid Services after a study showed that widely used software for prioritizing patients for extra care effectively put white patients ahead of Black patients.
The lawmakers’ letter cites six health algorithms that use race. A recent research paper on the topic listed more than a dozen, in areas including cancer and lung care. Lead author Darshali Vyas, a second-year resident physician at Mass General, believes plenty of other such algorithms remain to be discovered.
Vyas hopes more attention to the issue will help flush them out, and draw more resources to studying how race-based algorithms have affected patients’ health. “The next important step is to quantify any harm done and see what the consequences have been for our patients,” she says.