Five takeaways from STAT’s investigation of bias in the use of health-cost analytics software

This post originally appeared on StatNews.

By crunching data on patients, software developers promise to help U.S. hospitals and insurers accomplish a crucial task: identifying those most in need of stepped-up care to manage their chronic illnesses.

But a STAT investigation found that these software systems are infusing racism into health care by systematically overlooking obstacles faced by people of color. Here are five takeaways:

Using cost as a proxy for health need results in unintentional redlining.

The most widely used software products predict how much patients’ care will cost, which hospitals use as an indicator of health status to help figure out who might benefit from additional doctor visits or other preventive care. But using cost as a filter infuses racial bias into decision-making because people of color tend to receive less medical treatment than white patients with similar illnesses. This is due to inequities in access to care and insurance coverage, as well as historic discrimination.
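The mechanism can be illustrated with synthetic data: if two groups have the same distribution of health need, but one group incurs lower costs at the same level of need (because of barriers to access), then ranking patients by cost under-selects that group. The sketch below is a toy simulation, not any vendor's actual model; the 30% cost gap and all variable names are illustrative assumptions.

```python
import random

random.seed(0)

# Simulate 1,000 patients in two groups with identical need distributions.
patients = []
for i in range(1000):
    need = random.uniform(0, 10)            # true health need, same range for both groups
    group = "A" if i % 2 == 0 else "B"
    # Assumption: group B incurs ~30% lower cost at the same need,
    # reflecting unequal access to care, not better health.
    access = 1.0 if group == "A" else 0.7
    cost = need * access * 1000
    patients.append({"need": need, "group": group, "cost": cost})

top_n = 100  # slots in a care-management program

# Select the "sickest" patients two ways: by predicted cost vs. by true need.
by_cost = sorted(patients, key=lambda p: p["cost"], reverse=True)[:top_n]
by_need = sorted(patients, key=lambda p: p["need"], reverse=True)[:top_n]

b_share_cost = sum(p["group"] == "B" for p in by_cost) / top_n
b_share_need = sum(p["group"] == "B" for p in by_need) / top_n
print(f"Group B share when ranking by cost: {b_share_cost:.0%}")
print(f"Group B share when ranking by need: {b_share_need:.0%}")
```

Ranking by true need puts roughly half of each group into the program, but ranking by cost crowds group B out almost entirely, even though its members are just as sick.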

Disparities in outcomes expose the fallacy in how these software systems are used.

Analytics software is designed to find patients who aren’t getting enough help. But the evidence suggests it may instead be reinforcing long-standing disparities in treatment and outcomes. Black people are more frequently re-hospitalized following surgery and for complications of heart failure and diabetes. These imbalances are also plainly evident in rural and urban communities where generations of segregation in hospitals and other health care settings continue to affect how and where people of color get medical services.

Algorithmic bias is pervasive.

Algorithms that predict cost are used to help analyze the health needs of some 200 million Americans. Hospitals increased their use of these algorithms following passage of the Affordable Care Act, which created incentives to identify high-cost patients and intervene in their care through new arrangements that share financial responsibility for runaway medical expenses between insurers and hospitals. Many provider groups have formed accountable care organizations that must keep a close watch on costs in order to prevent financial losses.

Race is glaringly absent from the discussion about how analytics software is used.

None of the most widely used software systems explicitly warns users that racial bias could arise from the use of the products to target medical resources. In fact, their websites and online brochures promote the use of these products for this purpose. Some developers claim their software is configured to guard against bias, but cost prediction is still a core element of their products. Hospitals and insurers, meanwhile, have failed to account for these biases, and are disincentivized from doing so in contracts focused on holding down costs.

Social and behavioral health data that might counteract disparities are not universally collected and analyzed.

Providers are increasingly collecting data on social and behavioral factors that contribute to disparities in health outcomes, such as food and housing insecurity and unequal access to transportation. But these data are not reported in a standard format, and are not yet consistently applied in analytics software meant to target resources to needy patients. Doing so requires time and money that stakeholders have not been willing to spend.

Read the full investigation here.
