
Improving health equity one algorithm at a time

Data and algorithms are frequently used in the healthcare industry to identify populations that may benefit from specialty care management. Data-driven programs that use algorithms can improve disease management and health outcomes and reduce the cost of care. They also have the potential to remove bias from human decision-making about who can access care.

But what happens when the algorithm itself is biased? Recent research has shown that algorithms in healthcare[1] and other fields[2] can be biased against certain populations because systemic racism is reflected in the data used to construct these computer-based calculations. In healthcare, for example, cost and utilization of care are often relied upon as indicators of problem severity. However, studies show that Black, Indigenous, and People of Color (BIPOC) typically consume healthcare at lower rates than non-Hispanic Whites despite having similar health status.[3] In this case, over-reliance on utilization- or cost-based indicators can perpetuate bias by under-recognizing health issues in BIPOC populations.

Recently, Beacon Health Options and the Connecticut Behavioral Health Partnership (CT BHP) embarked on a 14-month project aimed at improving the health and wellbeing of homeless Medicaid recipients by providing housing supports and access to state-funded housing vouchers. Testing of their first algorithm revealed a bias that would have under-selected individuals of Hispanic ethnicity for participation in the Connecticut Housing Engagement and Support Services (CHESS) program. The Beacon team, led by SVP of Analytics and Innovation Dr. Robert Plant and Health Research Scientist Dr. Krista Noam, then set out to develop a new algorithm to mitigate the bias and improve equity.

The initial algorithm for CHESS relied on data about inpatient hospital stays and other utilization indicators. Beacon found that this algorithm over-selected non-Hispanic Whites and under-selected people of Hispanic heritage, and the bias grew even larger when cost savings were included.
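Conceptually, the check that surfaces this kind of problem is simple: compare selection rates across racial/ethnic groups. Below is a minimal Python sketch (not Beacon’s actual code; the data, column names, and thresholds are all hypothetical) of a utilization-based rule and a disparity check:

```python
import pandas as pd

# Hypothetical member-level data: utilization indicators plus ethnicity.
members = pd.DataFrame({
    "member_id": [1, 2, 3, 4, 5, 6],
    "inpatient_days": [12, 0, 3, 20, 1, 9],
    "ed_visits": [4, 1, 0, 6, 2, 3],
    "ethnicity": ["Hispanic", "Hispanic", "Hispanic",
                  "Non-Hispanic White", "Non-Hispanic White", "Non-Hispanic White"],
})

# A utilization-based rule: select members with heavy inpatient or ED use.
members["selected"] = (members["inpatient_days"] >= 5) | (members["ed_visits"] >= 4)

# Selection rate by group; a large gap flags potential bias.
rates = members.groupby("ethnicity")["selected"].mean()
print(rates)

# A common screen: the ratio of the lowest to the highest selection rate.
# Values well below 1.0 suggest one group is being under-selected.
print("disparity ratio:", rates.min() / rates.max())
```

On data like this, a group that uses less care despite similar health status will show a markedly lower selection rate, which is exactly the pattern Beacon observed.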

To improve equity, our data scientists shifted their focus to a diagnosis-based comorbidity index, which predicts who is most likely to become seriously ill or die in the coming year. This approach reduced the bias significantly, but not quite enough: Hispanic members were still significantly less likely to be selected for the program, indicating that further adjustments to the algorithm were needed.
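A diagnosis-based comorbidity index typically sums weights assigned to a member’s diagnosed conditions, in the spirit of well-known indices such as the Charlson Comorbidity Index. The sketch below is illustrative only; the condition weights and eligibility cutoff are hypothetical, not the index Beacon actually used:

```python
# Illustrative condition weights; real indices derive these empirically.
CONDITION_WEIGHTS = {
    "diabetes": 1,
    "chf": 2,               # congestive heart failure
    "copd": 1,
    "renal_disease": 2,
    "metastatic_cancer": 6,
}

def comorbidity_score(diagnoses):
    """Sum the weights of a member's diagnosed conditions."""
    return sum(CONDITION_WEIGHTS.get(dx, 0) for dx in diagnoses)

# Hypothetical members with diagnosis lists drawn from claims data.
members = {
    "A": ["diabetes", "chf"],
    "B": ["copd"],
    "C": ["renal_disease", "metastatic_cancer"],
}

THRESHOLD = 3  # hypothetical eligibility cutoff
scores = {m: comorbidity_score(dx) for m, dx in members.items()}
selected = [m for m, s in scores.items() if s >= THRESHOLD]
print(scores)    # {'A': 3, 'B': 1, 'C': 8}
print(selected)  # ['A', 'C']
```

Because the score depends on documented diagnoses rather than on how much care a member consumed, it is less sensitive to group differences in utilization, though, as Beacon found, not immune to them.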

To ensure that the population selected through the algorithm more closely resembled the total population experiencing homelessness, Drs. Plant and Noam added the number of days a member had spent in a shelter over their lifetime. With this indicator included, the ethnic/racial composition of the members passing the algorithm more closely matched that of the total population.
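The sketch below illustrates the general shape of that adjustment (again with hypothetical data, names, and cutoffs): add lifetime shelter days, a social-determinants-of-health indicator drawn from HMIS data, to the selection rule, then compare the ethnic/racial mix of the selected members to the full population:

```python
import pandas as pd

# Hypothetical members: comorbidity scores plus lifetime shelter days.
members = pd.DataFrame({
    "ethnicity": ["Hispanic", "Hispanic", "Hispanic",
                  "Non-Hispanic White", "Non-Hispanic White", "Non-Hispanic White"],
    "comorbidity_score": [1, 2, 1, 4, 3, 5],
    "lifetime_shelter_days": [400, 90, 520, 30, 15, 10],
})

# Select on clinical risk OR sustained shelter use, rather than on
# clinical indicators alone.
members["selected"] = (members["comorbidity_score"] >= 3) | (
    members["lifetime_shelter_days"] >= 365
)

# Compare the composition of the selected group to the overall population;
# the closer the two distributions, the more equitable the selection.
overall = members["ethnicity"].value_counts(normalize=True)
chosen = members.loc[members["selected"], "ethnicity"].value_counts(normalize=True)
print(pd.DataFrame({"overall": overall, "selected": chosen}))
```

Without the shelter-days term, no Hispanic member in this toy example would be selected; with it, the composition of the selected group moves much closer to that of the population.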

Beacon’s work with the CT BHP is ongoing, and it is just one example of the strides we are making to improve equity for vulnerable populations. To learn more about Drs. Plant and Noam’s work on removing bias from algorithms, including helpful dos and don’ts, plan to attend their poster session at NATCON 2022, the national conference of the National Council for Mental Wellbeing, in Washington, D.C., on Monday, April 11. Their session will include:

  • A brief literature review speaking to the presence of bias in algorithms
  • Metrics commonly used to identify specialty populations that risk introducing bias
  • How to test for bias in algorithms
  • Risks associated with certain methods of bias mitigation that could be viewed as discriminatory
  • How incorporating indicators of social determinants of health can help to mitigate bias
  • Real world examples of algorithms that magnify and mitigate bias in client or member selection

[1] Obermeyer et al., “Dissecting racial bias in an algorithm used to manage the health of populations,” Science 366, 447–453 (2019).

[2] The New York Times, “Even Imperfect Algorithms Can Improve the Criminal Justice System” (December 7, 2020) and “An Algorithm That Grants Freedom, or Takes It Away” (February 7, 2020).

[3] Claims were pulled for one year, allowing for a four-month gap, starting from the month the HMIS data was loaded. Data was pulled regardless of the number of days the member had been eligible for Medicaid during that year.

