A study published earlier this month in Manufacturing & Service Operations Management found that scheduling systems that rely on machine learning to identify patients with the highest no-show risk can lead to longer wait times for Black patients.
In response, researchers developed a methodology aimed at both reducing racial disparity and achieving scheduling efficiency.
“To the best of our knowledge, our work is the first one to measure and address the racial disparity that takes place in appointment scheduling,” the study reads.
WHY IT MATTERS
Black patients and other patients of color already encounter structural and interpersonal racism within the medical system, which can contribute to poor health outcomes.
And health technology can further exacerbate those disparities.
For this study, researchers took a closer look at patient no-show probabilities, citing previous research finding that individuals in underserved populations are more likely to miss appointments, due in part to barriers to care. In turn, scheduling systems that assign appointment slots based on predicted no-show risk disproportionately affect patients in those groups.
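To see how a risk-based rule can shift waiting time onto one group, consider a minimal sketch, in Python, of the kind of overbooking policy described above. This is not the study's model; the patient data, the risk gap between the two hypothetical groups and the show probabilities are illustrative assumptions only.

```python
# Toy illustration (not the study's model): a naive scheduler double-books the
# patients with the highest predicted no-show probability, then reports the
# average expected waiting time by group. The risk gap between the two
# hypothetical groups is an assumption made purely for illustration.

import random

random.seed(0)

# Hypothetical patients: (group label, predicted no-show probability).
patients = [("A", random.uniform(0.05, 0.20)) for _ in range(50)] + \
           [("B", random.uniform(0.15, 0.35)) for _ in range(50)]

# Rank by predicted risk; the riskiest 30% are placed in overbooked (shared)
# slots, where a patient waits whenever the slot-mate also shows up.
ranked = sorted(range(len(patients)), key=lambda i: patients[i][1], reverse=True)
overbooked = set(ranked[: len(patients) * 30 // 100])

def expected_wait(i, slot_minutes=20, slot_mate_show_prob=0.8):
    """Expected extra wait for patient i under this naive overbooking rule."""
    return slot_minutes * slot_mate_show_prob if i in overbooked else 0.0

for group in ("A", "B"):
    idx = [i for i, (g, _) in enumerate(patients) if g == group]
    avg = sum(expected_wait(i) for i in idx) / len(idx)
    print(f"group {group}: average expected wait {avg:.1f} minutes")
```

Because group B is assigned higher predicted risk in this toy setup, more of its members land in overbooked slots, so its average expected wait is longer even though no individual patient did anything differently.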
“Currently, these scheduling systems are penalizing Black patients for not showing up based on socioeconomic issues that are out of their control,” said Shannon Harris, an assistant professor at the Virginia Commonwealth University School of Business who co-authored the study, in a press release accompanying the findings.
“Disparities are present across many areas and cause a ripple effect for underrepresented groups,” Harris added.
“For example, due to transportation issues, Black patients cannot always reach their appointments, meaning they get scheduled in overbooked slots and wait longer. But is it really their fault? Is the system flawed?” Harris continued.
Using a data set of about 40,000 appointments from a large outpatient clinic on the East Coast, researchers found that state-of-the-art scheduling systems cause Black patients to wait about 30% longer than non-Black patients – a disparity that is not eliminated by removing socioeconomic factors from the data.
“Although it is beyond the scope of this paper to offer an extensive discussion of the underlying ethical argument for correcting disparate racial impacts, our study is premised on the view that it is fundamentally unethical to punish black patients if they are predicted to have a lower show rate,” wrote the researchers.
In response, the team developed what they called a “race-aware” methodology, which minimizes the waiting time of the racial group expected to wait longest.
“This strategy dramatically reduces racial disparity while obtaining a similar schedule cost,” wrote the researchers.
“We found that if clinics decrease the wait time of the racial group expected to wait longer, they nearly eradicate the unintended racial disparity, and do so without negatively affecting the clinic efficiency,” said study co-author Michele Samorani, assistant professor in the Leavey School of Business at Santa Clara University, in a statement.
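As a rough illustration of that idea, the sketch below compares hypothetical candidate schedules, keeps only those whose clinic cost stays close to the cheapest option, and then picks the one whose worst-off group waits the least. The schedule names, costs and wait figures are invented placeholders, not the authors' optimization model.

```python
# Toy sketch of the "race-aware" idea described above (not the authors'
# algorithm): among candidate schedules whose clinic cost is close to the best
# achievable, choose the one whose worst-off racial group has the smallest
# expected wait. All numbers below are made-up placeholders.

# Each candidate: (schedule name, clinic cost, {group: expected wait in minutes})
candidates = [
    ("risk-only",  100.0, {"Black": 26.0, "non-Black": 20.0}),
    ("balanced-1", 101.5, {"Black": 22.0, "non-Black": 21.5}),
    ("balanced-2", 103.0, {"Black": 21.0, "non-Black": 21.0}),
    ("wait-heavy", 115.0, {"Black": 18.0, "non-Black": 18.0}),
]

def race_aware_pick(candidates, cost_tolerance=0.05):
    """Keep schedules within `cost_tolerance` of the cheapest, then minimize
    the maximum group-level expected wait (a minimax fairness criterion)."""
    best_cost = min(cost for _, cost, _ in candidates)
    affordable = [c for c in candidates if c[1] <= best_cost * (1 + cost_tolerance)]
    return min(affordable, key=lambda c: max(c[2].values()))

name, cost, waits = race_aware_pick(candidates)
print(f"chosen schedule: {name}, cost {cost}, group waits {waits}")
```

Under these placeholder numbers the minimax rule passes over the cheapest "risk-only" schedule in favor of a near-equal-cost alternative with far smaller disparity, mirroring the researchers' claim that fairness and efficiency need not be traded off sharply.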
THE LARGER TREND
Addressing bias in artificial intelligence and machine learning has been a recurrent theme for many technologists, who often note that such tools are only as good as the data they are trained on.
For instance, an article published this past year in the Journal of the American Medical Informatics Association found that biased models may amplify the disproportionate impact the COVID-19 pandemic is having on people of color.
“If not properly addressed, propagating these biases under the mantle of AI has the potential to exaggerate the health disparities faced by minority populations already bearing the highest disease burden,” wrote the researchers in the JAMIA study.
ON THE RECORD
“Academic studies have shown that black patients may be less likely to be able to make it to appointments than white patients because of socioeconomic obstacles deeply rooted in historical racial discrimination,” read the Manufacturing & Service Operations Management study.
“Therefore, any appointment schedule that results in black patients being given inferior scheduling slots because of their predicted show probability would in essence be penalizing those patients for the discrimination and socioeconomic conditions that they have historically suffered,” it continued.
Kat Jercich is senior editor of Healthcare IT News.
Twitter: @kjercich
Healthcare IT News is a HIMSS Media publication.