Standard of Care is dead and Big Data killed it

AI and Big Data are rising alongside the national death toll as the wave of digital healthcare and precision medicine runs red. US healthcare networks should not use predictive medical algorithms to triage patient care and acuity: widely used algorithms that prioritize medical care and resources are biased against Black patients, and predictive medical AI keeps pharmaceuticals from reaching their target populations.

A landmark study published in the journal Science in 2019 assessed racial bias against Black patients within an algorithm used by hospital networks to identify high-risk patients who qualify for additional resources to manage their health, such as competitive specialized treatment plans for chronic illnesses. The study audited the algorithm developed by Optum, a health services company, and found that it overpredicted Black patients' health, falsely concluding that Black patients were healthier than equally sick white patients. The algorithm used total healthcare costs as a proxy for illness severity, but cost data reflect racial barriers to healthcare that result in lower spending on Black patients. Researchers estimate that this racial bias reduces the number of Black patients identified for further preventative care by more than half: correcting it would raise the fraction of Black patients within the threshold to qualify for care from 17.7 to 46.5% (“Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations”). Machine learning (ML) models trained on biased data perpetuate preexisting socioeconomic and racial health disparities, as the generalization of Big Data undercuts the quality of care delivered to minority communities of POC patients compared to their white counterparts. Companies such as Optum deserve condemnation for jeopardizing health equity: biased algorithms cost insurance groups and pharmaceutical companies revenue while crudely quantifying the patient-provider relationship.
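The cost-as-proxy failure the study describes can be sketched in a few lines: give two groups identical illness severity, suppress one group's spending to stand in for access barriers, and rank everyone by predicted cost. The underspending group falls out of the high-risk pool even though it is equally sick. This is a minimal illustration with entirely hypothetical numbers, not data from the study:

```python
import random

random.seed(0)

# Hypothetical simulation: two groups with identical illness severity,
# but group B's healthcare spending is suppressed by access barriers.
def make_patients(group, access_factor, n=1000):
    patients = []
    for _ in range(n):
        severity = random.uniform(0, 10)   # true illness burden
        cost = severity * access_factor    # observed spending, the model's proxy
        patients.append({"group": group, "severity": severity, "cost": cost})
    return patients

patients = make_patients("A", access_factor=1.0) + make_patients("B", access_factor=0.6)

# Flag the top 20% by cost for extra care, as a cost-proxy model would.
cutoff = sorted(p["cost"] for p in patients)[int(0.8 * len(patients))]
flagged = [p for p in patients if p["cost"] >= cutoff]

share_b = sum(p["group"] == "B" for p in flagged) / len(flagged)
print(f"Group B share of flagged patients: {share_b:.0%}")
```

Both groups were drawn from the same severity distribution, so an unbiased flag would split roughly 50/50; ranking by cost instead pushes group B far below that.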

While digital medicine has the potential to improve diagnostic accuracy and therapies, algorithmic bias reinforces existing inequalities that stem from socioeconomic disparity. Large insurance groups and hospital networks riding the wave of digital medicine view AI prediction models as revolutionary, claiming that these algorithms work for the majority in order to downplay the racial bias embedded within the data. Developers and healthcare groups are stakeholders invested in precise care and more frequent treatment delivery, so ML models that underpredict health severity for Black patients and overpredict it for white patients are not just a glitch. Scores from algorithms like Optum's are condensed into hospital risk assessments used within ICUs to allocate resources, where repeated biases have real implications, disproportionately restricting care for Black and Hispanic patients historically underserved by systemic health disparities (Sarkar et al.). When developers and stakeholders claim that algorithms work for “most people”, they are talking about white, male bodies, perpetuating the barriers to care that vulnerable populations already face without the help of AI.

Predictive ML models operate on the simple premise of flipping coins: every flip is a life-saving treatment granted, or a patient disqualified from continued or preventative treatment. The law of large numbers is a pillar of probability and statistics; it states that as a sample grows, its mean converges to the actual average of the entire population. Because the law of large numbers relies on ever-larger samples to project a prediction onto the population it serves, the fallacy lies in numbers that aren't large enough, in other words, bad data. ML prediction models embody the law of large numbers, and that logic emboldens Silicon Valley to push AI prediction even where medical predictive models fail to apply to underrepresented populations. Continued use of predictive algorithms to provide medical impressions and prognoses revokes necessary treatment from Black individuals from low-income backgrounds because their identity and pain are systematically erased from patient records and hospital data; they have no claim to heads or tails.
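The law of large numbers, and its failure mode for underrepresented groups, can be shown directly: a mean estimated from a large sample lands close to the truth, while the same estimate from a handful of records stays noisy. A minimal sketch with hypothetical numbers (a true population mean of 5.0 and Gaussian noise are assumptions for illustration):

```python
import random
import statistics

random.seed(1)

TRUE_MEAN = 5.0  # hypothetical true population average

def sample_mean(n):
    # Draw n noisy observations centered on the true mean.
    return statistics.mean(random.gauss(TRUE_MEAN, 2.0) for _ in range(n))

# Error shrinks as the sample grows (the law of large numbers)...
error_big = abs(sample_mean(100_000) - TRUE_MEAN)
# ...but a group represented by only a few records gets no such guarantee.
error_small = abs(sample_mean(10) - TRUE_MEAN)

print(f"estimation error with 100,000 records: {error_big:.3f}")
print(f"estimation error with 10 records:      {error_small:.3f}")
```

The guarantee belongs to the large sample; a model trained mostly on majority records inherits accuracy for the majority and noise for everyone else.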

The NYT article “The Average Human Body Temperature Is Not 98.6 Degrees” reports on recent findings that upend a medical norm of vital-taking: body temperature. The average body temperature of 98.6 degrees has been debunked as an inaccurate standard, the product of sloppy experimentation, from faulty instruments to inconsistent control of variables and sample sizing. The new average human body temperature sits around 97.9 degrees, and as recent temperature studies correct what was once the cornerstone of “sickness”, the debunking of poorly taken medical statistics dethrones the generalization of Big Data. Modern temperature readings place the average cooler, meaning something so “tried-and-true”, from doctors' offices to households, set the bar for who was truly sick with a fever and, by the same token, who was excluded from that numerical diagnosis. The article tells us that “98.6 is off base” and calls for doctors and the public to degeneralize sickness based on fever readings. The corrected studies make the stakes plain: relying on poor data redraws the boundary between the sick and the healthy, and false readings and Big Data are not fit to determine who gets treatment and who doesn't.

Vitals such as body temperature might occupy the domestic scene, perhaps Covid-era temperature checks at the office or in public buildings, but vitals also function at the heart of critical care triage, informing providers' decisions to administer resources and care to one patient over another. Under the Standard of Care, treatment delivery acquires the blessing of health provider objectivity, and I say blessing because the public trusts a white coat, but that blessing is falsely plastered over medical algorithm bias that violates health equity regulation and, in doing so, restricts both access to pharmaceuticals and their revenue. White men are only a sliver of the population in need of preventative medical treatment for the metabolic and cardiovascular diseases that dominate the healthcare system. Implementing predictive medical algorithms that discount Black patients slashes the reputation of the Standard of Care and the healthcare market in one coin-flipping blow.

