Tailored, Thoughtful Technology to Address Bias in Medical Care

Medical bias based on race, ethnicity, gender and gender expression, sexuality, socioeconomic status, and more reflects the worst in the medical community. The same bias surfaces in critiques of AI. There is a famous quote, the historian Melvin Kranzberg's first law of technology: “Technology is neither good nor bad; nor is it neutral.” The quote continues:

“By that I mean that technology's interaction with the social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves, and the same technology can have quite different results when introduced into different contexts or under different circumstances.”

We know that AI facial recognition technologies have disproportionately high error rates when identifying BIPoC people, women, and people at the extremes of age. Furthermore, technologies like facial recognition are being disproportionately deployed to surveil marginalized communities. Much of this traces back to training data, the data an algorithm uses to ‘learn’ its task, which disproportionately represents white, middle-aged men. As a result, facial recognition ‘sees’ and learns to recognize subtle variation between white, male faces, but does not achieve the same accuracy across other races, ethnicities, genders, and ages.
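To make “disproportionate error” concrete, a bias audit of this kind usually starts by breaking a model's error rate down by demographic group. The sketch below is a minimal, hypothetical Python example; the group labels, the records, and the idea of a single correct/incorrect flag are illustrative assumptions, not a description of any specific system.

```python
from collections import defaultdict

# Hypothetical audit records: (demographic_group, model_was_correct).
# In a real audit these would come from a labeled evaluation set.
results = [
    ("white_male", True), ("white_male", True), ("white_male", False),
    ("black_female", True), ("black_female", False), ("black_female", False),
    ("older_adult", False), ("older_adult", True),
]

# Tally predictions and errors per group.
totals = defaultdict(int)
errors = defaultdict(int)
for group, correct in results:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# Report per-group error rates; large gaps between groups signal
# exactly the kind of disparity described above.
for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: error rate {rate:.0%} ({errors[group]}/{totals[group]})")
```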

Does building AI models off of healthy, white men sound familiar? We see the same pattern in medical research. Studies show again and again that medical trial participants are disproportionately white men. This stems partly from weak recruiting methods that ignore barriers to participation and make little effort to reach underrepresented communities, partly from very real distrust of a medical community that conducted the Tuskegee experiment, and partly from many other biases and experiences, all of which come together to generate medical knowledge tailored to healthy, white, male bodies.

We believe that thoughtful, tailored technical innovations can be used to mitigate the effects of systemic racism, sexism, discrimination based on socioeconomic status, etc.

By prioritizing medical transcription with AI, we open doors to future developments like sentiment analysis, which analyzes the positive or negative connotations of language to gauge how a person might feel, or flags patients’ use of words that indicate pain, so that we can compare that sentiment to how we prescribe painkillers. Our community is also developing tonal analysis, technology that ‘teaches’ products to ‘understand’ how a speaker’s tone might indicate something about their emotions or health, and we are very excited about it. These developments offer a great opportunity to give healthcare providers more data and help us understand our patients better. They also give us, as healthcare professionals, a way to analyze our own work and act as a secondary check. Coming into this space with a healthy acknowledgment of potential bias, and acting on it, will help us better treat our patients.
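As a rough illustration of the painkiller comparison described above, the sketch below scores transcript text against a tiny pain-word lexicon and sets that score next to whether a painkiller was prescribed. Everything here, the lexicon, the de-identified visit records, and the field names, is a hypothetical stand-in for what a real sentiment or tonal analysis pipeline would produce.

```python
from collections import defaultdict

# Toy lexicon of pain-related words; a real system would use a trained
# sentiment or clinical-language model instead.
PAIN_TERMS = {"pain", "hurts", "aching", "burning", "stabbing", "unbearable"}

def pain_score(transcript: str) -> int:
    """Count pain-related words in a visit transcript (toy lexicon approach)."""
    words = transcript.lower().split()
    return sum(1 for w in words if w.strip(".,!?") in PAIN_TERMS)

# Hypothetical de-identified visit records.
visits = [
    {"patient_group": "A", "transcript": "The pain is unbearable and my back hurts", "opioid_rx": False},
    {"patient_group": "B", "transcript": "Mild aching after exercise", "opioid_rx": True},
    {"patient_group": "A", "transcript": "Stabbing pain in my knee, it hurts to walk", "opioid_rx": False},
]

# Average expressed pain by prescribing decision, per patient group.
bucket = defaultdict(list)
for v in visits:
    bucket[(v["patient_group"], v["opioid_rx"])].append(pain_score(v["transcript"]))

for (group, rx), scores in sorted(bucket.items()):
    avg = sum(scores) / len(scores)
    print(f"group {group}, painkiller prescribed={rx}: avg pain-language score {avg:.1f}")
```

Gaps between groups in this kind of comparison would not prove bias on their own, but they would tell us where to look more closely.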

We’ve argued before that tools like medical transcription services with AI-generated summaries can help doctors achieve high standards of care by reducing their workload, and we stand by this. We believe these tools can let healthcare providers spend more time with their patients and less time with their charts, without compromising quality. They offer something we would not otherwise get: more data and more time. After all, when we can review a patient visit summary rather than write a detailed one ourselves long after the fact, because we’ve fallen behind on charts, we have an opportunity to greatly improve both the quality of our care and how we spend our time.

Individual patient profiles are great, but these technologies may also allow us to anonymize, aggregate, and compare our patient population. We know that our patients come to us with different experiences and levels of comfort, perhaps grounded in gender or gender expression, race, ethnicity, socioeconomic status, age, and more, which affect, among other things, the ways that they communicate with us. This data has the potential to let us compare our patient populations in ways that were previously inaccessible to us. For example, the lead-poisoned drinking water in Flint, Michigan was made public partly because a pediatrician realized she had data showing that more of her patients had elevated blood lead levels. We can watch for rising diagnoses in certain populations that may be linked to social determinants of health, and we can compare how we prescribe painkillers against our patients’ expressions of pain as captured by sentiment analysis and tonal analysis.
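The Flint example is, at its core, a simple aggregation: group de-identified results by a population attribute and look for rates that stand out. The sketch below shows that idea in Python; the lab records, the neighborhood labels, and the elevated-lead threshold are hypothetical, not real data or a description of any particular product.

```python
from collections import defaultdict

# Hypothetical de-identified lab results: (neighborhood, blood_lead_ug_dl).
records = [
    ("north", 2.1), ("north", 6.8), ("north", 7.4),
    ("south", 1.9), ("south", 2.4), ("south", 3.0),
]

ELEVATED = 5.0  # illustrative threshold in micrograms per deciliter

# Share of patients with elevated results, grouped by neighborhood.
counts = defaultdict(lambda: [0, 0])  # neighborhood -> [elevated, total]
for neighborhood, level in records:
    counts[neighborhood][1] += 1
    if level >= ELEVATED:
        counts[neighborhood][0] += 1

for neighborhood, (elevated, total) in counts.items():
    print(f"{neighborhood}: {elevated}/{total} patients elevated ({elevated/total:.0%})")
```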

Effective, tailored, limited tools can give our community the time and capacity to address biases in medical care, spot emerging trends, and better support and treat our communities.
