Our research, published in Applied Clinical Informatics, examined the prevalence of negative descriptors in patient records and whether these were associated with adverse maternal outcomes.
While maternal death rates overall in the U.S. have dropped in recent years to an average of 18.6 deaths per 100,000 live births, Black mothers still have a rate nearly 3.5 times higher than their white counterparts.
Our findings suggest that the words providers use in patient records could be part of the reason these disparities persist.
Our researchers found that the use of negative descriptors—such as combative, non-compliant, or uncooperative—in patient records is much more common when patients are Black, young, or on public insurance. We also found that the presence of negative descriptors was associated with a higher likelihood of severe pregnancy-related complications, including sepsis, acute kidney failure, and eclampsia.
This research was led by my colleagues, Azade Tabaie, PhD, and Allan Fong, MS, and based on techniques used to explore bias in COVID-19 care. Our study helps us understand the real-world impact of provider word choices.
What is the impact of negative patient descriptors?
In a patient’s medical record, there’s a big difference between a clinical fact and a negative descriptor. For example, a provider’s note might state that a patient “declined” treatment, which is a neutral, objective term. Writing that the same patient “refused” treatment, by contrast, carries a negative, subjective judgment.
A single negative word in a patient’s record can impact their care for years, because these notes act like a label that can follow a patient from one provider to another. Consider a patient who is in pain or stressed and becomes short-tempered with a nurse. If the provider writes in the record that the patient was “aggressive,” the term may influence the next doctor’s expectations, which can affect how much time they spend listening to the patient and how seriously they take their concerns.
This can create a “chicken and egg” cycle of bias:
- A provider might have bias and use a negative word in the record
- A second provider reads that note and develops their own bias as a result
- That bias can lead to care decisions that aren’t based on objective evidence
Studies have shown that when bias is involved, whether intentional or subconscious, providers are less likely to follow the evidence-based treatments that every patient deserves. In a high-stakes setting like labor and delivery, this can lead to missed warning signs and worse health outcomes for the patient.
Unequal distribution of negative descriptors
To identify negative descriptors across thousands of pages of electronic medical records, we used natural language processing (NLP). This technique acts like a high-powered search engine, scanning massive amounts of text to identify patterns of bias that would be difficult or impossible to find manually.
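To illustrate the idea, here is a minimal sketch of lexicon-based text scanning in Python. This is a simplification: the study used more sophisticated NLP techniques, and the descriptor list and example notes below are purely illustrative, not the study’s actual lexicon or data.

```python
import re

# Illustrative lexicon of negative descriptors (not the study's actual list)
NEGATIVE_DESCRIPTORS = [
    "combative", "non-compliant", "noncompliant",
    "uncooperative", "aggressive", "agitated",
]

# One case-insensitive pattern with word boundaries, so "agitated"
# matches but a word like "aggressively" would need its own entry
pattern = re.compile(
    r"\b(" + "|".join(re.escape(w) for w in NEGATIVE_DESCRIPTORS) + r")\b",
    re.IGNORECASE,
)

def flag_note(note_text):
    """Return the negative descriptors found in a clinical note, in order."""
    return [m.group(0).lower() for m in pattern.finditer(note_text)]

# Hypothetical example notes
notes = [
    "Patient was uncooperative during exam and non-compliant with meds.",
    "Patient declined induction after discussing risks and benefits.",
]
for note in notes:
    print(flag_note(note))
```

Note that the second example, which uses the neutral term “declined,” is not flagged; a lexicon-based scan like this only surfaces candidate notes, which is why statistical analysis over the flagged records is still needed.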
Our team analyzed more than 190,000 clinical notes from maternal health records of women who delivered between 2016 and 2020. The goal was to see how often negative labels appeared and in whose records.
Using NLP, we identified negative descriptors in fewer than 0.4% of the notes we studied. While that’s a small overall number, our statistical analysis found that these negative labels are not used equally, and they have consequences:
- 70.5% of notes containing negative descriptors were found in the records of Black patients
- 10.1% of negative descriptors were associated with white patients
- Negative labels were much more common among patients aged 18-29 and those using public insurance, such as Medicaid
Most importantly, we identified a link between these words and severe maternal morbidity (SMM): life-threatening complications during or after delivery. Patients with negative descriptors in their records had higher adjusted odds of these dangerous health outcomes.
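The study reports *adjusted* odds, which come from regression models that control for confounders. As a much simpler sketch of the underlying idea, here is how an unadjusted odds ratio is computed from a 2x2 table in plain Python; the counts below are hypothetical and do not come from the study.

```python
def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table.

    "Exposed" = notes contain a negative descriptor;
    "cases" = patients who experienced severe maternal morbidity (SMM).
    """
    exposed_odds = exposed_cases / exposed_noncases
    unexposed_odds = unexposed_cases / unexposed_noncases
    return exposed_odds / unexposed_odds

# Hypothetical counts: 30 of 100 flagged patients had SMM,
# vs. 100 of 1,000 patients without negative descriptors
result = odds_ratio(30, 70, 100, 900)
print(round(result, 2))  # prints 3.86
```

An odds ratio above 1 means the outcome is more common in the exposed group. Adjusted odds ratios, as in the study, come from models such as logistic regression that account for age, insurance type, and other factors at the same time.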
This research shows that word choices aren’t random; they’re proxies for biases that can have a real and lasting impact on patient care.
Moving toward a more equitable future
Addressing biased language is central to improving equity in maternal care and beyond. MedStar Health has formed a system-wide team dedicated to removing bias and negative sentiment from our electronic health records. This includes:
- Reviewing menus and other software details to remove biased language that could be built into the record-keeping system
- Training providers to avoid unnecessary demographic labels and subjective negative words when describing a patient
- Implementing new technologies that give patients more robust access to their medical records
Our research into maternal health is driven by our goal to treat all people well and equally. That commitment includes how we talk about our patients and how we describe them in clinical notes.
High-quality care is, by definition, equitable. Through our efforts to identify and remove the hidden biases in our records, we’re helping more mothers get the safe, respectful care they deserve.

