Let’s say you’re a doctor and you’re trying to monitor a patient for sepsis — the body’s dangerous, runaway response to an infection, a condition that kills hundreds of people each year in Colorado.
If you wait until it’s obvious that your patient is septic, your chances of saving that patient are only 50-50. But the other problem is that the early signs of sepsis — fever, shortness of breath, elevated heart rate — look like a ton of other things, too. If you jump too early, you could be treating something that doesn’t exist.
So how do you predict when a patient is about to be in trouble?
“It’s a terrible disease,” Dr. CT Lin, an internal medicine specialist with UCHealth who serves as the hospital system’s chief medical information officer, said during a panel discussion earlier this year at SunFest. “And it’s very hard to spot.”
The problem is so severe that sepsis, sneakily, has become a major killer in the United States.
According to estimates by the Centers for Disease Control and Prevention, 350,000 adults who develop sepsis each year either die in the hospital or are discharged to hospice care. That works out to approximately 40 people every hour. And of all hospital patients who die of any cause, 1 in 3 had developed sepsis during the stay.
But Lin and others at UCHealth have developed a system they say is saving hundreds of lives a year by catching sepsis cases early. And they’ve built this system using the technology that is poised to revolutionize how doctors deliver — and how patients receive — health care: artificial intelligence.
“One of our ideas was: Can we use years of our data and assemble it and use the machine-learning models to come up with predictions? And so we did that,” Lin said.
But this is not a tale of instant success. The story of what happened next reveals much about the current limits of AI’s ability to take over the work of nurses and doctors — and about how carefully hospitals must implement it.
Building a better AI system
The idea of using artificial intelligence to detect sepsis is not new. Work on such systems — even if they were crude at first — goes back over a decade. And the results have not always been encouraging.
In a blockbuster 2021 article in JAMA Internal Medicine, researchers examined the accuracy of a widely used AI detection system for sepsis created by the electronic health record company Epic. They found that the system missed a significant number of real cases while also generating a large volume of false alerts.
“Its widespread adoption despite poor performance raises fundamental concerns about sepsis management on a national level,” the researchers wrote.
UCHealth uses the Epic system, which has since been retooled, as well as others. But an initial three-month pilot program proved disappointing: the alerts didn’t improve how many sepsis cases were caught, or how early.
Lin and his colleagues presented the data to frontline UCHealth doctors and nurses, trying to figure out what was going on.
“And they say, ‘Well, you know, we’re 100% busy taking care of patients who are sick right now. It’s hard to think about 12 hours from now someone might get sick,’” Lin said.
That insight proved crucial in building a better way to deploy the AI tools.
The health system didn’t want to turn down the sensitivity of the detection model, which also monitors for patients who are on the cusp of a life-threatening decline not caused by sepsis. Doing so would lead to more cases being overlooked.
“We wanted to make sure we didn’t miss anything,” Lin said.
But that meant doctors and nurses treating patients on the floor were overwhelmed with alerts and were prone to experiencing what is known as alarm fatigue.
There were moments, Lin said, when nurses were so busy that they didn’t even have time to enter a patient’s vital signs into the computer. They would write them on their arms and enter them at the end of their shifts, producing a surge of alerts at the end of the day.
So Lin and his colleagues decided to shift where the alerts went.
Using HI — human intelligence — to enhance AI
The UCHealth Virtual Health Center is, frankly, not the most glamorous place.
It’s located in a nondescript office building in Aurora several miles south of the system’s flagship Anschutz Medical Campus. Other equally nondescript office buildings and a large self-storage facility are its closest neighbors.
It is here, though, that UCHealth began sending patient sepsis alerts, rather than to the nurses and doctors treating those patients at the bedside. The center serves as a kind of eye in the sky across the entire UCHealth system, which has hospitals up and down the Front Range.
ICU nurses, on duty around the clock, sit before a bank of six monitors — three across, stacked two high — each nurse watching the vital signs and other information of as many as 500 patients at a time. When an alert pings, a nurse can pull up more information on the patient, check their chart, even look into their room via a remote camera.
Using that information, the nurse can decide whether the alert requires action and can then contact the patient’s bedside providers to let them know, said Amy Hassell, the Virtual Health Center’s director. They can also activate special teams in the hospital that work to combat sepsis and other forms of rapid deterioration.
Hassell calls it “the bat signal.”
“We’ve seen our time-to-identification for sepsis improved by over two hours,” Hassell said in an interview with The Sun. “For every hour you delay sepsis care, your mortality goes up by 10%. So to find sepsis earlier is a big deal for your mortality.”
Here’s how big: Hassell said UCHealth estimates its sepsis alert system is saving 375 lives a year. She said other tools used to detect early signs of other forms of deterioration are estimated to save more than 800 lives a year.
“So that’s over 1,000 more patients walking out of our facilities who would not have the previous year,” Hassell said.
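Hassell’s numbers allow a rough sanity check. Here is a minimal back-of-envelope sketch — assuming, since the article doesn’t specify, that “mortality goes up by 10%” per hour means relative risk compounds by a factor of 1.10 for each hour of delay (UCHealth’s internal estimates are surely more detailed than this):

```python
# Back-of-envelope sketch, NOT UCHealth's actual model: treat each hour of
# delayed sepsis care as multiplying a patient's mortality risk by 1.10.
HOURLY_RISK_MULTIPLIER = 1.10  # "mortality goes up by 10%" per hour (assumed compounding)

def relative_risk_reduction(hours_earlier: float) -> float:
    """Fraction by which mortality risk falls if sepsis is caught earlier."""
    return 1 - 1 / HOURLY_RISK_MULTIPLIER ** hours_earlier

# Catching sepsis roughly 2 hours sooner, as Hassell describes:
print(f"{relative_risk_reduction(2):.1%}")  # roughly a 17% relative drop in risk
```

Even under this simplified assumption, a two-hour head start translates into a meaningful reduction in mortality risk per patient — directionally consistent with the hundreds of lives per year UCHealth attributes to the system.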
And to Hassell and Lin, that’s a big validation that AI tools can be valuable in health care — they just have to be used correctly.
Lin calls this the 80-20 rule. When a new innovation gets implemented, the tech accounts for only about 20% of the work. The remaining 80% is the work of restructuring the human systems that use the technology.
“What you don’t see is the iceberg under the water, which is: Did you reinvent the way you care for patients? Did you really take the team apart and put it back together in an effective way?” Lin said.
“We feel like that,” he said, “that’s the secret sauce.”