During the past two years, as COVID-19 has stretched hospitals thin, staff on every team have had to consider how patient care might be sped up without compromising its quality. For University of Washington (UW) School of Medicine radiologists, that challenge has become an opportunity to make artificial intelligence (AI) a more routine part of patient scans and diagnoses. Doing so has created efficiencies that have counterbalanced increases in CT, MR, and x-ray imaging requests.

“We had to find time to do more patient scans in the same 24-hour day, even with staffing shortages and radiologists working remotely. Embracing AI has become a way to support radiologists’ work and to improve our productivity,” says Mahmud Mossa-Basha, MD, a neuroradiologist.

His team employs U.S. FDA-approved algorithms in two ways: to help detect disease and to improve the visual quality of images that have significant digital noise.

“We’re using image-enhancement algorithms for brain MR and for abdomen/pelvis CT and head CT. It allows us to purposely accelerate a patient’s scan, which results in a noisier data set, but the algorithm can remove that noise, so the image quality is more on par with a non-accelerated image,” Mossa-Basha says. “This allows us to speed acquisition of a patient scan by 30%-40% while maintaining similar image quality.”

That algorithm also helps recover signal and detail lost during the scanning of large patients, he adds.
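The idea is to trade acquisition time for noise and then recover the lost image quality in software. Below is a minimal sketch of that tradeoff, using a conventional non-local-means filter from scikit-image as a stand-in for the commercial deep-learning denoiser the team describes; the test image and noise level are illustrative only.

```python
# Sketch: denoising an accelerated (noisier) scan so its quality approaches
# a full-length acquisition. A non-local-means filter stands in for the
# vendor's deep-learning denoiser; the phantom image and noise are synthetic.
import numpy as np
from skimage import data, img_as_float
from skimage.restoration import denoise_nl_means, estimate_sigma

full_quality = img_as_float(data.camera())   # stand-in for a non-accelerated image
rng = np.random.default_rng(0)

# A faster acquisition collects less signal, which shows up as added noise.
accelerated = full_quality + rng.normal(scale=0.08, size=full_quality.shape)

sigma = float(np.mean(estimate_sigma(accelerated)))      # estimate the noise level
denoised = denoise_nl_means(accelerated, h=1.15 * sigma, sigma=sigma,
                            patch_size=5, patch_distance=6, fast_mode=True)

print(f"estimated noise sigma: {sigma:.3f}")
```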

Mossa-Basha also describes how AI provides the first set of “eyes” to triage specific emergent CT or x-ray studies: “Say a patient has some life-threatening condition. Their scan is de-identified, sent to the cloud and reviewed by the algorithm. It comes back to us a minute or two later with a heat map to flag any emergent diagnoses. It might indicate ‘You need to get to this case first, within the next few minutes.’”

A human radiologist reviews all scans to confirm findings suggested by the algorithm. But the auto-generated heat-map email ensures that a patient’s brain bleed, pulmonary embolism, or fractured spine will be prioritized and treated as quickly as possible.
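In scheduling terms, the AI flag simply moves a study to the front of the reading queue. The sketch below shows one way that reordering might look; the study IDs, flag names, and two-level scoring are hypothetical, and in practice the flag would arrive from the vendor's cloud service alongside the heat-map overlay.

```python
# Sketch: reordering a reading worklist when the triage algorithm flags a
# study as emergent. All identifiers and flags here are made up.
import heapq
from dataclasses import dataclass, field
from typing import Optional

@dataclass(order=True)
class Study:
    priority: int                                  # lower number = read sooner
    arrival_order: int                             # tie-breaker: first come, first served
    study_id: str = field(compare=False)
    ai_flag: Optional[str] = field(compare=False, default=None)

incoming = [
    ("CT-1001", None),
    ("CT-1002", "intracranial_hemorrhage"),        # flagged emergent by the algorithm
    ("XR-1003", None),
    ("CT-1004", "pulmonary_embolism"),             # flagged emergent by the algorithm
]

worklist = []
for order, (study_id, flag) in enumerate(incoming):
    priority = 0 if flag else 1                    # flagged studies jump the queue
    heapq.heappush(worklist, Study(priority, order, study_id, flag))

while worklist:
    study = heapq.heappop(worklist)
    print(f"read {study.study_id}  (AI flag: {study.ai_flag or 'none'})")
```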

“Without that red flag, if there happened to be 20 emergent cases at around that same time, it might take us an hour to get to that bleed case,” Mossa-Basha says. “It’s easy to see where speed can affect patient outcomes and help us avoid disastrous outcomes in those circumstances.”

The team has also just implemented AI in CT angiography for stroke. The machine-learning software detects blood-vessel blockages that signal a stroke before a radiologist has seen the scan: “findings that are of course very time-sensitive,” Mossa-Basha adds.

The initial AI review helps not only with speed but also with accuracy. Though every radiologist has more than a decade of training under their belt, human error is always possible. The algorithm is not assumed to be correct every time, but the machine-learning software has the benefit of having been trained on thousands of patient scans with both positive and negative findings for diseases and emergent conditions.

“I think generally the AI algorithm does very well at detecting pathology. We did a study showing that the sensitivity and specificity of AI to detect brain hemorrhage are in the 94%-95% range. But it is not flawless; it can miss things, which is why radiologists are still necessary to confirm the algorithm’s findings,” Mossa-Basha says. “The algorithm can help serve as a second pair of eyes reviewing the imaging, increasing the radiologist’s confidence in the diagnosis or helping cover potential blind spots.”
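For readers unfamiliar with those metrics, the short sketch below shows how sensitivity and specificity fall out of a confusion matrix. The counts are invented, chosen only so the results land near the 94%-95% range quoted for brain-hemorrhage detection; they are not the study's actual data.

```python
# Sketch: computing sensitivity and specificity from illustrative counts.
true_positives  = 188   # hemorrhages the algorithm flagged
false_negatives =  12   # hemorrhages the algorithm missed
true_negatives  = 760   # normal scans correctly left unflagged
false_positives =  40   # normal scans incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)   # 188/200 = 94%
specificity = true_negatives / (true_negatives + false_positives)   # 760/800 = 95%
print(f"sensitivity: {sensitivity:.0%}  specificity: {specificity:.0%}")
```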

Image courtesy of University of Washington.