By Elaine Sanchez Wilson
With an annual caseload of nearly 6 million patient studies, a company like Virtual Radiologic (vRad) is hugely appealing to information technology firms looking to apply their algorithms in the healthcare space. The substantial amount of data generated from vRad’s vast teleradiology network of 2,100-plus client facilities is a goldmine for enterprising companies hoping to train and validate their models properly.
That’s what MetaMind founder and chief executive Richard Socher had in mind when he approached vRad with an idea to collaborate on radiology diagnostics. Specifically, the Stanford graduate and founder of the Palo Alto, Calif.-based company was hoping to leverage his “deep learning” artificial intelligence (AI) solution to improve the speed, accuracy, and quality of how radiology is performed.
vRad was intrigued, and a partnership was born.
“We think that, moving forward, the way radiology is practiced is going to change significantly over the next decade—for the benefit of both patients and physicians,” said Shannon Werb, chief information officer at vRad and author of the AXIS column “Critical Findings.” “The innovation and technology that MetaMind offers is going to be a big part of how it changes.”
Anonymized data from vRad’s clinical database of over 30 million patient studies is being used to train MetaMind’s AI software to search and “red flag” images that potentially show intracranial hemorrhaging (IH). Without immediate medical treatment, IH can quickly lead to increased pressure in the brain and potentially damaged brain tissue or death.
Once an image is flagged, the patient’s study can be automatically escalated within the radiologist’s reading queue. It can also be assigned to the most appropriately experienced radiologist, so they can immediately direct their attention to the image, accurately diagnose the condition, and relay any critical findings back to the attending physician as quickly as possible.
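The flag-and-escalate workflow described above can be sketched as a priority queue. Everything below—the priority levels, field names, and `escalate_if_flagged` helper—is a hypothetical illustration of the concept, not vRad’s actual system.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical priority levels: lower number = read sooner.
ROUTINE, URGENT, CRITICAL = 2, 1, 0

@dataclass(order=True)
class Study:
    priority: int
    study_id: str = field(compare=False)
    flagged: bool = field(default=False, compare=False)

def escalate_if_flagged(queue, study, ai_flagged):
    """If the AI red-flags the study (e.g. suspected intracranial
    hemorrhage), escalate it to CRITICAL so it jumps the queue."""
    study.flagged = ai_flagged
    if ai_flagged:
        study.priority = CRITICAL
    heapq.heappush(queue, study)

queue = []
escalate_if_flagged(queue, Study(ROUTINE, "chest-001"), ai_flagged=False)
escalate_if_flagged(queue, Study(ROUTINE, "head-002"), ai_flagged=True)
next_study = heapq.heappop(queue)  # the flagged head study is read first
```

In a real worklist, assignment to the most appropriately experienced radiologist would add a routing step on top of this ordering; the heap only captures the “move it to the front” part of the workflow.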
“Because of the nature of our business, with our doctors being remote, we have worked hard to build technology solutions that automate the collection and organization of objective information the physician needs to interpret the studies,” Werb said. “We’ve made major investments over the last two years, focusing on really automating, and what we’ve seen is increased quality and reduced turnaround time so we get the results back faster. Three quarters in a row now our quality continues to improve.”
“Our chief medical officer is very focused and has a clear vision: We want to have radiologists open studies and immediately have their eyeballs on the images,” Werb continued. “We want doctors to be doctors, not performing administrative duties. They should not be looking for prior images or reports, dictating objective information like the number of views or the performed procedure, or even further flipping or rotating the images—all of that should be done for them.”
MetaMind’s patent-pending deep-learning platform trains computer systems with high volume, complex information that includes text, images, and other data inputs, such as diagnostic radiology reports. It comprises a set of techniques that do not require domain experts to program knowledge into algorithms. Instead, these techniques build models from labeled example inputs and learn by observing data.
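Learning from labeled examples rather than hand-coded expert rules can be illustrated with a toy classifier. This is a deliberately simple nearest-centroid sketch over made-up two-number “image features,” not MetaMind’s platform; the feature values and labels are invented for illustration.

```python
# Toy illustration of building a model from labeled example inputs:
# a nearest-centroid classifier. No domain rules are programmed in;
# the "model" is just statistics observed from the labeled data.

def train(examples):
    """examples: list of (feature_vector, label).
    Returns one centroid (mean feature vector) per label."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict(model, features):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda lab: dist(model[lab]))

labeled = [
    ([0.2, 0.1], "normal"), ([0.3, 0.2], "normal"),
    ([0.8, 0.9], "hemorrhage"), ([0.7, 0.8], "hemorrhage"),
]
model = train(labeled)
result = predict(model, [0.75, 0.85])  # -> "hemorrhage"
```

A deep-learning system replaces the hand-picked features and centroids with learned representations over raw pixels and text, but the training loop follows the same principle: labeled examples in, decision rule out.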
Werb offered one scenario of a trauma case in which a patient presents with an unknown condition but suspected significant trauma and undergoes a whole-body scan. After receiving the study, vRad’s patent-pending workflow technology splits the study at the clavicles. “We’d send the head imaging to the neuroradiologist, the body imaging to the body radiologist, they’d collaborate and deliver a single result back to the client accurately and very quickly,” he explained. “So for example in this case, we’d put eyeballs on images in about five minutes, and we’d get reports to the clients on average in about 11 or 12 minutes, really driving an improved outcome for the onsite trauma team assessing the patient.”
Working with MetaMind’s platform, vRad believes there is enough time in those five minutes to test images and produce what Werb refers to as “a hint” suggesting high likelihood of intracranial hemorrhage. “With the technology, we can take that hint, and not just tap the doctor on the shoulder or put it on top of their worklist,” Werb said. “We can actually close the study they are working on, open the new study in front of them, and tell them they need to read it immediately because a critical condition is likely here. In trauma situations, faster turnaround time is key. Improving the quality by providing more of this objective information to the doctor up front allows them to provide a high-quality result.”
vRad is currently focusing on intracranial hemorrhage and pulmonary embolism, as they are at the top of the list of acute conditions seen in emergency settings where patients need to be treated immediately. However, the partners see no limitation on where the collaboration could go.
“The validity and volume of clinical data are crucial in training deep learning algorithms. The more data you have, the better you can train them. Given the amount of clinical data vRad has access to, this is really only the beginning of what we can do,” Socher said.
Meanwhile, vRad has also filed a patent for the training and use of “deep learning” algorithms and models in a telemedicine platform. The company plans to embed the technology into the services that it offers its clients, in addition to working with MetaMind to commercialize the algorithms and make them available to other organizations.
vRad’s goal is to have the first algorithm clinically operational by year’s end, according to Werb. For more information, visit vRad and MetaMind.
Elaine Sanchez Wilson is associate editor of AXIS.