Hawaii is a lovely place to vacation, though, for some, the memories include more than postcard vistas. Hawaii Healthcare Systems Corporation (HHSC) is what Vice President and Chief Information Officer Dennis Sato calls a quasi-public provider of acute and long-term hospital and health services. HHSC runs 12 facilities spread among the five major Hawaiian islands. Four of those facilities are acute care hospitals that have their own radiology departments. Quasi-public, Sato says, means that much of the care dispensed at the facilities is paid for from public funds and that those payors exercise some control. HHSC, a nonprofit, operates a total of about 1,250 beds. Those beds are not empty very often; local residents and vacationers fill them. Because injured or ill tourists often return home needing more care, HHSC must have good links to the mainland for transporting images and medical data. Because the islands themselves are spread out, HHSC must also have strong data links between the hospitals on each of the islands. For these reasons, HHSC has been a leader in the electronic transmission of imaging and other medical data, known as telemedicine, and it has won awards for its telemedical ingenuity.

Now, HHSC hopes to apply that same forward thinking to building a PACS (picture archiving and communications system) and incorporate with it a system for storing and transmitting a growing amount of radiological data. Today, according to Sato, the group is operating a miniPACS that handles digital images from those modalities that produce them. But the company is running out of space to store film at the four acute care sites where it performs radiology. It wants to convert all of its image archiving to an electronic storage system. So it is researching ways to construct its imaging archive based on its own island-culture needs.

To come up with its storage strategy, HHSC has formed a steering committee made up of doctors and administrators from its headquarters and its regional sites. “We are building our strategy,” says Sato. “First we need to look at the cost, because we have to justify a return on investment.” The HHSC planners have already made a string of decisions on storage. HHSC will begin with a Digital Imaging and Communications in Medicine (DICOM) server based at its Honolulu headquarters that will feed to and accept images from any of its outlying hospitals. DICOM is the accepted common language for recording and storing imaging studies. “What we don’t want is an individual PACS for each hospital. We want to standardize,” Sato says.

Storage Strategy Checklist

It is not just the swiftness and flexibility of electronic imaging that are driving the installation of electronic storage systems; it is also the rapidly multiplying volume of imaging. The Cleveland Clinic Foundation (CCF) operates hospitals and clinics in Ohio and Florida. It has about 100 radiologists on staff, and it processes more than 1 million imaging studies annually. Robert A. Cecil, PhD, is CCF’s network director for radiology. Cecil says CCF is now storing about 20 terabytes of image data per year, but he sees that already large number doubling annually for the next few years. There are several reasons for the anticipated, dramatic increase: imaging studies are becoming more complex and lengthy, more modalities are being converted to a digital imaging format, and more studies are being ordered because more emphasis is being put on imaging as a diagnostic tool, Cecil says.
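
Cecil's growth figures lend themselves to a quick back-of-the-envelope projection. The sketch below assumes the numbers quoted above (20 terabytes in year one, doubling annually) and simply compounds them; the function name and defaults are illustrative, not from any CCF planning tool.

```python
# Project annual and cumulative archive growth, assuming the figures
# Cecil cites: 20 TB in the first year, doubling each year thereafter.
def project_growth(first_year_tb=20.0, years=5, growth_factor=2.0):
    """Return (per-year, cumulative) storage volumes in terabytes."""
    per_year = [first_year_tb * growth_factor ** n for n in range(years)]
    cumulative = []
    total = 0.0
    for volume in per_year:
        total += volume
        cumulative.append(total)
    return per_year, cumulative

per_year, cumulative = project_growth()
# Year 5 alone adds 320 TB; the five years together total 620 TB.
```

Even at these rough assumptions, the fifth year alone generates more data than the first four combined, which is why planners such as Cecil size archives far beyond the current year's need.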

Cecil says a storage strategy for any institution converting to an electronic archive is absolutely essential, and he offers a list of decision points to consider in formulating such a strategy:

  • Retention time. Decide first how long data will be kept, Cecil advises. This will provide a rough idea of the storage volume needed. The Health Insurance Portability and Accountability Act (HIPAA) requirements for paper and electronic data storage are equal; no distinction is made between the two types, Cecil says. For now CCF is committed to keeping its data on tape “forever,” Cecil adds. “We purchase an archive four to five times bigger than we think we will ever need over a 4- to 5-year period.”
  • Fill rate. Decide how much data will be stored in a year. To achieve economy of scale, combine departments into a single archive. “What I highly recommend, and what appears to be widely recognized, is that niche archival within a department is not the way to go,” Cecil says. He notes that turf battles between departments that store imaging data together, for instance radiology and cardiology, will evaporate as long as all data is available to the doctors who need it when they need it. CCF feeds data from a dozen divisions into its single archive, but the costs of the archive are split between radiology and cardiology. This is a convenience, Cecil says.
  • Storage media. At CCF, when images leave the modality, they go “to the nearest neighbor workstation,” says Cecil, and from there to a localized server. From the localized server, the images flow to all the other workstations on that campus. “Then the server immediately turns around and sends the image to the enterprise archive.” For 5 weeks or so, the images remain locally stored on RAID drives but are also on the central tape. For recent images, there might be five copies available from different disk drives and the tape, Cecil adds. The disks and the tape are all magnetic. The tapes are “enterprise class, meant-to-last-forever tapes” that can be accessed faster than some disk formats, Cecil says. “We have run through 5,000 of those and have yet to find a single bit error in the tape.”
  • Hierarchical criteria. At CCF there is no hierarchical strategy in use. The storage is strictly chronological. That way, says Cecil, “the newest data is always on the newest technology. The data in every logarithm of time is accessed equally. Eight to 16 years is accessed about the same as 4 to 8 years; that same ratio is pretty uniform right from the beginning.”
  • Redundancy. The Mayo Clinic duplicates its storage, but CCF does not. “What do we need redundancy for?” Cecil asks. “No institution has duplicate paper or duplicate film. Two archives separate and online become cost prohibitive.” An inexpensive disaster-recovery taping system would be of little use since there would be a time lag to recovery, making the system clinically useless, Cecil argues. “A 2-week disaster recovery does you no good on patient care,” he says. “You basically have to redo the examination in a disaster situation.”
  • Migration. CCF migrates its taped data every few years when new tape technology creates the financial incentive to do such a transfer. Cecil says, “The electronic media double capacity every 18 months, so you can migrate and still retain capacity. You have to upgrade media or drives every few years, but that’s only 10% to 15% of the incremental costs of the storage. The hardware costs and the software costs stay relatively constant. We have migrated twice to higher density tapes. You can do that with magnetic tape data because the transfer rates are very fast tape to tape. We wait until the next tape drive comes out and at a price we like. We buy it, and let the robot handle the transfer. In a week or a month, it finishes, and we can take all the old tapes and sell them to somebody else, after we’ve cleaned and erased them.”
  • Compression. At CCF the only compression used is reversible or lossless. “Compression destroys data. Sometimes it matters, sometimes it doesn’t,” Cecil says. “Selective compression is hard to do. The only alternative is not to compress. The physicians here are willing to read compression on a per-case basis, but they aren’t willing to write a standard that says compress all MRI by 10. Compression really limits your options and you’ve got to put a lot of time and effort into it. Why not spend that same time looking for cheaper tape drives?”
  • Security and access. “In a hospital anybody in a white coat gets anything he wants,” says Cecil. He adds that CCF sometimes partitions its archive for access and sometimes does not, but those partitions are made more for geographical than for security reasons, he adds. If security concerns become paramount, he says, there are easy solutions such as “secure sockets and passwords and log-ins.”
  • Cost. For some this may be the primary decision point. Cecil compares installing an archive to buying a PC. “A good PC is $1,000, and it’s been that same price for 10 years. But what you get for that money has changed drastically,” he says. “It’s the same way with a robot archive.” A system capable of storing 500 terabytes of accessible data might cost $500,000, he estimates, but for half the money you might get only 100 terabytes. Depending on need, there are economies of scale.
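
The first two checklist items, retention time and fill rate, reduce to simple sizing arithmetic. The sketch below illustrates Cecil's rule of thumb of purchasing four to five times the projected need over the planning period; the function and parameter names are illustrative assumptions, not taken from any vendor tool.

```python
# Turn a retention policy and an annual fill rate into a purchase size,
# using Cecil's rule of thumb: buy 4x-5x what you expect to need over
# a 4- to 5-year planning horizon (here a midpoint factor of 4.5).
def archive_purchase_size_tb(annual_fill_tb, planning_years, headroom=4.5):
    projected_need = annual_fill_tb * planning_years
    return projected_need * headroom

# e.g. 20 TB/year over a 5-year horizon -> buy about 450 TB of capacity
size_tb = archive_purchase_size_tb(20, 5)
```

The generous headroom factor is what allows CCF to commit to keeping data on tape "forever" without re-sizing the archive each budget cycle.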

“The big thing to stress,” Cecil sums up, “is to think big and plan for nonobsolescence from day one. Plan a policy that will let you keep your storage forever, or have a plan to roll it over. But plan to do that. Plan that in advance and the whole thing will become friendly and cost-effective.”
-George Wiley

HHSC now has narrowed a long search for archive vendors down to a handful of candidates. One criterion used was how well entrenched the vendor applicants were in the islands. It is a 5-hour flight from the mainland, Sato notes, so vendors cannot be hopping planes to cure system glitches. On the other hand, Sato adds, “Most of the majors are here already.”

Once HHSC decides on its hardware vendor, it will decide on its software vendor. Most hardware vendors can handle any demand for storage capacity easily, Sato says, “so the flexibility in the software becomes the most important factor.” Just what HHSC will ask that software to do is still being assessed. Sato estimates it will be another 2 years before the company has its PACS and storage media in place and the archive running.

While it is finalizing its vendor selection, HHSC is taking another step to guide the decisions it will have to make on software design. It is studying in detail current work flow at each of its key hospitals. How are images, including film, currently routed? Who looks at images, when, and where? “We are trying to clean up our manual systems to prepare for our electronic system,” says Sato. “We’re looking at everything from registration to charging to the transcription of dictation. We want to see the work flow before we go electronic. You get out there and select a vendor and then there are all these manual aspects that need to be cleaned up. We’re trying to do that beforehand. This will reduce our installation time.”

HHSC still has a lot of storage decisions to make. How to hierarchically prioritize the storage of images so that those needed most can be accessed quickest is, in Sato’s words, “a level of detail we haven’t got to yet.” Neither have decisions been made on the compression of images for less expensive storage, nor even on the types of media, such as disk or tape, to be used for storage. Others have crossed over those decision points, however, and they can offer guidance to those who, like HHSC, are in the initial stages of putting a storage system in place.

San Francisco

Katherine P. Andriole, PhD, is the PACS clinical coordinator and an associate professor of radiology at the University of California at San Francisco (UCSF). UCSF conducts about 250,000 radiological examinations per year, Andriole says, and estimating examination volume is a good starting point for deciding how to build a digital storage system. “Look at the volume you are doing now-then what subset of that do you need immediate access to?” she asks. “Relevant prior examinations are the key. For the same patient, you may want the previous chest, but you don’t need to see the foot from 2 years ago.” Andriole suggests planning for more storage than you think you will need. The newer imaging systems demand more storage capacity to add features like color. Andriole learned this the hard way. “We didn’t anticipate spiral CT,” she says. “Imaging is getting more and more intensive with larger and larger files. You need to plan for all that when you’re creating your archive, because it is an expensive piece of architecture.”

One common way to cut storage costs is to compress imaging data. This lets the facility keep more data online for immediate access. Compression comes in two forms: lossless (reversible) compression, which shrinks files modestly without discarding any information, and lossy compression, which achieves much higher ratios but degrades image quality. At UCSF, says Andriole, “the primary diagnosis is losslessly compressed at between 2:1 and 3:1. We then ship that losslessly off-site on tape. We think that’s going to be a one-way trip.” The images assumed to fall into the relevant prior category, those that may be needed for comparisons soon, are “wavelet compressed at a fairly high ratio” and kept available for quick access on RAID [redundant array of inexpensive disks] units on campus, says Andriole. The RAIDs store images, typically on magnetic disks, which can be accessed in a few seconds. Just how much the images stored on these devices at UCSF are compressed varies according to the modality producing the images, Andriole says. “If we compress at 25:1 on CR or DR, we feel comfortable, but for CT it’s only 10:1, and for MR only 5:1.” The far different allowable compression ratios per modality are one reason no single standard for compression has been developed by the American College of Radiology (ACR), Andriole says. Instead, the ACR leaves to the physician’s discretion what degree of compression results in a usable image, notes Andriole.
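
A per-modality compression policy of the kind Andriole describes can be expressed as a simple lookup. The sketch below uses the ratios quoted above (25:1 for CR/DR, 10:1 for CT, 5:1 for MR); the table, function name, and lossless fallback are illustrative assumptions, not UCSF's actual software.

```python
# Maximum acceptable lossy compression ratio per modality, per the
# UCSF figures quoted above. Modalities absent from the table fall
# back to 1 (lossless only), a conservative assumed default.
MAX_LOSSY_RATIO = {"CR": 25, "DR": 25, "CT": 10, "MR": 5}

def lossy_ratio_for(modality: str) -> int:
    """Return the highest lossy ratio considered acceptable for a
    modality; default to lossless (1:1) when no ratio is agreed upon."""
    return MAX_LOSSY_RATIO.get(modality.upper(), 1)
```

Encoding the policy as data rather than scattering ratios through the archive software makes it easy to adjust as physicians revise what they consider a usable image.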

The software to run an archive, says Andriole, is “tremendously important.” One key variable is how well software from one vendor can communicate with the software from another vendor. This is especially true where the storage archive is managed with software from a vendor that may be different from the vendor supplying the PACS software. “It is not an issue to take somebody’s archive and put it with somebody else’s PACS,” Andriole says. “The issue is the display function.” Do the archive images display properly on the PACS workstations?

The interplay between software is also vital when a facility is attempting an enterprise-wide data management system. Rather than taking data from radiology, pathology, cardiology and the like and storing data from all the departments together, the goal at UCSF is to create what Andriole calls a “virtual database” that lets the databases in various departments talk with one another. She uses the example of a patient scheduled for a CT abdomen scan with contrast. Does that patient have liver function problems that would rule out the use of contrast? The question can be answered if the clinical information system can be accessed by the radiologist. “You use software to connect databases and build a virtual database,” says Andriole. “A lot of people think you should buy a bigger box. I like to go to the approach that we can do it with software. We can optimize even what we have now using software techniques.”

The Mayo Clinic

The Mayo Clinic in Rochester, Minn, has household-name status. The clinic is a huge producer of radiological images, as many as 5 million per year, read by a staff of about 120 radiologists, according to Bradley J. Erickson, MD, a neuroradiologist who is in charge of the informatics laboratory at Mayo. He has headed a number of electronic imaging projects at the clinic, including the storage of digital images.

About 3 years ago, Mayo planners became convinced that “having an archive tied integrally to the PACS was not the right way to go,” says Erickson. There were several reasons for this. First of all, hardware technology, particularly for reading stations, was changing rapidly, and it did not make sense to have to adjust the archive to every new generation of reading stations. Particularly troublesome was the need to update compression instructions and make compression changes with every change in PACS hardware. “That’s one of the things we’ve learned, having run an archive for 12 to 15 years now. You don’t want to write your own compression algorithm and have to keep that forever,” says Erickson.

To solve the compression update problem, Mayo came up with a clever solution that other major data producers are now turning to as well. Erickson and his colleagues decided to store all images straight from the modalities in losslessly compressed form, that is, with no image information discarded. These master copies are stored on magnetic tape drives just like the tape drives used for all of Mayo’s record keeping. The taped images can be used for diagnosis, tumor measurement, and other needs where the degradation of lossy images would not be tolerable. The magnetic tapes used can be accessed in less than 5 minutes. But that is obviously too long a wait to get images that have to be read quickly. So Mayo records 2 years’ worth of its most recent images on RAIDs that can be accessed in less than 30 seconds. The images on the RAIDs are the ones used routinely. The RAID images, to cut costs, are compressed in a lossy format at about a 10-1 overall ratio when modalities are averaged, says Erickson. The taped images, the master copies, are accessed only when they are more than 2 years old and are no longer on the RAIDs.

Before deciding on the 2-year limit for RAID storage, Erickson and his staff ran computerized studies on use rates for all kinds of imaging. Rather than write an overly complicated software program to prioritize quick-access retention according to types of image, they decided on the flat 2-year retention window for all studies. “There were spikes in image retrieval at 6, 12, and 18 months,” says Erickson, “which reflected regularly scheduled follow-ups for a variety of diseases.” The Mayo team found that retrieval after 2 years occurred with only about 10% to 12% of images. This was a low enough rate to close the 2-year window on RAID storage. The real beauty of the system, though, was that retrievals from tape could be back-fed into the RAIDs and become part of the RAID data pool. And because the data on the tapes were not compressed much, the compression to put them back on the RAIDs for quick access could be done using the most recent compression algorithm that fit the latest PACS and RAID hardware. The lack of compression on the taped copies makes them compressible using the latest equipment. There is no old compression technology to get in the way when taped images are transferred to the newest hardware.
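
The two-tier retrieval path Mayo describes can be sketched as a simple routing decision: studies inside the 2-year window come from the lossy RAID copies, and anything older comes from the lossless tape masters (and, per the back-feed design above, would then be re-compressed onto RAID with the current algorithm). The function and names below are illustrative assumptions, not Mayo's actual code.

```python
# Sketch of Mayo's two-tier retrieval: a flat 2-year window on RAID,
# with older studies served from the lossless tape masters.
from datetime import date, timedelta

RAID_WINDOW = timedelta(days=365 * 2)  # the flat 2-year retention window

def retrieval_tier(study_date: date, today: date) -> str:
    """Return which storage tier serves a study of the given date."""
    if today - study_date <= RAID_WINDOW:
        return "raid"  # ~30-second access; lossy (~10:1) working copy
    return "tape"      # <5-minute access; lossless master, re-compressed
                       # and back-fed onto RAID after retrieval

tier = retrieval_tier(date(2000, 1, 1), date(2003, 1, 1))  # -> "tape"
```

The flat window keeps the routing logic trivial, which is exactly the trade-off Erickson describes: a slightly larger RAID pool in exchange for not maintaining per-study-type retention rules.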

Mayo is still looking at ways to upgrade its storage system. Currently, Mayo is using three separate PACS systems to handle imaging. The PACS hard drives themselves hold about a month’s worth of imaging for immediate access, but Mayo is studying doing away with even this short-term PACS archive. “One of the problems is that if we do that, we will become very dependent on the institution providing adequate bandwidth to all those images. Certainly, we need to do some tests to see if that can really work well,” Erickson says.

For now, Mayo is keeping its taped images indefinitely, but Erickson acknowledges that soon enough the clinic will have to deal with migrating data from old tapes to newer, better, faster tape drives. “We think it might take a year and a half to migrate all the data. We will want to have access quickly to the most recent data, so we’re looking at migrating in reverse chronological fashion,” Erickson says. Another big problem is how to update patient data between the hospital information system and the radiology information system with any migration. Will the different PACS systems, which have different compression techniques, be transferable with yet another compression instruction? “It seems like a simple thing, moving data from one tape to another, but when it takes a year or more, it becomes more and more complex,” Erickson says.

CONCLUSION

Building a storage archive is complicated and what works for one institution may not work for another. Still, in the broad picture, the basic hardware is out there. It is finding or creating the software that will have the information technology people scratching their heads.

George Wiley is a contributing writer for Decisions in Axis Imaging News.