A wave of technological advancements has delivered a handful of new modality technologies—digital mammography, multislice CT, and 3T MR—fueling imaging providers’ seemingly insatiable appetite for digital storage space. A 64-slice CT, for example, can generate between 1 and 3 gigabytes (GB) of data per patient study.

“As recently as 1 to 4 years ago, having 2 terabytes [TB] of storage was considered sufficient, but today, that is no longer the case, and many imaging facilities need three to five times that amount just to keep up with the data being generated by new imaging modalities,” says Hossein Pourmand, vice president of business development at Candelis Inc (Irvine, Calif). “We have customers in the cardiology field who are generating more than 1 TB of data per month with their 64-slice CT system.”

As a result, determining the best method for ensuring that a facility’s images are archived in a reliable fashion and in compliance with stringent HIPAA regulations, while still remaining accessible and secure, tops the agenda for many of today’s chief information officers (CIOs).

Despite the careful consideration the topic is given, this relatively recent, unexpected growth has left some facilities in a pinch. Organizations that just a few years ago invested heavily in tape-based storage technology are finding themselves in need of more space—and fast.

The Customer Is Always Right

“One of our customers invested in a PACS several years ago that was primarily based on tape technology, because at the time, the only medical images they were going to have to store digitally were from their CT, and that would be a relatively few number of images,” says Patrick Boyle, director of health care and life sciences at IBM (Armonk, NY). But the PACS’ success, coupled with the ever-increasing number of slices per study, began to overwhelm the capabilities of the new system. “It would take upward of 15 minutes to recall even their short-term images,” he explains, “and physicians don’t have a lot of patience for that.”

In fact, one of the primary considerations in choosing a storage solution is how it will affect the end user. It’s not just the abundance of images that is driving storage and archiving vendors to offer more solutions. As the world goes digital, users have come to expect access to their data anytime and anywhere, effortlessly and without delay.

“The most important thing is to understand what needs to be stored and what is an appropriate service level from a physician’s perspective for retrieval time of that stored data,” says Mike Battin, director of IT applications at Evergreen Healthcare (Kirkland, Wash).

Making sure everyone in the organization is able to stay connected so that caregivers can make the best patient care decisions in a timely manner is the ultimate goal of any storage solution. Exactly how each imaging center or hospital achieves this goal varies based on individual needs and requirements. Thanks to a range of products available on the market today, organizations can find one that best fits their needs.

“We have a range of systems to meet both the business and clinical needs of a health care organization, from products as small as an EMC CLARiiON AX150, which stores up to 6 TB of data on serial ATA, to a large-scale Symmetrix, based on the SCSI format, which the nationwide organizations might standardize on,” says Roberta Katz, director and global solutions leader of EMC Corp’s Healthcare and Life Sciences Group (Hopkinton, Mass). The latest addition to the product line is the CLARiiON CX3 UltraScale series, with 4-Gb/second throughput, designed for short-term cache needs.

IBM offers an array of open, standards-based storage hardware and software solutions—including the DS4000, DS6000, and DS8000 disk-based enterprise storage systems and the N series (NAS)—and a broad selection of tape solutions, from single drives serving a single server to enterprise-class libraries that store petabytes of data.

The company also offers sophisticated software-based storage and content management solutions like Tivoli Storage Manager, which supports more than 600 different storage devices, including tape and disk, from multiple manufacturers; and prepackaged storage solutions, including the Grid Medical Archive Server, which supports multiple storage technologies—from fibre and serial ATA disk drives to a variety of tape architectures.

Factors to Consider

Facilities should consider many factors when looking for storage solutions to meet their growing digital-imaging needs. Among the first things that need to be addressed is whether the purchase is intended to serve one department, or if it should be capable of providing enterprisewide access.

“Obviously, a departmental system is going to have fewer requirements than an enterprise system,” Boyle says. “So it’s an important question, because it requires our customers to think not just about their imaging data, but also about their entire strategy.” Historically, hospitals managed each department’s infrastructure—a lab network, physician order entry programs, one PACS for radiology, and another PACS for cardiology—as its own system. “Today, customers can’t afford that,” he says, “so they want to be able to buy into one enterprise storage strategy and then have all applications use that.”

It’s also imperative to decide how much growth will be expected from the system after it’s put in place. As a general rule, any storage product needs to be scalable to accommodate growth—increased patient load or the addition of new locations—not just throughout the organization but also within the department to absorb new modalities and improved technologies as they become available and widely used.

Preparing for the Worst

Ensuring the fastest possible retrieval of data takes a combination of software and hardware, and it requires moving away from some of the more traditional cost-efficient approaches, such as tape. Although budget-friendly, tape makes rapidly retrieving a single document, image, or study from miles of media all but impossible.

“It’s possible that an employee might accidentally delete a file, and if the center is using a tape backup, he or she would have to find that tape and then search through it to find the one file,” says Mitchell Goldburgh, senior vice president of marketing and sales for InSiteOne Inc (Wallingford, Conn); the company’s Recovery Plus solution provides real-time protection designed to safeguard data. “With digital storage and a simple call to a data center, we can help you repopulate that file or system.”

One file might be inconvenient, but even more daunting is the prospect of using tape to reload millions of files.

“More and more customers tell us that they know they need a better way to recover from a disaster, because if they need to pull 400 TB of information off of tape drives, it could take the rest of their lives to do that,” Boyle laughs.

IBM assists its clients through the entire planning process, in addition to offering technology to help create efficient and effective disaster-recovery models. “Seriously,” Boyle says, “it’s a very, very important consideration with customers—knowing how, in the event of a disaster, they will recover this massive amount of storage quickly.”

It’s also important to remember that “disaster recovery” isn’t necessarily limited to retrieving files after catastrophic events, such as hurricanes, floods, and fires. The ability to regain files lost in more mundane, day-to-day crises can be just as important. If local storage technology fails, networks go down, or scheduled maintenance takes a system offline, a properly designed archiving solution should be able to help guarantee that business continues.

Building a Better Archive

Today’s storage trends are compounded not only by the explosive growth of data, but also by the need to protect that data. Regardless of how it is captured, each file must fit into a disaster-recovery strategy.

Hierarchical Storage Rules

Storage prices have dropped, but the need has escalated. The following list includes a few rules that can be used to prioritize data, so that the most important studies are most readily available.

  1. Establish acceptable retrieval times. What are the service expectations of the most demanding clients? How long are they willing to wait for a study to load?
  2. Positive studies receive priority over negative studies.
  3. Diagnosis-related priorities, such as studies for cancer patients or chronically ill patients, can be established.
  4. Don’t forget the law. Be cognizant of state regulations on the length of time images must be stored.
  5. Determine the tolerable time to recovery in the event of a disaster.
  6. Determine how much data you can afford to lose.
  7. Consider your budget. Recalculate your priorities based on how much storage you can reasonably afford.
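The rules above can be sketched as a simple tiering function. This is a minimal illustration, not any vendor’s implementation; the `Study` fields, tier names, and thresholds are all hypothetical stand-ins for values a facility would derive from its service-level expectations (rule 1) and state retention law (rule 4).

```python
from dataclasses import dataclass

@dataclass
class Study:
    age_days: int    # days since the study was performed
    positive: bool   # positive diagnosis (rule 2)
    chronic: bool    # cancer or chronically ill patient (rule 3)

def storage_tier(study: Study) -> str:
    """Map a study to a storage tier using the prioritization rules above.

    Thresholds here are illustrative only.
    """
    if study.positive or study.chronic:
        # Priority studies stay on fast disk longer.
        return "online" if study.age_days <= 365 else "nearline"
    if study.age_days <= 180:
        return "online"    # recent studies are the most likely to be recalled
    if study.age_days <= 365:
        return "nearline"  # slower, cheaper disk
    return "archive"       # tape or other long-term media for legal retention

print(storage_tier(Study(age_days=300, positive=True, chronic=False)))   # online
print(storage_tier(Study(age_days=300, positive=False, chronic=False)))  # nearline
```

A real policy engine would also weigh budget (rule 7), recalculating the thresholds until the fast tiers fit what the facility can afford.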

One way of making this process happen is by archiving images using a predetermined hierarchy. Making this a reality involves segmenting the images into different categories, such as short- and long-term storage, as well as the diagnoses associated with each study.

“If a facility loses images, the most important thing to recover first is all the images where there was a positive diagnosis,” Boyle says. “We’ll get the ones that were negative later; but all the people who are sick, we need their images now.”

The need to sort studies in this way is another by-product of the massive number of images being generated. As the data sets continue to grow, so does the demand for some sort of filtering mechanism. Interest currently centers on data-storage options that do more than merely catalog an organization’s terabytes.

“With image archiving becoming a major issue for imaging facilities, ‘intelligent’ storage is viewed as a necessity for reducing the total cost of ownership while playing a significant role in information life-cycle management of medical images and data,” says Candelis’ Pourmand. “To meet this demand, the ImageGrid server appliance offers the ability to receive images directly from virtually any modality for comprehensive, purpose-built image and data management.”

IBM’s Grid Medical Archive Solution is a software-based, intelligent storage solution that makes automating storage of images possible by pulling details from the information incorporated in the study file.

“It has the ability to read the basic information from the DICOM header, such as the type of image, as well as the ability to read other information, like diagnostic notes that were part of the DICOM header,” Boyle says. Intended for long-term archiving of image data, it also allows customers to build rules around how they store image data.
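Rule-driven routing on header fields of this kind can be sketched as follows. Parsing the DICOM file itself is left to a library (pydicom, for example); here a plain dictionary stands in for the parsed header, and the rule predicates and storage-pool names are purely illustrative, not the fields or pools of any product named above.

```python
# Sketch of rule-based routing on parsed DICOM header fields.
RULES = [
    # (predicate on header, destination storage pool) -- illustrative only
    (lambda h: "CANCER" in h.get("DiagnosisNotes", "").upper(), "priority-disk"),
    (lambda h: h.get("Modality") == "CT", "fast-disk"),
]

def route(header: dict) -> str:
    """Return the first storage pool whose rule matches the header,
    falling back to the long-term archive."""
    for predicate, pool in RULES:
        if predicate(header):
            return pool
    return "bulk-archive"

print(route({"Modality": "CT"}))                                  # fast-disk
print(route({"Modality": "MR", "DiagnosisNotes": "cancer f/u"}))  # priority-disk
print(route({"Modality": "MR"}))                                  # bulk-archive
```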

Another contender in this new breed of discerning storage systems is InSiteOne’s InDex (Internet DICOM Express) service. The intelligent archive’s indexing software works with a facility’s existing front-end application to ensure access to critical data moving to the on-site screening archive, either on an on-demand basis that is transparent to the user or on a priority basis, using prefetches. The key is virtually unlimited on-site storage without any capital costs.
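The prefetch idea amounts to this: before scheduled appointments, a patient’s prior studies are pulled from the long-term archive into the local cache so they load without delay. A minimal sketch, in which the function name, IDs, and in-memory structures are all hypothetical:

```python
def prefetch_priors(schedule, archive, cache):
    """Copy each scheduled patient's prior studies into the local cache.

    `schedule` is a list of patient IDs for the coming day; `archive`
    maps patient ID to a list of study IDs held in long-term storage;
    `cache` is the set of study IDs already on fast disk.
    Returns the studies actually fetched.
    """
    fetched = []
    for patient_id in schedule:
        for study_id in archive.get(patient_id, []):
            if study_id not in cache:
                cache.add(study_id)  # stands in for the copy to fast disk
                fetched.append(study_id)
    return fetched

cache = {"st-001"}
archive = {"pt-A": ["st-001", "st-002"], "pt-B": ["st-003"]}
print(prefetch_priors(["pt-A", "pt-B"], archive, cache))  # ['st-002', 'st-003']
```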

And the Centera, developed by EMC, makes it possible to access all files with equal ease, regardless of when they were created and incorporated into the storage network.

“The value is that when a patient re-enters the system, you can have a longitudinal patient history,” Katz says. “Regardless of whether it’s for a particular episode or disease, it allows the clinician to obtain a full view of patient information, instantaneously.”

Have Priorities

With or without smart storage, getting images to the radiologist’s desktop in a timely manner also relies on filing images based on their access requirements. Studies show, for example, that the newer an image is, the more likely it is to be recalled from the system by a radiologist or other medical professional; the older the image grows, the lower the odds that it will ever be pulled from storage.

According to sources interviewed for this article, research has indicated that 6 months from the day a study is performed, there is about a 10% chance any one medical image will ever be recalled; after 1 year, that number drops to about 1%. At that point, the file is stored mainly to satisfy state and federal regulations, essentially putting the customer in the position of storing images for multiple years with the knowledge that he or she never will need to access them. The ideal way to store these images varies by organization.
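The recall figures cited above (roughly a 10% chance of recall after 6 months, about 1% after 1 year) make the tiering trade-off a back-of-the-envelope calculation. The dollar figures below are invented for illustration, not vendor prices: keep a study on fast disk only while its expected tape-recall cost exceeds the extra cost of the disk.

```python
DISK_PER_YEAR = 2.00   # illustrative extra cost to keep one study on fast disk
TAPE_RECALL = 30.00    # illustrative staff/latency cost of one recall from tape
RECALL_ODDS = {"6mo": 0.10, "1yr": 0.01}  # figures cited in the article

def keep_on_disk(age: str) -> bool:
    """Keep a study on disk only if its expected tape-recall cost
    exceeds the yearly premium for disk storage."""
    return RECALL_ODDS[age] * TAPE_RECALL > DISK_PER_YEAR

print(keep_on_disk("6mo"))  # True  (0.10 * 30 = 3.00 > 2.00)
print(keep_on_disk("1yr"))  # False (0.01 * 30 = 0.30 < 2.00)
```

Under these made-up numbers, a study migrates off fast disk somewhere between its sixth and twelfth month, which matches the intuition in the article: after a year, the file is kept mainly for the regulators.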

“It’s important to realize that the legal requirements vary from state to state about what needs to be stored and what doesn’t,” Evergreen’s Battin says. “It also depends on the legal approach of the organization, whether it is conservative and stores everything, or if it decides to store only what is absolutely mandated.”

Go Forth with Focus

Faced with such a multitude of choices, the best advice for facilities is to stay focused on what will best meet their needs, both for storage and archiving.

“It comes down to the requirements of a particular facility, their disaster-recovery or business-continuity plan, and how quickly they need to be able to recall lost images,” says Todd Thomas, CIO for Austin Radiological Association, a radiology group in Texas. When looking for disaster-recovery solutions, Thomas recommends that an organization establish a recovery-time objective (how quickly it needs to be able to bring those images back online) and determine how much data it can afford to lose if the system is down for a period of time. “Between those two decision points, that should lead an organization to a particular solution to satisfy their business requirements,” he says. “And then if they get sticker shock, they can adjust from there.”
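Checking a candidate solution against a recovery-time objective is simple arithmetic. The sketch below uses the 400-TB figure mentioned earlier; the 80-MB/second sustained restore rate and 48-hour objective are assumed values for illustration only.

```python
def restore_hours(data_tb: float, throughput_mb_s: float) -> float:
    """Hours needed to restore `data_tb` terabytes at a sustained
    `throughput_mb_s` megabytes per second (decimal units)."""
    megabytes = data_tb * 1_000_000  # 1 TB = 1,000,000 MB (decimal)
    return megabytes / throughput_mb_s / 3600

def meets_rto(data_tb: float, throughput_mb_s: float, rto_hours: float) -> bool:
    """Does the restore finish within the recovery-time objective?"""
    return restore_hours(data_tb, throughput_mb_s) <= rto_hours

# 400 TB restored at an assumed 80 MB/s, against a 48-hour objective:
print(round(restore_hours(400, 80)))  # about 1389 hours -- roughly 2 months
print(meets_rto(400, 80, 48))         # False
```

Running the numbers this way makes Boyle’s point concrete: hitting a short objective on hundreds of terabytes requires either far more parallel restore bandwidth or a disk-based replica, not a single tape stream.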

For even the most profitable organizations, cost usually plays a factor in what the final purchase involves. Fortunately, the cost of image storage has been dropping significantly, and most companies are able to strike a successful—and affordable—balance.

“With proper system assessment, management, and enterprise communication, you can modify your storage strategy to meet your financial goals by setting retrieval-time expectations using prefetch and hybrid storage solutions while gaining physician acceptance of the process,” Battin says.

And in the end, the best storage solution for a facility is more than just a price tag and a place to stash millions of images. “Access to images is the most valuable asset, because you’re treating a patient,” Goldburgh says. “And having full-integrity DICOM data is an invaluable asset in terms of liability and avoidance of future costs.”

Dana Hinesly is a contributing writer for Medical Imaging.