Running the Numbers
Product Showcase: Active Archive Appliance Protects Data for 50-Plus Years
Informatics Report: A Look Ahead to 2007
Product Showcase: New RadiForce Digital Mammography Monitors
Logical Images Receives $2.7 Million in Funding
Reducing Backup Headaches
Product Showcase: Data Distributing Destroys Disks With CD Wipeout

Running the Numbers

A total of 162 PACS and PACS-component companies exhibited at the 2006 Annual Meeting of the Radiological Society of North America (RSNA), compared with 169 at the 2005 RSNA Annual Meeting. Which companies did you see at the meeting, and who wowed you? E-mail your thoughts to Editor Andi Lucas: .

Product Showcase: Active Archive Appliance Protects Data for 50-Plus Years

Providing a new approach to keeping fixed-content data active and accessible on the network, PowerFile Inc, Santa Clara, Calif, has introduced its new Active Archive Appliance (A3). It stores frequently accessed documents, images, and media files in a primary cache for rapid retrieval, a much more time- and space-efficient alternative to storing excess data on tape.

The PowerFile A3 can store up to 30.6TB of data on DVD optical media.

A3 presents itself to the network as a standard network volume while committing permanent data to a virtualized array of DVD optical media. Pooling the DVD subsystem storage yields reliable volumes of up to 30.6 terabytes (TB), and archived information is protected from accidental erasure, unauthorized modification, data corruption, and viruses for as long as 50 years, according to PowerFile.
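
As a rough illustration of the cache-plus-archive model, the Python sketch below keeps permanent copies in an archive directory (standing in for the DVD tier) and promotes recently requested files into a fast cache directory; the class and paths are illustrative inventions, not PowerFile’s interface.

```python
import shutil
from pathlib import Path

class TieredArchive:
    """Illustrative model of a cache-plus-archive appliance.

    Permanent copies live in `archive_dir` (standing in for the DVD tier);
    frequently accessed files are promoted into `cache_dir` (standing in
    for the primary cache) so later reads are fast.
    """

    def __init__(self, cache_dir: str, archive_dir: str):
        self.cache = Path(cache_dir)
        self.archive = Path(archive_dir)
        self.cache.mkdir(parents=True, exist_ok=True)
        self.archive.mkdir(parents=True, exist_ok=True)

    def store(self, source: str) -> None:
        """Commit a file to the permanent archive tier."""
        src = Path(source)
        shutil.copy2(src, self.archive / src.name)

    def retrieve(self, name: str) -> Path:
        """Serve from the cache when possible; otherwise promote from the archive."""
        cached = self.cache / name
        if not cached.exists():
            shutil.copy2(self.archive / name, cached)  # cache miss: pull from archive
        return cached
```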

“The process of retrieving a file from our tape archive was such a hassle even after the tape was mounted and restored,” noted Terreyl Kirton of VT Graphics Inc, Lansdowne, Pa, in a press release. “With A3, we are now able to pull the content from our archives in seconds, much like any other network share in our environment. In addition to being a rock-solid system, this has to be the easiest storage system I have ever installed.”

A3 power usage is about 5% of that of typical spinning disk; PowerFile also estimates that the total cost of its system over 5 years is a fraction of the total cost of spinning disk. A “starter kit” offers 3.4TB of usable archive capacity at $15,000, and 1.7TB expansion kits are available for $5,900. For more information, visit www.powerfile.com.
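
Working only from the list prices quoted above, a quick back-of-the-envelope calculation shows what the archive costs per usable terabyte as it grows; the arithmetic below is an illustration added here, not a PowerFile figure.

```python
# Cost per usable terabyte, using the prices quoted above.
STARTER_TB, STARTER_PRICE = 3.4, 15_000       # starter kit
EXPANSION_TB, EXPANSION_PRICE = 1.7, 5_900    # each expansion kit

def cost_per_tb(expansions: int) -> float:
    """Blended $/TB after adding a number of expansion kits."""
    total_tb = STARTER_TB + expansions * EXPANSION_TB
    total_cost = STARTER_PRICE + expansions * EXPANSION_PRICE
    return total_cost / total_tb

print(f"Starter only: ${cost_per_tb(0):,.0f}/TB")   # about $4,412/TB
print(f"With 4 kits:  ${cost_per_tb(4):,.0f}/TB")   # about $3,784/TB
```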

Informatics Report: A Look Ahead to 2007

By Michael Mack

Well, it’s here already: December. In addition to bringing the holiday season, it’s also the time of year to look toward the future and what’s happening in the informatics space. Over the past 6 months, we’ve seen a few strong trends that I believe will continue to pick up speed in the new year.

First, cardiovascular information systems (CIS) have become, or are becoming, one of the key strategic initiatives at many hospitals. It seems that almost every hospital has been doing some level of digital capture and archiving, typically in the cardiac catheterization lab with single copies of CDs or DVDs. Each modality has been managed as its own silo and, in many cases, with disparate vendors for cath/hemo, echo, EKG, and nuclear medicine. Any retrieval of the data has been cumbersome and could require delving into different archives to see multiple modalities (eg, echo and EKG).

The recent wave of acquisitions among these vendors has begun to bring these solutions together. Post-acquisition, these vendors are releasing software applications that allow much tighter integration, enabling CIS to be realized. CIS is moving toward automating the capture of all of this cardiac data.

In the cath lab, the images and hemodynamic data are being archived and then combined with structured reporting, which automates generation of the cardiologist’s report. The data is being transferred electronically into the site’s registry, eliminating manual entry. Many sites also are using this data to interface with their inventory systems, updating supplies used and inventory on hand. Additionally, the data can be used to generate a charge report that is then passed into the billing systems, a process that typically has been done manually. The overall goal is to bring the CIS and all of its modalities into the facility’s EHR to provide clinicians and the referring community with rapid access to all patient data, reports, and images. It is the technical advancement of all of these pieces that is making CIS a trend.
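
As a rough sketch of the data flow described above, the following Python passes a single hypothetical cath-lab case through reporting, registry, inventory, and billing steps. The record fields and function names are invented for illustration; a production CIS would use HL7 or vendor interfaces rather than plain dictionaries.

```python
from dataclasses import dataclass, field

@dataclass
class CathProcedure:
    """Hypothetical record of a single cath-lab case."""
    patient_id: str
    images: list = field(default_factory=list)        # references to archived images
    hemodynamics: dict = field(default_factory=dict)  # measured values
    supplies_used: list = field(default_factory=list)

def build_structured_report(case: CathProcedure) -> dict:
    """Combine images and hemodynamic data into the cardiologist's report."""
    return {"patient": case.patient_id,
            "findings": case.hemodynamics,
            "image_refs": case.images}

def submit_to_registry(report: dict) -> None:
    print(f"Registry entry filed for {report['patient']}")  # replaces manual entry

def update_inventory(case: CathProcedure) -> None:
    for item in case.supplies_used:
        print(f"Decrement stock: {item}")

def generate_charges(case: CathProcedure) -> list:
    """Build a charge report to pass to the billing system."""
    return [{"item": item, "charge_code": "TBD"} for item in case.supplies_used]

# One case flowing through the automated chain.
case = CathProcedure("PT-001", images=["run1.dcm"], hemodynamics={"LVEDP": 12},
                     supplies_used=["stent", "guidewire"])
submit_to_registry(build_structured_report(case))
update_inventory(case)
charges = generate_charges(case)
```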

Second, more and more sites are now ready, or will be in 2007, to upgrade their existing radiology PACS. Whether or not the decision is to stay with the current vendor, the upgrade usually will require some type of data-migration strategy. This involves several steps, including a data inventory, a data quality assessment, and a risk analysis. The information gathered during these steps will determine not only what you can do, but also how long the migration will take. Depending on your state’s retention requirements, some older data might not need to be migrated; even so, older media could still contain pediatric studies that must move. Large archives—with more than 10 terabytes of data—easily can take more than 1 year to migrate.
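
A simple way to gauge migration duration is to divide the archive size by what the legacy archive can actually deliver during a nightly transfer window. The sketch below uses the 10TB figure from the article but assumes an effective retrieval rate of 1MB per second, which is an illustrative assumption, not a measured number.

```python
# Back-of-the-envelope migration duration, assuming off-hours transfers only.
ARCHIVE_TB = 10                  # size of the legacy archive (from the article)
NIGHTLY_WINDOW_HOURS = 8         # assumed off-hours window per night
EFFECTIVE_MB_PER_SEC = 1         # assumed retrieval rate from the legacy archive

tb_per_night = EFFECTIVE_MB_PER_SEC * 3600 * NIGHTLY_WINDOW_HOURS / 1_000_000
nights = ARCHIVE_TB / tb_per_night
print(f"{tb_per_night:.3f} TB per night -> about {nights:.0f} nights ({nights / 365:.1f} years)")
```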

During the migration, you also will want to “clean” the data, meaning correcting any data elements that were not pristine when archived originally. This step, too, will add to the time required for migration. Financially, the professional services needed to migrate the data are expensive; however, it also is expensive to keep the older legacy archive running, and in some cases the hardware already has been deemed obsolete. Data transfer typically is performed during off-hours to minimize the impact on systems and the network, and additional hardware can be purchased to allow a more rapid migration if required.
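
Cleaning often amounts to normalizing patient and study identifiers before a study is rewritten to the new archive. Below is a minimal sketch that assumes the studies are reachable as DICOM files and that the open-source pydicom library is available; the corrections shown are invented examples.

```python
import pydicom

def clean_study(path: str, corrections: dict) -> None:
    """Apply known corrections to a DICOM file before re-archiving.

    `corrections` maps a DICOM keyword to its corrected value, for example
    a patient name that was misspelled when the study was archived.
    """
    ds = pydicom.dcmread(path)
    for keyword, value in corrections.items():
        setattr(ds, keyword, value)
    ds.save_as(path)

# Hypothetical fix list built during the data quality assessment step.
clean_study("study_001.dcm", {"PatientName": "DOE^JANE", "AccessionNumber": "A123456"})
```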

A final trend is that we will continue to see third-party software applications migrate onto PACS workstations, with an ongoing improvement in the level of integration. Most of these packages either are moving or soon will move to concurrent licensing, removing the inflexibility of per-seat licensing. In many cases, this additional flexibility at any workstation can require more robust hardware and also can impact your post-PACS workflow.

On a final note, I have been writing this column for the past 2 years, and I want to thank everyone who has told me that what they have read has been useful information. In October, I purchased The Thomas Group; because I will need additional time to manage the company, I will be unable to continue writing this column. I wish you all a prosperous 2007.

Michael Mack is president and CEO of the Thomas Group Ltd, Anaheim, Calif. With 20-plus years of experience in medical imaging, Mack now specializes in PACS planning and implementation. For more information, contact .

Product Showcase: New RadiForce Digital Mammography Monitors

The RadiForce digital mammography monitors from Eizo Nanao come in both a glare and an anti-glare configuration.

A new set of monitors from Eizo Nanao Technologies Inc, Cypress, Calif, boasts 5-megapixel monochrome LCD panels; the RadiForce G5510 and G5510-G are for use with digital mammography, DR, and CR.

Because digital mammography involves the visualization of subtle calcifications and masses, every pixel counts; to address this and other accuracy concerns, Eizo now offers its latest monitor technology in both a glare and an anti-glare configuration. The anti-glare configuration (G5510) is designed, as with most monitors, for use in bright environments; the glare configuration (G5510-G) features a smooth surface that, unlike a waffled anti-glare coating, produces no diffused reflection, making interpretation easier in dark environments.

The glare panel also offers an improved modulation transfer function (MTF), a specification that combines a panel’s resolution and contrast data into a single figure. Both new monitors include a digital uniformity equalizer to ensure consistent luminosity across the LCD panel. Eizo also offers two graphics boards, both of which contain twin DVI outputs and support portrait and landscape viewing without additional software. Flexible arm- and wall-mount options are available, as are both clear- and blue-base panels.

For more information, visit www.radiforce.com.

Logical Images Receives $2.7 Million in Funding

Logical Images Inc, Rochester, NY, has closed a financing deal with Ticonderoga Capital, Wellesley, Mass. Under the new agreement, the private equity firm will provide $2.7 million in funding for Logical Images’ product development, sales force expansion, and entrance into new markets. As part of the investment, Ticonderoga Partner James Vandervelden has joined Logical Images’ board of directors.

“We are attracted to Logical Images because it has a valuable visual clinical decision support tool; a strong management team; and an impressive, rapidly growing customer base, which includes more than 350 hospitals in the United States and 10 other countries, along with the US military and several state and local health departments,” Vandervelden said in a press release. “The Logical Images management team has accomplished a lot since the company introduced VisualDx in 2001. We plan to build on this momentum and help Logical Images capitalize on the promising opportunities it has for explosive growth in the future.”

The Rochester Business Journal reports that Logical Images will more than double its staff over the next year, with a focus on boosting sales and entering new markets. Arthur Papier, CEO and cofounder of Logical Images, told the Journal that the company’s VisualDx software system is undermarketed.

The VisualDx system aids physicians in decision-making using a database of more than 13,300 medical images, covering 800-plus conditions. The company hopes to build a consumer version of its product, allowing individuals to self-diagnose and self-treat; it expects to bring this version online by the end of the year or early 2007. For more information, visit www.logicalimages.com.

—C. Vasko

Reducing Backup Headaches

By Anne Rawland Gabriel

Norton Healthcare could save six figures by reconfiguring its storage system

With a new radiology PACS in the pipeline and a recently installed cardiology PACS already piling up the terabytes (TB), Norton Healthcare, Louisville, Ky, found itself at a crossroads in late 2005. Norton’s aging tape-based backup system was becoming overwhelmed, but simply replacing it wouldn’t keep up with demand.

“By that time, we had nearly 40TB total for all applications, including our cardiology PACS and HIS,” says Brian Comp, division director of technology services at Norton, Kentucky’s largest health care provider, which posted $3 billion in total revenues last year.

“Our nightly backups were colliding with production,” Comp says. “This created some exposure, because some servers weren’t getting backed up often enough to meet operational recovery objectives.” Plus, in the event of a catastrophe, disaster recovery (DR) could have taken weeks.

Operational recovery refers to recovering from day-to-day mishaps, such as a deleted report or a system crash. DR is restoring an entire data center—hardware, software, and contents. Before Hurricane Katrina, relatively few organizations had adequate DR plans. Many had operational recovery gaps as well.

And, the problem isn’t simply logistics—backup tape is notoriously easy for humans to damage or misplace.

Brave New World

Fortunately, next-generation disk-to-disk (D2D) technologies are now within reach. Just as it sounds, D2D is the process of backing up information to disk-based libraries rather than to tape. Until recently, D2D barriers included the relatively high cost of hard-disk capacity compared to tape and the inability of backup software to write directly to disk.

With these hurdles largely overcome, D2D’s attractions include speed, scalability, and efficiency. Because disk drives process information faster than tape drives, backup windows shrink dramatically, and systems can be restored and brought back online far more quickly. Then, in the background, the backed-up files are moved off to tape (D2D2T), and the tapes are transported to an off-site location. Or, in an even more advanced setup, backed-up files are transmitted to hard disks at a distant site (D2D2D), theoretically eliminating tape altogether.
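
The toy Python sketch below illustrates the D2D2T flow at small scale: files are first copied to a dated staging folder on the disk library (the fast tier), and the staged set is then bundled into a compressed archive standing in for the tape clone that would travel off-site. It is a conceptual illustration, not the workflow of Networker or any other product; the file names and paths are invented.

```python
import shutil
import tarfile
from datetime import date
from pathlib import Path

def backup_to_disk(sources: list, staging_root: str) -> Path:
    """D2D step: copy source files into a dated folder on the disk library."""
    dest = Path(staging_root) / date.today().isoformat()
    dest.mkdir(parents=True, exist_ok=True)
    for src in sources:
        shutil.copy2(src, dest / Path(src).name)
    return dest

def clone_to_tape(staged: Path, tape_out: str) -> None:
    """D2D2T step: bundle the staged copy for off-site tape (a compressed
    tar file stands in for the tape cartridge here)."""
    with tarfile.open(tape_out, "w:gz") as tar:
        tar.add(staged, arcname=staged.name)

# Invented file and folder names, purely for illustration.
staged = backup_to_disk(["report.txt", "image.dcm"], "disk_library")
Path("tape_clones").mkdir(exist_ok=True)
clone_to_tape(staged, f"tape_clones/{staged.name}.tar.gz")
```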

In Norton’s case, a variety of forces converged to require a systemwide technology refresh, and the cardiology PACS’ voluminous data output alone made effective backup technology a significant component of it. Norton began working with its dominant storage vendor, EMC Corp, Hopkinton, Mass, and decided to make the leap to D2D2T. The facility selected EMC’s Clariion CX3 disk library, with a 4Gb/s transfer rate, pairing it with an upgrade to the department’s existing backup software, EMC’s Networker.

As data center renovation proceeded throughout the first half of 2006, Norton tested and rolled out D2D2T backups for what had grown to 45TB. “We’re a very cautious organization,” Comp says. “As government regulations and operational requirements change, it changes the way we store data, the way we replicate it, and how we respond to problems.”

Radiology PACS Arrives

One D2D2T payoff came when Norton’s radiology PACS arrived. Installation and testing were completed during May and June of 2006, with phased “go-live” beginning in July. “We squashed initial implementation into an extraordinarily short time frame,” says Sean O’Mahoney, senior project director for IT at Norton.

Prior to adding the mammography module in September, the radiology PACS generated about 1.2TB of new data per month during July and August. “Mammography studies are larger than [those from] other equipment due to the resolution required,” O’Mahoney says. “Based on our initial calculations, the radiology PACS will generate 12 to 15TB per year, and we think it’s going to be higher.”
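
Taking the quoted figures at face value, a short projection shows why the team expects the annual total to land above the initial 12TB to 15TB estimate once mammography is included; the 25% mammography uplift below is an assumption chosen purely for illustration.

```python
# Projected annual volume from the monthly rate quoted above.
BASE_TB_PER_MONTH = 1.2      # July/August rate, before mammography
MAMMO_UPLIFT = 0.25          # assumed 25% increase once mammography is added

baseline = BASE_TB_PER_MONTH * 12
with_mammo = baseline * (1 + MAMMO_UPLIFT)
print(f"Baseline: {baseline:.1f} TB/year")            # 14.4 TB/year
print(f"With mammography: {with_mammo:.1f} TB/year")  # 18.0 TB/year
```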

Of course, D2D2T backups also are paying broader dividends. “Where it used to take us 8 hours to back up a particular application, now it takes only 2,” Comp says. “Plus, Networker gives us about 2:1 compression, so we purchased a much smaller [Sun StorageTek] tape library for tape cloning than we would have otherwise.”

In addition, despite an overall data growth rate of 5% to 10% per month, two FTE backup administrators are being redeployed as more than just tape jockeys. “We want our personnel resources to do more important things, like investigating failed backups,” Comp says.

As for restores, in Norton’s environment, they’re less of an issue. “Clinical data rarely requires recoveries,” Comp says. “Mostly, it’s personal user data stored on home directories, which we’ve moved off to NAS [network-attached storage] appliances with checkpoints, so now our help desk crew can restore someone’s spreadsheet most of the time.”

Not surprisingly, there are still some hurdles. “First, we have some proprietary applications that Networker handles poorly,” Comp says. “For those, we’re working with our local providers on alternative backup and redundancy solutions, which will be costly in the short term.”

In addition, Networker has reporting challenges. “Reports are cumbersome to navigate, and you have to dig down to see what’s working, what failed, what’s the speed, et cetera,” he says. “Ideally, we’d like something graphical, and EMC has come to the table with third-party solutions that we’re evaluating.”

Regardless of annoyances, Comp isn’t looking back. “D2D2T is permitting us to stay ahead of growth that’s exceeding projections while still keeping pace with compliance requirements,” he says. And, it has eliminated the incalculable cost of a mishap that could have occurred because of the earlier backup exposure.

In fact, Comp estimates that the D2D2T setup will provide significant operational savings by reducing tape requirements, backup administration, and restore times. He says, “These savings could run well into the six-figure range over the life of the system.”

The Next Phase

With Norton’s total storage needs expected to top 65TB by the end of 2006, Comp and O’Mahoney already have turned their attention to a DR implementation for 2007. Estimated to cost more than $6 million in capital and ongoing operational dollars, the solution will include a secure “hot site” where Norton’s entire IT system—hardware, software, and content—will be duplicated. In short, they are creating a mirror-image data center for user-transparent failover.

“In our area of the Midwest, tornadoes are our biggest concern,” O’Mahoney says. “But, here at our main data center in downtown Louisville, flooding is the biggest threat, because we’re near the Ohio River.”

A key component will be a distributed D2D2D backup system, Comp says. Because distributed D2D2D essentially means backing up over distance, it requires significant infrastructure investments as well as leading-edge software to compress, encrypt, and decrypt data for safe and speedy transfers.
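
As a minimal sketch of the compress-then-encrypt step such software performs before data crosses the wire, the following uses Python’s zlib module and the third-party cryptography package’s Fernet recipe. It illustrates the principle only and is not the tooling Norton selected.

```python
import zlib
from cryptography.fernet import Fernet  # third-party: pip install cryptography

def prepare_for_wan(payload: bytes, key: bytes) -> bytes:
    """Compress, then encrypt, a backup chunk before replicating it off-site."""
    return Fernet(key).encrypt(zlib.compress(payload))

def restore_at_hot_site(blob: bytes, key: bytes) -> bytes:
    """Reverse the pipeline at the remote data center."""
    return zlib.decompress(Fernet(key).decrypt(blob))

key = Fernet.generate_key()            # in practice, managed by the backup software
chunk = b"example backup block" * 1000
wire_blob = prepare_for_wan(chunk, key)
assert restore_at_hot_site(wire_blob, key) == chunk
print(f"{len(chunk)} bytes -> {len(wire_blob)} bytes on the wire")
```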

Ironically, a hot site is unlikely to spell the end for much-maligned tape. “For the near term, there will be systems that make sense to continue backing up as D2D2T,” Comp says. “However, sometime in the next 5 to 10 years, the technology may make disks so affordable that tape might finally be obsolete.”

Anne Rawland Gabriel is a contributing writer for Medical Imaging. For more information, contact .

Product Showcase: Data Distributing Destroys Disks With CD Wipeout

The CD Wipeout device grinds disks from the top layer down, removing both external labels and the data layer.

New from Data Distributing LLC, Laguna Hills, Calif, is the CD Wipeout device, which offers an efficient, HIPAA-compliant destruction system for patient-data CDs. The company notes that even disks that were partially burned and then rejected could still contain extractable patient data, and the number of disks requiring destruction thorough enough to maintain patient privacy is growing as digital imaging takes root.

“CD Wipeout is a disk-destruction device that destroys CDs to the extreme degree required by HIPAA,” said Jennifer Coleman, marketing manager for Data Distributing, in a press release. “Noncompliance with HIPAA’s wrongful disclosure standards, even in unintentional situations, carries a hefty fine. CD Wipeout renders the best possible results for facilities to protect themselves.”

The device works by grinding the CD from the top layer down to remove the indelible printed label while simultaneously destroying the data layer, which resides just below the surface. For more information, visit www.datadistributing.com.