
PACS viewing, archiving, and networking technologies have advanced significantly during the past 18 months, with further impressive gains just around the corner, experts say. Enterprises currently availing themselves of these technologies are, as a result, finding it ever easier to cost-efficiently deliver better service to patients and referring physicians alike, according to David S. Channin, MD, associate professor of radiology at Northwestern University’s Feinberg School of Medicine, Chicago.

One important leap forward has been the advent and subsequent growing acceptance of flat-panel, liquid crystal display (LCD) monitors in primary diagnosis. “The transition to flat-panel LCD technology is very rapid,” says Steven Horii, MD, professor of radiology at the University of Pennsylvania Medical Center, Philadelphia. “Flat-panel display costs are still higher than CRTs, but are dropping rapidly. The longer life and lower maintenance costs for flat-panel displays may offset their higher initial cost compared to CRTs.”


Meanwhile, much attention also has been garnered of late by developments in viewing-related software. On this front, the big news is a further proliferation of web-based image distribution, Horii reports. “High-quality, web-based image viewers, with their promise of improved speed, functionality of image manipulations and measurements, and quality, are forcing a reconsideration of the traditional ‘thick client’ PACS design,” he says. “Workstations are PCs with high-quality displays, not mini-supercomputers with proprietary communication and display technologies. Moreover, web-based, ‘thin-client’ workstations are much less expensive than thick-client systems. They are also simpler to maintain and, some believe, less expensive to develop software for.”


In the opinion of Bruce I. Reiner, MD, the most significant advance in viewing software is still for the most part on the drawing boards: programming to render PACS capable of reformatting huge, multi-slice image sets into three-dimensional presentations.


“Vendors right now are working to migrate from a traditional display-and-interpretation paradigm, one where images are displayed in a single axial plane, to a newer display presentation state using 2D and 3D multi-planar reconstruction,” says Reiner, director of research for the VA Maryland Healthcare System in Baltimore and an associate professor with the Diagnostic Radiology Department at the University of Maryland School of Medicine. “The reason for this endeavor is quite simply that data sets themselves are going from 100 images per study to 1,200 images per study. It’s impractical for a radiologist to be looking at that many individual images one at a time as would be done typically in a traditional axial plane environment. And data sets are not only increasing in size, they’re also increasing in complexity because of things like fusion imaging where you have CT and nuclear medicine combined.”

By combining multiple images into a reformatted 3D view, radiologists would be able to read a full 1,200-image data set in a much more time-efficient manner, in about the same amount of time it currently takes them to read a 100-image data set, Reiner postulates. “A concern for the radiology profession has been this 10- or 12-fold increase in the amount of time required to read one of these massive image data sets,” he says. “The question is, will radiologists actually look at all of the images or will they take shortcuts, such as looking only at every third image, in order to save time and maintain productivity? If the answer is shortcuts, then we must worry about the medical-legal liability ramifications. Changing the way we approach the image display presentation, yet still making sure the information reaches the radiologist in a more timely fashion while potentially enhancing the overall accuracy of the interpretation, is going to be a good solution to the problem.”
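Reiner’s 10- or 12-fold figure can be sketched in a few lines. The per-slice reading time below is an assumed number for illustration only, not a figure from the article; the point is that per-image reading scales linearly with study size, which is exactly what 3D reformatting aims to break.

```python
# Back-of-envelope sketch of the reading-time problem Reiner describes.
# SECONDS_PER_IMAGE is an assumed illustrative value, not a measured one.

SECONDS_PER_IMAGE = 2.0  # assumed time to review one axial slice

def reading_minutes(num_images: int,
                    seconds_per_image: float = SECONDS_PER_IMAGE) -> float:
    """Estimated minutes to review a study one slice at a time."""
    return num_images * seconds_per_image / 60.0

print(f"100-image study:   {reading_minutes(100):.1f} min")
print(f"1,200-image study: {reading_minutes(1200):.1f} min")
# The 12-fold jump in reading time is what multi-planar/3D
# reformatting is intended to avoid.
```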


Reformatting huge image data sets into 3D presentations is especially feasible given the computing power right now available but underutilized on the typical PACS, Reiner contends. He asserts, however, that the biggest hurdle to making multi-planar presentation a readily available product is the lack of clinical standards to guide the software developers.

“No studies have been done to date to identify the end-goal and show us where we want to be,” says Reiner, this year’s chair-elect of the Society for Computer Applications in Radiology. “As a result, vendors are having trouble deciding how to best go about developing the application. They’re trying to come up with this solution but doing so without direction. What they need to do is go back to the clinicians and ask what those clinicians want to see on screen and how they want it delivered to them. The biggest players are doing just that by going to the large academic sites for input.”


During the past year and a half, there also has been dramatic progress in PACS archiving. Says Channin, “The technology curve of storage continues to exceed Moore’s Law for transistor density. Therefore, storage is becoming a nonissue for PACS. [Archiving products such as] NAS, SAN, and RAID are all mature enough and evolving fast enough to meet a PACS need. This has opened up a plethora of opportunities for new storage models.”

Among those new models are all-online systems. According to Horii, these obviate the need for mechanical jukeboxes by adding more hard-disk storage about once every 2 years. “[With this approach], no imaging studies ever have to be retrieved from a long-term or off-line storage facility,” he explains. “This is typically accomplished by adding an off-site, high-capacity, write-once storage device with ‘shelf storage’ of the media. Since expected use of such media is low, shelf storage is adequate.

“Concerns about viruses and disaster recovery have led to hybrid design proposals. These storage systems are being proposed using network attached storage (NAS) and storage area network (SAN) designs, depending on performance requirements and storage system uses.”

Horii is convinced that all-online storage systems also are forcing a change in the way PACS designers think about archives and hierarchical storage management (HSM) processes. “If studies are all online, the need for conventional HSM diminishes,” he says. “Meanwhile, the overall cost of a storage facility may also be reduced, since robotic jukeboxes tend to be expensive, and failure-prone because of their complex machinery.”

Horii predicts that lower storage costs will have immediate benefits. He cites as evidence a study by Langlotz et al from several years ago demonstrating that the archive was one of the major cost factors in a PACS. “Being able to reduce that cost can shorten the time to financial break-even,” he says. “But what has to be examined carefully is whether the image generation rate by radiology is exceeding the growth rate of storage devices. With multi-detector CT and multi-pulse-sequence MRI, image volumes have grown by an order of magnitude. This is likely greater than the growth rate of data storage devices. At present, the doubling time for magnetic disk storage, for example, is about 18 months and there are limits to growth imposed by the physics of magnetic recording. It will, however, be several years before these limits are hit.”
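Horii’s comparison of the two growth rates can be made concrete with a quick calculation. The 5-year horizon below is an assumption chosen for illustration; the 18-month doubling time and the order-of-magnitude jump in image volume are his figures.

```python
# Rough comparison of disk-capacity growth vs. image-volume growth,
# using Horii's figures: disk capacity doubling roughly every 18 months,
# versus a ~10x jump in image volume from multi-detector CT and
# multi-pulse-sequence MRI. The 5-year horizon is an assumption.

def growth_factor(doubling_months: float, horizon_months: float) -> float:
    """Capacity multiple after horizon_months, doubling every doubling_months."""
    return 2.0 ** (horizon_months / doubling_months)

disk_growth_5yr = growth_factor(18, 60)
print(f"Disk capacity over 5 years: ~{disk_growth_5yr:.1f}x")
# A single order-of-magnitude (10x) jump in image volume therefore
# consumes roughly five years of disk-density progress all at once.
```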


In the realm of PACS networking options, Keith J. Dreyer, MD, PhD, director for medical imaging, Partners HealthCare System, Boston, and vice-chairman of the radiology informatics division, Massachusetts General Hospital, is most enthusiastic about switched gigabit Ethernet.

“It’s being deployed and is running in several locations, and has surpassed ATM (asynchronous transfer mode) in speed,” he says. “I’ve lately even seen 10-gigabit Ethernet. That might be overkill for PACS applications today, but with the volume of megabytes running in and out of these CT scanners, coupled with the coming proliferation of 3D visualization tools, there is indeed going to be a need for higher bandwidth in the networks.

“However, a 10-gigabit option here at Massachusetts General might very well be overkill. We do more than 500,000 studies a year on a single PACS, and so our utilization brings us just to the threshold of maxing out our 100-megabit switched Ethernet. In our situation, we have stepped up to 1-gigabit but only at key locations, not at every desktop. Now we have tremendous capacity. So, I don’t know that we’ll have a need for gigabit everywhere, unless we were running 3D visualization everywhere and also starting to bring in cardiology and some real-time examinations.”
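A quick back-of-envelope check shows why 500,000 studies a year can press against a 100-megabit link. The average study size and the peak-to-average burstiness factor below are assumptions for illustration, not figures from the article.

```python
# Rough check of Dreyer's point that ~500,000 studies/year can approach
# the limits of 100-megabit switched Ethernet. MB_PER_STUDY and
# PEAK_TO_AVERAGE are assumed illustrative values.

STUDIES_PER_YEAR = 500_000
MB_PER_STUDY = 100        # assumed average study size, megabytes
PEAK_TO_AVERAGE = 8       # assumed burstiness during working hours

SECONDS_PER_YEAR = 365 * 24 * 3600

avg_mbits = STUDIES_PER_YEAR * MB_PER_STUDY * 8 / SECONDS_PER_YEAR
peak_mbits = avg_mbits * PEAK_TO_AVERAGE

print(f"Average load: ~{avg_mbits:.0f} Mbit/s")
print(f"Assumed peak: ~{peak_mbits:.0f} Mbit/s")
# Under these assumptions, peak traffic alone nears a 100 Mbit/s link,
# before adding 3D visualization or real-time examinations.
```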

Dreyer says there are not any daunting challenges posed by converting to switched gigabit Ethernet from switched 100-megabit Ethernet. “It’s very straightforward,” he assures. “Product is available to allow for switching out 100-megabit blades in their devices in the closets and replacing them with gigabit blades. The good news is you don’t have to switch the wiring throughout your infrastructure; for the most part, the wiring that exists and supports 100-megabit will also support 1-gigabit.

“As for expense, costs are limited to within the closets and within the computers to make sure the network and network interface cards can communicate at 1-gigabit. And all those prices are dropping continuously.”

Another relatively recent network development cited by Dreyer is the growth in wide area network (WAN) options.

“It’s becoming cheaper to run a T1 or T3 line, and I’m also seeing wider use of the Internet, with HIPAA security, to be able to transfer information. In time, it may become the WAN of choice.”

Dreyer also speaks favorably of the movement toward wireless networking. “People are soon going to be looking at the cost of ownership involved in tying down a desktop to a wire and having that in several different places versus giving an individual a computing device that is wireless and that can be carried with them everywhere: the office, the OR, the clinics, while they round,” he says. “For now, people still feel comfortable having that direct line and the PC, but if we watch the trend over the next year or two, you’ll see them go to wireless in large numbers. And they’ll do it because the cost of putting in a wireless hub is essentially the same as the cost of putting in a wired hub. If you are simply viewing images on a browser-based application, it may even be cheaper to have a wireless tablet device than a full desktop.”


Experts generally give a hearty thumbs-up to the headway achieved by the Integrating the Healthcare Enterprise initiative.

“Integration [as pursued by the IHE initiative] allows for increased automation and, with it, increased productivity. There are now studies emerging that show this to be true for technologists and radiologists,” says Horii.

Channin indicates that purchasers of equipment are beginning to understand the importance and value of specifying IHE in their contracts and RFPs. “This is the only way to change the priorities of the vendors such that they invest more development dollars in [IHE] connectivity and communication of their devices, as opposed to the physics and other clinical features,” he says.

Reiner is sure the IHE initiative will empower individual customers.

“Any time you introduce standards, you level the playing field by moving from having everything being proprietary to having more commonality among vendors,” he says. “Eventually, if the IHE initiative succeeds, you’ll be able to buy PACS hardware off-the-shelf and at a fraction of today’s costs because vendor products will be largely interchangeable. That will be very beneficial for sophisticated IT departments that want to equip with off-the-shelf monitors and archives.”

Horii identifies the IHE initiative’s greatest milestones of the past 18 months as innovations including the performed-procedure step and modality worklists. “In nonintegrated systems, the equivalent interactions between the technologist and the information systems often take 3 to 5 minutes per technologist per patient. If you multiply such times out by examination volume, the amount of time saved per day per technologist is easily enough to add one or more patients to that technologist’s workload without increasing the length of the workday.”

Channin adds that successful deployments of IHE-compliant systems in heterogeneous, multi-vendor environments are just beginning. “It takes longer for these technologies to be deployed in modalities as these systems often need major upgrades even for incremental software [IHE] improvements,” he says. “This means that IHE rollout will be slow and steady for years to come.”

Dreyer is enthusiastic about IHE developments to come: “I’m most excited about Year 5 and Year 6 IHE efforts that are leading to the ability to actually separate the applications so we can get a wider breadth of hardware and software solution providers. This will result in better competition and from that will come better product and lower costs.”


Naturally, as PACS technology continues to evolve, there will be caveats for implementers. “It helps to always bear in mind that the technology is changing at an incredibly rapid rate,” says Reiner. “So, every set of variables an implementer is considering now will almost certainly be obsolete within just a couple of years.”

How does one stay flexible enough to respond to technology and application advances? “Strive for scalability,” Reiner answers. “The archive is a great place to address this, because the price of storage has dropped exponentially over the years. Procure just enough storage to last 1 or 2 years. That way, every time you need to add on, you can do so at a fraction of what that same amount of storage cost you a year earlier.

“As the storage technology itself changes, add on newer types of media. Assume that your storage was entirely on disc and that became antiquated. You would not be locked into the discs; now you can do spinning media, whether it be RAID or DVD. The idea is that you’re scalable so you can add on and achieve economies of scale, but also if you add on with the newer technologies, you’re not penalized by locking in to older technology.”
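The economics behind Reiner’s “procure just enough storage” advice can be sketched with simple compounding. The starting price, the annual price decline, and the annual capacity growth below are all assumed illustrative values.

```python
# Sketch of the incremental-purchasing argument: buying each year's
# storage at that year's (lower) price beats buying everything upfront.
# All three constants are assumptions for illustration only.

PRICE_PER_TB_YEAR0 = 1000.0   # assumed starting price, $ per terabyte
ANNUAL_PRICE_DROP = 0.30      # assumed 30% price decline per year
TB_NEEDED_PER_YEAR = 10       # assumed annual growth in archive size

def cost_buy_upfront(years: int) -> float:
    """Buy all the storage now, at today's price."""
    return years * TB_NEEDED_PER_YEAR * PRICE_PER_TB_YEAR0

def cost_buy_yearly(years: int) -> float:
    """Buy each year's increment at that year's discounted price."""
    return sum(TB_NEEDED_PER_YEAR * PRICE_PER_TB_YEAR0
               * (1 - ANNUAL_PRICE_DROP) ** y
               for y in range(years))

print(f"Upfront for 5 years: ${cost_buy_upfront(5):,.0f}")
print(f"Yearly for 5 years:  ${cost_buy_yearly(5):,.0f}")
```

Under these assumptions, buying incrementally costs a little over half as much over five years, and it leaves the buyer free to adopt newer media as they appear.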

To stay abreast of changes and be able to develop the right strategic responses, Reiner recommends participation in user groups. “They allow you to gain a certain amount of networking capabilities,” he maintains. “In one of the user groups to which I belong, the managers frequently send out to all the members a question concerning…how each addresses a specific technical issue; then the responses are compiled and shared with everyone in the group. This is very helpful. Also, the group provides a bully pulpit from which we as a single body can voice concerns to the manufacturer, and be heard.”

Rich Smith is a contributing editor for Decisions in Axis Imaging News.