David Hirschorn, MD, applies a photometer to a monitor screen in a check of the monitor’s conformance with the DICOM Grayscale Standard Display Function.

Sure, the chest x-ray looks great on the monitor of your brand-new picture archiving and communications system (PACS) workstation, but how long will it stay that way? At what point do you need to check its performance to make sure that you are seeing what you are supposed to see, and how do you go about adjusting it? When is it time to throw it away and buy a new one? No radiologist is perfect. Everyone misses some findings among those myriads of telltale pixels that persist on tape archives more than long enough for colleagues and lawyers to examine them in the 20/20 vision of hindsight. That is bad enough. But what will you say when they ask what steps you took to assure that the display equipment you used was up to par? Perhaps the findings you missed were sufficiently apparent in the image data, but your monitor had lost its ability to display those subtle differences in attenuation. How can you know?

THE DICOM GSDF

Quality control of medical image displays does not have to be complicated. It does not have to be expensive, either. But it does require a basic understanding of image display systems. The issue is a little more complex now that there are two types of monitors, cathode ray tubes (CRTs) and liquid crystal displays (LCDs), but the basic idea is the same.

Most digital radiologic images, including radiographs and CTs, are windowed to display, at most, 256 different shades of gray at any one time because that is approximately the maximum number of just noticeable differences (JNDs) humans can perceive under the best of circumstances. It would seem that a gray-scale level of 4 should appear twice as bright as a level of 2, just as 100 should appear twice as bright as 50. It turns out, however, that the human eye is much better at seeing the difference between 100 and 101, for example, than it is at seeing the difference between 3 and 4, or 250 and 251. In all of those cases, the numeric difference between the two levels is the same amount (one), but the perceived difference is smaller at the higher and lower ends of the scale. Our eyes’ performance drops off in the very dark and very bright portions of the image, and it is our perception of the pixel values that counts, not the actual numbers themselves. Therefore, the pixel values undergo a perceptual linearization, called the Digital Imaging and Communications in Medicine (DICOM) Grayscale Standard Display Function (GSDF). This ensures that pixel values that are supposed to ascend in a linear fashion, say 6, 7, and 8, actually appear that way to the human eye despite the nonlinearity of our perception. In contrast, a computer-assisted diagnosis (CAD) algorithm would not benefit from prior application of the GSDF to an image; it would more likely obscure findings than help bring them out. It would be like putting prescription eyeglasses on someone who does not need glasses at all; they will only worsen vision.
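
To make this concrete, the standard defines the GSDF as a curve of luminance versus a JND index j, running from 1 to 1023 and spanning roughly 0.05 to 4,000 cd/m². The short Python sketch below evaluates that curve using the coefficients published in DICOM PS 3.14; the printed values are approximate.

import math

# Coefficients of the DICOM GSDF (PS 3.14): log10 of luminance is a
# rational polynomial in ln(j), where j is the JND index (1..1023).
A, B, C, D, E = -1.3011877, -2.5840191e-2, 8.0242636e-2, -1.0320229e-1, 1.3646699e-1
F, G, H, K, M = 2.8745620e-2, -2.5468404e-2, -3.1978977e-3, 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j):
    """Luminance in cd/m^2 at JND index j (1 <= j <= 1023)."""
    x = math.log(j)
    num = A + C*x + E*x**2 + G*x**3 + M*x**4
    den = 1 + B*x + D*x**2 + F*x**3 + H*x**4 + K*x**5
    return 10 ** (num / den)

# One JND step at the dark end is a tiny luminance change; one JND step
# at the bright end is a far larger one. Equal perceived steps are not
# equal luminance steps, which is why perceptual linearization is needed.
print(gsdf_luminance(1), gsdf_luminance(2))        # about 0.05 and 0.055 cd/m^2
print(gsdf_luminance(1022), gsdf_luminance(1023))  # both about 4,000 cd/m^2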

The DICOM GSDF not only improves the conspicuity of findings at the extremes of luminance, but also provides a mechanism to standardize the appearance of images on monitors of different inherent brightnesses and with different response curves. It stands to reason that if the GSDF can be used to calibrate different monitors with different characteristics, it can also be used to recalibrate the same monitor whose luminance and response curves have changed over time. This is, in fact, exactly how monitors are calibrated and recalibrated.
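
As a minimal sketch of how that standardization works (reusing gsdf_luminance from above): each monitor's black-to-white luminance range is mapped onto the JND index axis, and the 256 driving levels are then spaced evenly in JND index, so every gray step is an equal perceptual step regardless of the monitor's absolute brightness. The luminance figures in the example are illustrative only.

import bisect

# Tabulate the GSDF once so it can be inverted by lookup.
_LUM = [gsdf_luminance(j) for j in range(1, 1024)]

def jnd_index(luminance):
    """Smallest JND index whose GSDF luminance reaches the given value
    (an inverse GSDF by table lookup)."""
    return min(bisect.bisect_left(_LUM, luminance) + 1, 1023)

def gsdf_targets(l_min, l_max, levels=256):
    """Target luminances for `levels` driving levels on a monitor whose
    black and white levels are l_min and l_max (cd/m^2): the targets are
    spaced evenly in JND index, i.e., in equal perceptual steps."""
    j_lo, j_hi = jnd_index(l_min), jnd_index(l_max)
    return [gsdf_luminance(j_lo + (j_hi - j_lo) * i / (levels - 1))
            for i in range(levels)]

# Different monitors get different absolute targets, but images ascend
# through their gray levels in the same perceptual steps on both.
bright_panel = gsdf_targets(0.5, 500.0)  # illustrative medical-grade LCD
dim_panel = gsdf_targets(0.8, 120.0)     # illustrative aging CRT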

MONITOR CALIBRATION

Calibration software packages come with the DICOM GSDF built in; they know what they are trying to achieve. But no matter how sophisticated the software, one thing it cannot do without some outside help is see your screen. What the software needs to accomplish is a process of trial and error. It needs to try driving the screen at different numeric grayscale levels, measured in pixel values (0-255), and find out which luminances were actually produced, measured in nits (or candela per square meter or foot-lamberts; 1 nit = 1 cd/m² ≈ 1/3.4 fL). Since computers do not yet come with eyes, they need a device to measure that light, and they need someone to place that device on different parts of the screen.
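
In code, the measurement pass might look something like the sketch below. The photometer object and display_patch function are hypothetical stand-ins for the vendor's actual device protocol and test-pattern display; only the trial-and-error structure is the point.

def measure_response(photometer, display_patch, steps=17):
    """Drive the screen at `steps` evenly spaced driving levels (0-255)
    and record the luminance, in nits, that each one actually produces.
    `photometer` and `display_patch` are hypothetical stand-ins."""
    response = {}
    for i in range(steps):
        level = round(i * 255 / (steps - 1))
        display_patch(level)              # paint a patch under the puck
        response[level] = photometer.read_luminance()
    return response                       # e.g. {0: 0.9, 16: 1.4, ..., 255: 410.0}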

The device is called a photometer. It is the shape and size of a hockey puck, and it is attached by a wire to the computer through either a special port on the video card or the computer’s USB port. The process, directed by the calibration software, can involve 17 to 256 measurements and usually takes about 2 minutes. Once those calibration measurements are obtained, they are uploaded into a special component of that monitor’s graphics card called the lookup table (LUT) or gamma correction table. As alluded to above, the process is very much like having your vision evaluated for a new pair of glasses (better? worse? better? worse?), and the resulting calibration values are very much like the prescription you receive. The uploading of the values to the card is analogous to putting the glasses on so images are displayed through their corrective lenses.
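
Continuing the sketch, the "prescription" itself can be derived by matching each GSDF target (from gsdf_targets above) to the driving level whose measured luminance comes closest. Real calibration packages interpolate between measured points and work at higher precision; this shows only the shape of the computation.

def build_lut(response, l_min, l_max):
    """Build a 256-entry correction LUT: for each input pixel value,
    choose the driving level whose measured luminance best matches the
    GSDF target for that value. `response` comes from measure_response."""
    targets = gsdf_targets(l_min, l_max)
    measured_levels = sorted(response)
    lut = [min(measured_levels, key=lambda lv: abs(response[lv] - t))
           for t in targets]
    return lut   # uploaded to the card's lookup (gamma correction) table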

The process of calibration is usually done at installation of the monitor, and, in the case of display systems specifically targeted for the medical market, it is done at the factory. Note that calibration is a customized adjustment of the video card for a particular monitor, so in order to buy a precalibrated display system, you must buy the monitor and its video card as a pair.

Some vendors charge for the photometer itself as well as a licensing fee for the calibration software on every computer it is installed on. While the photometer usually costs $400 to $500, it can be shared among all the workstations in a given radiology department, but the licensing fees can quickly add up as the number of workstations increases. Other vendors just charge a little more for the photometer and nothing for the software; it can be downloaded onto as many workstations as you like for free. Of course, the software cannot function without the photometer, but, as noted above, a single photometer can be used on as many workstations as desired. How easily it can be transported among the various facilities of the hospital then becomes an issue.

CONFORMANCE CHECKING

There are various opinions about how often to check monitors for conformance to the GSDF, but a 3-month interval is probably reasonable. However, there is a key distinction to be made between a conformance check and a calibration. Aside from the GSDF itself, the factors that shape the initial calibration curve of a monitor are 1) the physical response of the display as a result of the manufacturing process and 2) the minimum and maximum brightness, also called the black and white levels, of the monitor. The first factor is presumed not to change significantly over time. However, the second factor is the one that does change, and causes a monitor to eventually deviate from the GSDF. Therefore, so long as the black and white levels remain constant after a calibration, the system can safely be assumed to still be calibrated. Hence, at 3-month intervals, a conformance check can be performed wherein the photometer is used to check the black and white levels of the monitor. If they have changed, there are two options. The first is to use the brightness and contrast controls of the display to try to bring them back to the way they were at calibration. If that succeeds, then the whole conformance check takes about 45 seconds, and this will usually work until the monitor has aged past its life expectancy. It is when the brightness and contrast controls can no longer return the black and white levels to their former states that one has to decide to either recalibrate to the GSDF at the new levels, or replace the monitor, either partially or completely, which is explained in more detail below.
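
A conformance check of that kind reduces to two measurements and a comparison, as in the sketch below (reusing the hypothetical photometer and display_patch stand-ins from earlier); the 10% tolerance is purely illustrative, not a figure from any standard.

def conformance_check(photometer, display_patch, cal_black, cal_white,
                      tolerance=0.10):
    """Compare today's black and white levels (cd/m^2) against the values
    recorded at calibration. If they have drifted, adjust brightness and
    contrast and re-check; if they cannot be restored, recalibrate or
    replace. The tolerance here is illustrative."""
    display_patch(0)
    black = photometer.read_luminance()
    display_patch(255)
    white = photometer.read_luminance()
    in_conformance = (abs(black - cal_black) <= tolerance * cal_black and
                      abs(white - cal_white) <= tolerance * cal_white)
    return in_conformance, black, white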

Some monitors designed for medical imaging actually have a light sensor built into the display to monitor the black and white levels. While a photometer can certainly take those measurements, the sensor can do the job just as well and without human intervention. As such, the display can monitor the levels itself on a regular basis, adjust the brightness and contrast automatically as needed, and notify the user when it can no longer maintain the black and white levels. The notification can occur by means of a red light on the frame of the screen, or even by means of an email to a system administrator. It is not so much the sparing of the 45 seconds of the conformance check that benefits the QC technician as it is the elimination of the need to visit every PACS workstation monitor every 3 months. This is especially helpful if there are many workstations distributed throughout the hospital and among distant facilities. The QC technician need only visit those monitors that can no longer automatically maintain their black and white levels.

DEGRADATION AND REPLACEMENT

CRT and LCD monitors work very differently, and as such, the manner in which they degrade and the cost of replacing the degraded components also differ. CRTs shoot an electron beam at a phosphor screen. When the electrons hit the phosphor, their energy is converted into a burst of visible light photons. Over time, the amount of light that the combination of the beam and the phosphor can produce diminishes. When that happens, the only way to restore the monitor to its original brightness is to replace the entire guts of the monitor (the cathode ray tube itself), which often costs about 80% as much as a new monitor. A flat panel LCD operates by shining a bright backlight at a screen of liquid crystals. The crystals attenuate and filter that light to produce the different intensities and colors on the screen. In the LCD, it is the backlight that dims over time, and therefore only that component need be replaced to restore the system’s original brightness. The backlight of a medical grade flat panel LCD typically costs about $500 to replace, and it lasts longer than a CRT.

As mentioned above, it is not absolutely necessary to replace a CRT tube or backlight that has dimmed beyond the ability of the brightness and contrast controls to compensate. Rather, the monitor can be recalibrated to the DICOM GSDF using the photometer. However, it will still be a dimmer monitor, capable of displaying fewer just noticeable differences. The monitor that cost extra because of its greater luminance may be so dim as to be functioning like a cheaper one. That is why, in many cases, either the tube or backlight is replaced, or the whole unit is replaced.
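
The loss is easy to quantify with the inverse GSDF from the earlier sketch: the number of JNDs a monitor can display is simply the span of JND indices between its black and white levels, so halving the peak luminance (the figures below are illustrative) visibly shrinks that span.

def jnd_count(l_min, l_max):
    """JNDs available between a monitor's black and white levels."""
    return jnd_index(l_max) - jnd_index(l_min)

print(jnd_count(0.5, 500.0))  # a new high-brightness panel (illustrative)
print(jnd_count(0.5, 250.0))  # the same panel after its backlight dims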

GRAPHICS CARDS

“Medical grade” graphics cards confer several advantages over consumer grade cards, including the ability to support higher resolutions (up to 5 megapixels, or even higher) and to provide finer calibration to the DICOM GSDF with less loss of just noticeable differences. However, they come with a medical grade price, and what is less well known about consumer grade cards is that not only can they support up to 2 megapixels, but they, too, can be calibrated to the DICOM GSDF. Almost any graphics card made in the past 3 years, and perhaps older ones as well, can be calibrated. Operating system barriers have also been removed. Windows NT strove to create a strong separation between the computer user and the low-level workings of the system components, including the graphics card. While this supposedly prevented system crashes, it interfered with the user’s ability to access the card’s LUT to calibrate the monitor. However, with Windows 2000 and XP, this issue has been resolved, and it is now relatively easy for calibration software vendors to write applications that can access the LUT.
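
For a concrete (and Windows-specific) illustration of that access, the sketch below pushes a LUT such as the one from build_lut into the card through the Win32 gamma ramp call. It is simplified; real drivers restrict how far the ramp may deviate from linear, and commercial calibration software handles those cases.

import ctypes

def upload_gamma_ramp(lut):
    """Upload a 256-entry LUT of 8-bit driving levels to the graphics
    card's gamma correction table via Win32's SetDeviceGammaRamp.
    The same ramp is used for R, G, and B, since the data are grayscale."""
    ramp = ((ctypes.c_ushort * 256) * 3)()
    for channel in range(3):
        for i, level in enumerate(lut):
            ramp[channel][i] = level * 257        # scale 8-bit to 16-bit
    hdc = ctypes.windll.user32.GetDC(None)        # primary display context
    ok = ctypes.windll.gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    ctypes.windll.user32.ReleaseDC(None, hdc)
    if not ok:
        raise OSError("the display driver rejected the gamma ramp")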

Most PACS workstations use two monitors for image viewing, which makes it tempting to buy a cheaper dual-headed consumer grade graphics card to drive them both. Unfortunately, many who tried this were unpleasantly surprised to learn that although the card outputs signals to two monitors, it has only one LUT, typically for the left monitor. Therefore, the second monitor could not be calibrated. Until recently, the cheapest dual-headed graphics cards that had two LUTs cost more than $800. There is a simple alternative: just use two single-headed cards, each of which has its own LUT and costs as little as $90. Windows 2000 and XP can seamlessly manage multiple graphics cards and monitors. However, that solution requires graphics cards with a Peripheral Component Interconnect (PCI) interface instead of the newer Accelerated Graphics Port (AGP). While most computers have multiple PCI slots, they usually have only one AGP slot. Using PCI cards is less desirable, as the personal computer market is trending toward the AGP for graphics cards because it is optimized for graphics. Over time, PCI-based graphics cards will probably become more difficult to find. Recently, however, the consumer marketplace has seen the introduction of dual-headed AGP graphics cards with two LUTs for under $400. This is a significant price shift, as one can now obtain a graphics card that can drive two monitors at resolutions up to 2 megapixels and allows calibration of both of them as well.

CONCLUSION

In sum, with some basic understanding of monitor quality control, even the only moderately technically savvy can calibrate a monitor and check its conformance periodically. The software packages offer step-by-step guidance through the process. While it may seem like a burden, it actually does not take much time, and it gives patients some objective assurance of quality control in the interpretation of their radiology studies. It also provides radiologists with an objective means of assessing the adequacy of the display systems they use. Neither radiologists nor the equipment they use is expected to be perfect, but radiologists are expected to perform due diligence in assuring the quality of their work. Monitor quality control is part of that. Moreover, it need not require the purchase of “medical grade” monitors and graphics cards in all cases. While these definitely have several advantages, including the automatic sensing of black and white levels, they are not always necessary for good monitor QC. With more hospitals and imaging centers looking to purchase their PACS workstation hardware directly from the cheaper consumer market instead of paying the high markups of PACS vendors, this point is becoming increasingly important. Monitor QC is neither too difficult nor too expensive, and it is the responsibility of all radiology departments.

The author would like to acknowledge Jerry W. Gaskill, PhD, and Allen Brown for their input to this article.

David Hirschorn, MD, is a clinical fellow in Radiology Informatics/MRI at Massachusetts General Hospital/Harvard Medical School, Boston. His email address is [email protected]