High radiation doses normally result in cell death, which can cause harmful effects such as skin burns, hair loss, or the induction of sterility. These deterministic effects are characterized by a threshold dose, below which the harmful effect cannot occur. Above the threshold dose, the severity of the effect increases with the radiation dose. For example, doses of about 600 rad can induce erythema, but increasing the radiation dose to 2,000 rad can result in serious skin burns.
Protecting patients from deterministic effects requires that the radiation dose delivered to any given tissue or organ be kept below the corresponding threshold dose. Since the highest dose delivered to a patient normally occurs on the skin at the point where the x-ray beam enters the patient, the entrance skin dose is the best dose descriptor for use in predicting deterministic effects. In general, deterministic effects are extremely unlikely to occur during normal radiographic, fluoroscopic, or CT examinations. Deterministic effects are a definite possibility, however, during interventional or therapeutic radiological procedures.
The overwhelming majority of patients undergoing diagnostic examinations receive a skin dose considerably lower than the threshold doses required to induce deterministic effects. At these lower radiation doses, the main risks to the patient are stochastic (random) effects: the induction of cancer and of genetic effects in the offspring of irradiated individuals. Genetic effects were deemed most important 50 years ago, but the risk of cancer is the major concern today. For stochastic effects, the radiation dose determines the probability of the effect's occurrence; the severity of any radiation-induced cancer, however, is independent of the radiation dose.
Knowledge of the cancer risk associated with radiation comes from epidemiological studies of groups exposed to radiation. Studies have been performed on the survivors of the atomic bomb attacks on Hiroshima and Nagasaki, Japan; on patients who have been exposed during diagnostic and therapeutic procedures; and on exposed workers, including radium dial painters and uranium miners. There is no doubt that high levels of radiation exposure induce cancer, but there is little reliable evidence of any cancer risk at the lower doses normally encountered during diagnostic radiological examinations. It is, therefore, important to note that quantitative radiation risks at the low doses generally encountered in radiology are based on a theoretical linear extrapolation from the demonstrated effects of radiation at higher doses.
The effective dose, expressed in mrem, is a dosimetry quantity that takes into account the dose delivered to all organs irradiated during a radiological examination, as well as the radiosensitivity of each irradiated organ. The effective dose attempts to quantify the stochastic risk of a given radiological examination, and is the best available dose descriptor for the overwhelming majority of diagnostic procedures. One major advantage of the effective dose, as a dosimetry quantity, is that it permits all diagnostic examinations to be compared using a unified scale (Table 1, page 16). Another advantage is that the radiation received in a radiological examination can be compared with natural background radiation (approximately 300 mrem per year), as well as with dose limits for human-generated radiation sources that have been set for members of the general public (currently 100 mrem per year) and for radiation workers such as radiologists (currently 5,000 mrem per year).
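Because the mrem scale may be unfamiliar, expressing an examination's effective dose as an equivalent period of natural background radiation can make it more intuitive. The short Python sketch below uses the 300 mrem per year background figure quoted above; the 10 mrem chest radiograph is an assumed illustrative value, not a figure taken from Table 1.

```python
# Convert an examination's effective dose into an equivalent period of
# natural background radiation (~300 mrem per year, per the text).
BACKGROUND_MREM_PER_YEAR = 300

def background_equivalent_days(effective_dose_mrem):
    """Days of natural background radiation delivering the same effective dose."""
    return effective_dose_mrem * 365 / BACKGROUND_MREM_PER_YEAR

# A chest radiograph at ~10 mrem is an assumed illustrative value.
print(round(background_equivalent_days(10), 1))   # 12.2 -- about 12 days of background
```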
Another advantage of the effective dose as the dose descriptor in radiology is that it permits the radiation risk to be quantified. Current estimates are that an effective dose of 1 rem (1,000 mrem) corresponds to a risk of fatal cancer of 5 in 10,000. It is important to note that there is considerable uncertainty about this risk estimate, and it is so small that it would be impossible to measure directly in any epidemiological study. It is possible, however, to analyze the benefits of radiological examination by evaluating screening mammography. If 1 million asymptomatic women undergo breast x-ray examinations, about 1,500 cancers will be detected, resulting in the saving of about 300 lives. The corresponding (theoretical) risk of the radiation exposure resulting from the examination is about five fatal cancers, which would occur decades in the future. The real benefits of screening mammography outweigh by a very large factor the (theoretical) radiation risks from the small amount of x-rays used to perform this type of radiological examination.
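The screening-mammography arithmetic above can be reproduced in a few lines. In this sketch, the 10 mrem effective dose per examination is an assumption, chosen only so that the total matches the article's estimate of about five theoretical fatal cancers; the other figures come from the text.

```python
# Arithmetic behind the screening-mammography example.
RISK_PER_REM = 5 / 10_000        # fatal-cancer risk per rem of effective dose (text estimate)

women_screened = 1_000_000
dose_rem = 10 / 1_000            # assumed effective dose per exam: 10 mrem, in rem
lives_saved = 300                # screening benefit quoted in the text

theoretical_cancers = women_screened * dose_rem * RISK_PER_REM
benefit_to_risk = lives_saved / theoretical_cancers

print(round(theoretical_cancers))   # 5 -- theoretical fatal cancers, decades in the future
print(round(benefit_to_risk))       # 60 -- real benefit outweighs theoretical risk ~60-fold
```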
For the past 100 years or so, radiographic images have been obtained by exposing film to radiation. One important consequence of this process is that the amount of blackening on a film is directly related to the amount of radiation used. If more radiation is used, the film is blacker (overexposed), and if less radiation is used, the film is lighter (underexposed). Replacing a film with a digital detector enables the radiation’s intensity to be converted into an electronic signal that is digitized and stored as a number in a computer. This process effectively decouples the amount of radiation used and the appearance (blackening) of an image. For this reason, the amount of radiation that could be used to obtain any radiographic image is, effectively, a free parameter. The important point is that the amount of radiation used to perform a study in the digital world needs to be explicitly selected by the operator, rather than being determined by the physical characteristics of a screen-film combination. Furthermore, the choice made regarding the amount of radiation used will now have a direct impact on image quality and the corresponding patient dose.
In traditional radiology, conventional film was used to capture the radiographic image, store the image, and display the image to the viewer. Digital imaging separates these three processes, thereby permitting each to be optimized individually. A digital detector captures the radiographic image, and the choice of technique is now related to the image quality and dose requirements, not the amount of blackening on a film. Digital data can be conveniently stored in compact electronic archives and retrieved at the press of a button for transmission to any interested party around the world. Displaying images on monitors enables the viewer to adjust (optimize) image display and to process the image data digitally. There is little doubt that radiology is becoming digital, and that the radiology of the 21st century will be radically different from the film-based imaging of the 20th century.
Digital imaging modalities such as computed radiography (CR) can perform conventional radiographic examinations using a wide range of radiation exposures. Whereas a conventional film typically requires an exposure of about 0.5 mR to result in satisfactory film blackening, CR can generate satisfactory images using radiation exposures of as little as 0.005 mR (100 times less) or as much as 50 mR (100 times more). Changing the amount of radiation used to perform the radiographic examination affects not the film blackening, but the degree to which the image is mottled, which limits the visibility of subtle lesions. It is important to note that, until the advent of digital radiology, the choice of the amount of radiation used in conventional radiography was, in effect, fixed; technologies such as CR offer operators choices in radiographic technique that were previously unavailable.
Images obtained using 0.005 mR will be 10 times more mottled than those obtained using 0.5 mR, and will be totally unacceptable for clinical imaging. On the other hand, images taken using 50 mR will have no perceptible mottle, and will be indistinguishable from those obtained using 10 times less radiation (5 mR). Current research efforts are directed toward establishing the appropriate amount of radiation that is required to generate an optimum radiograph. Using less than this optimum amount of radiation significantly degrades image quality and can reduce diagnostic imaging performance; using more radiation would expose the patient to unnecessary radiation.
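The quoted figures follow from the standard quantum-noise relationship: mottle scales as the inverse square root of exposure, so 100 times less radiation produces 10 times more mottle. A minimal sketch, taking 0.5 mR (a typical screen-film exposure) as the reference:

```python
import math

def relative_mottle(exposure_mr, reference_mr=0.5):
    """Relative quantum mottle versus a reference exposure.

    Mottle (noise) scales as 1/sqrt(exposure): 100 times fewer
    x-ray photons means 10 times more mottle, and vice versa.
    """
    return math.sqrt(reference_mr / exposure_mr)

print(round(relative_mottle(0.005), 3))  # 10.0 -- unacceptably mottled
print(round(relative_mottle(50), 3))     # 0.1  -- imperceptible mottle, wasted dose
```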
The advent of digital imaging modalities has been of major benefit to radiology and medicine. Nonetheless, there are problems associated with the practical application of modalities such as CT to patients who are larger or smaller than most. Figure 1 on page 41 shows x-ray transmission through the abdomens of patients ranging in size from 5 to more than 100 kg. The amount of radiation transmitted through the largest patient is less than one-hundredth of that transmitted through a newborn infant. Despite this fact, CT imaging protocols have not taken the size of the patient into account; most institutions scan adults and pediatric patients using the same techniques. This is illogical, and it results in the delivery of unnecessarily high radiation doses to infants and children. On the other hand, the scanning of unusually large adults is likely to result in CT images that are too mottled because of the much lower penetration of x-rays.
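The strong dependence of transmission on patient size follows from the exponential (Beer-Lambert) attenuation of x-rays. The sketch below uses an assumed effective attenuation coefficient and rough abdominal thicknesses purely for illustration; these are not the measured data of Figure 1.

```python
import math

MU = 0.2  # assumed effective linear attenuation coefficient for soft tissue, per cm

def transmitted_fraction(thickness_cm):
    """Beer-Lambert law: I/I0 = exp(-mu * x)."""
    return math.exp(-MU * thickness_cm)

# Rough abdominal thicknesses (cm) for a newborn and a very large adult (assumptions).
infant_cm, large_adult_cm = 10, 35

ratio = transmitted_fraction(infant_cm) / transmitted_fraction(large_adult_cm)
print(round(ratio))  # 148 -- well over 100 times more radiation penetrates the infant
```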
Table 2 shows effective doses for typical head and body CT examinations for patients of different size. These data show that doses delivered to pediatric patients are higher than those delivered to adults, which reflects the smaller mass of pediatric patients. It is also important to note that children are more radiosensitive than adults, and their risk of cancer will be correspondingly higher. It is only recently that a major effort has been made by the radiology community and CT manufacturers to rectify this state of affairs, and many institutions are implementing protocols that explicitly account for the size and age of the patient. This is a welcome step that will help to ensure that protocols for all digital imaging modalities are optimal and that patients are not being unnecessarily exposed to radiation.
Protecting the Patient
The International Commission on Radiological Protection (ICRP) is an international body that makes recommendations pertaining to the protection of radiation workers and the public from ionizing radiation. In addition, the ICRP periodically issues advice regarding the exposure of patients undergoing diagnostic and therapeutic radiological procedures. When patients are to be exposed to radiation, it is an essential requirement that such exposures be justified by a benefit to the patient undergoing the diagnostic procedure.
The radiologist responsible for the examination should not allow a procedure to be performed if the benefit to the patient is deemed to be less than the corresponding estimated patient risk. For example, a procedure would not be justified when images already exist that can answer the clinical question that prompted the referring physician to order the procedure. Likewise, imaging would be unjustified when there are equally effective tests that do not expose the patient to radiation or when the examination will not affect patient management (as can sometimes be the case for routine preoperative chest radiography).
Given that a diagnostic examination is justified by the net benefit to the patient undergoing the procedure, it is important for patient doses to be kept as low as reasonably achievable; this constitutes the ALARA principle. In effect, the ALARA principle requires that patients not be exposed to any radiation that is not required to produce an image of diagnostic quality. One example of the ALARA principle, as applied in radiology, is the use of x-ray-beam collimation to limit the irradiated tissues and organs to only those that the examination itself requires. Another example is the use of mA settings for CT that are no higher than required to keep mottle at a level that will not adversely affect the diagnostic interpretation.
Once an examination is justified and the ALARA principle has been applied, there are no formal dose limits for patients undergoing radiological examinations. Medical exposure of patients is not subject to the regulatory dose limits set for radiation workers and the general public. Implicit in this state of affairs, however, is a professional judgment by the radiologist regarding any risk to the exposed patient. Only radiologists have the training and professional responsibility required to undertake this type of risk-benefit analysis. An example of good medical practice is the requirement, at many institutions, that a radiographic examination be ordered only when there is an explicit medical need specified on the order form.
Imaging equipment has become increasingly complex, and determining how it is to be used on patients is not a trivial matter. In today’s environment, imaging protocols require input from radiologists, technologists, and manufacturers, as well as medical physicists. A leadership role should be played by the institution’s medical physicist, who is the only individual capable of integrating the perspectives of radiologists and technologists while fully understanding the capabilities of the imaging system. The manner in which digital imaging equipment is to be used to perform a given examination needs to be reviewed carefully, both to ensure that image quality is optimized and to see that patient exposure to radiation is kept to a minimum. Regulatory authorities are increasing their vigilance in this area; in New York, for example, inspectors of CT facilities have been explicitly looking for scanning protocols that take the size of the patient into account.
Diagnostic radiology has been of enormous benefit to medicine. Over the past century, major advances have been made to provide the medical community with increasingly sophisticated imaging tests. The manner in which these procedures are performed is the responsibility of the radiological community (consisting of radiologists, technologists, medical physicists, and equipment manufacturers). It is the joint responsibility of the radiological community to ensure that patients are scanned in a manner that yields optimal image quality. This requires image quality sufficient for the diagnostic task at hand, and it calls for patients to receive no more radiation exposure than is required for a satisfactory diagnosis.
Walter Huda, PhD, is a professor of radiology, Department of Radiology, State University of New York Upstate Medical University, Syracuse.