In 2003 and beyond, the specialty of radiology will change the face and the facilities of clinical medicine around the world. Whether researching new technologies, performing complex new procedures, counseling patients, coaching employees, or negotiating a new business venture, the health care organization of the future must build on the basic elements of quality, accountability, and trustworthiness. 1 The practice of breast imaging, intervention, and care will continue to evolve as an indispensable, interdisciplinary subspecialty, contributing to further reductions in morbidity and mortality from breast cancer. 2-15 As one leader in management stated, “If you don’t believe in quality…you’ll never produce it.” 16 This article describes new challenges, directions, resources, and a variety of successful case examples for improving and sustaining quality in breast imaging centers. Rapid progress in molecular medicine and other research will, we hope, soon make breast cancer prevention a reality. Until then, it remains our mission and passion as breast care professionals to optimize and advocate for uncompromising quality throughout the continuum of breast health care delivery.

Many factors and agencies will influence the direction, delivery, quality and accountability of breast health and cancer care services in this country and worldwide. Breast cancer has been identified as one of several high priorities by organizations and advocacy groups (see a list of resources in Table 1a, Table 1b, posted in the online version of this article at www.imagingeconomics.com). One of the most exciting, unique, and potentially influential of these groups is the National Quality Forum (NQF). Formed in May 1999, the NQF is a relatively young, private, nonprofit, public benefit corporation headquartered in Washington, DC. This membership-based organization successfully creates a climate of openness and inclusivity in its meetings and activities. Member groups are divided into four councils: Consumer, Provider/Health Plan, Purchaser, and Research/Quality Improvement. The NQF has recently published its National Framework for Healthcare Quality Measurement and Reporting. 17 Three purposes are outlined in the document:

  1. To provide a standardized framework for identifying voluntary health care quality consensus standards, which are endorsed by a diverse group of stakeholders concerned about health care quality.
  2. To identify strategic areas that the NQF will pursue to maximize the potential for improvement once standardized health care quality measures are available. (Of note, breast cancer is one of these strategic areas.)
  3. To set forth an NQF-endorsed, consensus-driven platform and statement of principles for health care quality improvement in the United States.

This well-researched and carefully worded consensus report outlines four guiding principles, five steps to improved national health care, five operating principles, and 17 recommendations. One conclusion (and understatement) is that “Change is possible, but it will not be easy.” 17 Another more provocative and intriguing statement appears in the last paragraph of the document (Appendix D-Consensus Development Process: Summary) and probably reflects the presence of the Centers for Medicare & Medicaid Services (CMS) on the membership roster. It states, “Once a measure set has been approved, the federal government may utilize the information for standardization purposes in accordance with the provisions of the National Technology Transfer Advancement Act of 1995 (P.L.104-113) and the Office of Management and Budget Circular A-119.” 17 Thus, the inference is that the measurement and reporting of quality performance indicators may ultimately be tied to reimbursement by major purchasers of health care.

Regarding breast imaging and breast health care, perhaps the most relevant definition of quality comes from the National Cancer Policy Board of the Institute of Medicine: “Quality Care means providing patients with appropriate services in a technically competent manner, with good communication, shared decision making, and cultural sensitivity.” 18 (See Table 1a, Table 1b.) Three critical success factors are necessary to integrate this definition of quality with the goal of decreasing morbidity and mortality from breast cancer, regardless of the setting of care or sponsoring organization. Strict attention to technical, organizational, and personal success factors (TOP) will help your team become a top performer. Several examples and practical recommendations are described to help develop and incorporate realistic guidelines, benchmarks, and report cards designed to improve quality, reduce costs, and, optimistically, improve reimbursement.

ACHIEVING EXCELLENCE

Jay R. Parikh, MD

The radiologist has become the primary diagnostician of early breast cancer and is ultimately accountable for the medical and technical quality, as well as the safety, of radiologic imaging services for patients with breast concerns. 19 In addition, specific oversight of quality determinants such as x-ray equipment, film processing, patient positioning, patient/referring physician communications, interpretive expertise, and interventional proficiency remains the primary responsibility of the supervising radiologist, as defined by the Mammography Quality Standards Act (MQSA) and others. 10,19-24 MQSA requirements must be met annually for accreditation by the Food and Drug Administration. These requirements have been previously described and can be accessed through the FDA web site (see Table 1a, Table 1b).

The cornerstone of any breast center model must be high-caliber, comprehensive, and technologically advanced breast imaging services delivered by a compassionate, competent team. Technical precision and clinical breast care go hand in hand. The palpation skills required for good clinical or correlative breast examinations by appropriate practitioners, 10 and the positioning skills acquired by mammography technologists, have a direct impact on the accuracy of physician interpretation and on the findings described in the BIRADS (Breast Imaging Reporting and Data System) classification system. 10,15,25-27 Each component of technical and clinical care is as vital as the others, and a symbiotic and logical relationship between the components is critical for the efficiency and economic success of a breast imaging center. 11

GUIDELINES DEVELOPMENT

“Quality usually goes up when management has high expectations for their staff.” 16 Under the leadership of a medical imaging director, 28 breast imaging guidelines should be instituted at each facility to define benchmarks related to the screening population, the work-up of abnormal screening mammograms, indications for diagnostic mammography, indications for breast ultrasound, and indications and appropriate methods of image-guided percutaneous needle biopsy. 29-31 Internal audits can then be used to measure adherence and assess outcomes. Several authors and organizations have demonstrated improved quality, outcomes, and cost savings resulting from the institution of guidelines in community-based breast centers, national screening programs, and managed care organizations. 29-39 (See Table 1a, Table 1b, posted at www.imagingeconomics.com.) Facility guidelines can be a powerful tool for building quality in a breast imaging center. This section describes practical tips for the establishment, incorporation, and auditing of institutional guidelines to monitor quality improvement and measure outcomes in a breast imaging center.

ESTABLISHING BENCHMARKS

How do you determine what your facility’s guidelines and benchmarks should be? The steps toward establishing institutional guidelines include:

  1. Establish a guideline development team.
  2. Review previous institutional guidelines.
  3. Review current literature.
  4. Review community practice standards.
  5. Develop new facility guidelines.

Establishing a Team. Leading a team can be challenging, but the dedication to quality and teamwork can be a great rallying force.

Typically, the guidelines development team will meet as a committee chaired by the medical imaging director. Other committee members might include radiologists with passion and expertise in breast imaging, a liaison pathologist, a physicist, a surgeon, an administrative manager, a chief technologist, and the lead quality control (QC) technologist(s). Some breast centers are also including a risk management liaison because of the increasing risk exposure for missed and delayed diagnosis of breast cancer. 31,32,40,41 Interestingly, the average cost of initial care for early-stage breast cancer is $22,000. 42 The average payment for a breast cancer physician malpractice case in one recent study was $438,047 for patients aged 20-79 years. 26

One individual, usually a well-respected and well-organized technologist, should be designated as the QC resource person for technical quality control issues. He or she should receive support and assistance from the medical imaging director, consulting physicist, and administrator. Time and space should be allocated so that all QC tasks and paperwork can be performed and documented effectively, without distraction. This technologist leadership role in QC is crucial for developing realistic guidelines and benchmarks as well as an effective and rewarding partnership with the medical and technologist staff.

Cathy Coleman, RN

Choosing the right people for the team is also important. Selection should be based not only on position but also on personality. A group does not always function as a team: a group may be thought of as a number of people working toward a common goal, whereas a team is a group whose members trust one another. Team players are critical. Forward-thinking breast imaging centers may want to consider inviting and educating payor representatives to participate in the team process as liaisons providing specific input from a purchaser perspective. This will also enlighten them about the true complexity and challenges of busy breast imaging centers striving to create quality, economy, efficiency, and a friendly environment.

The California Medical Review Institute (CMRI) has developed “The Collaboratives Project,” which is an informative 10-step program to help structure the process for making change and improving quality in any setting. (See Table 2.)

Review Previous Institutional Guidelines. It is prudent to be aware of any previous institutional guidelines before suggesting any modifications or changes. Identify and review prior informal or formal sets of guidelines or any policies/procedures resembling them, preferably before the first team meeting. Copies of current and prior guidelines should be kept on file for future reference.

Review Current Literature. Current literature-based guidelines should be reviewed and circulated to the committee. The team leader should delegate the responsibilities of literature review by category to different members of the team. The members then present their respective reviews and recommendations to the committee at sequential meetings. In this way, each member of the committee is encouraged to participate actively in the process and to learn from, respect, and trust other members of the team. Table 1a and Table 1b include the most up-to-date resources for obtaining sample guidelines, report cards, consensus statements, and other materials from leaders in quality, education, and breast care. Two particularly helpful places to start are the web sites of the American College of Radiology (ACR) and the Society of Breast Imaging.

Review Community Practice Standards. Current community practice standards should also be compiled and reviewed by the committee, similar to literature-based guidelines. The current ACR Practice Standards are an invaluable reference (Table 1a, Table 1b). If available, practice standards from similar facilities within the organization or health care system and in the neighboring communities should be requested and reviewed.

Developing Facility Guidelines. Having reviewed the literature, sample guidelines, benchmarks, and report cards, the members of the workgroup are then faced with the challenge of compiling and organizing the material to create their own document. For any residual gaps, the members should discuss individual opinions and approaches and constructively reach a consensus. Constant revision is to be expected until the final document is satisfactory to all parties. A sense of pride and accomplishment will unify the team toward achieving common goals. A timeline for periodic review and update also needs to be established.

“Opportunities are usually disguised by hard work, so most people don’t recognize them.” -Ann Landers 43

Incorporating a Final Set of Guidelines. The incorporation of a final set of guidelines or benchmarks into a facility involves a stepwise process that will require persistence and repetition. Health care and human nature are both resistant to change. Therefore, it is wise to remember the words of Henry Ford: “Whether you think you can or think you can’t, you are right.” 44 The medical imaging director should first meet with the larger breast imaging team, including clerical staff, administrators, technologists, and physicians, to introduce the guidelines and the guidelines workgroup. The rationale behind the development of the facility guidelines is presented for input and discussion. Thereafter, the topic of quality improvement (QI)/guidelines should be a regular agenda item at team meetings in the department of radiology or breast imaging. Modification of the guidelines will always be necessary over time, as new literature, internal data, operational issues, or other factors influence implementation.

CURRENT BENCHMARKS: THE AUDIT

“The difference between ordinary and extraordinary is that little extra.” 16

The Audit. The audit can be laborious and time-consuming, but it is also a rewarding and revealing process. When the breast program is first developed, an investment in prospective data collection, an audit system, computers, and appropriate staff (eg, a part-time tumor registrar) should be included in the budget. Many vendors provide systems that integrate BIRADS and interface with tumor registry or pathology systems, such as SNOMED (Systematized Nomenclature of Medicine) of the College of American Pathologists (see Table 1a, Table 1b).

Two types of databases are typically used for audits: manual and computerized. Manual or homegrown systems are relatively inexpensive, easy to learn, and flexible enough to accommodate upgrades as required by MQSA and by state regulatory agencies. These systems, however, are usually slow and tedious for calculations, and they require storage space for the accrued data.

Computerized databases are a practical alternative for high-volume breast imaging centers. They can often be linked to computerized registration and scheduling systems, enabling the data needed for auditing to be entered at the time of patient registration and/or scheduling. Calculations from the data entry fields can be readily generated and used for formal audits, internal quality assurance, peer review, and publications. This efficiency is particularly critical in high-volume breast imaging centers. Disadvantages of computerized systems include the costs of software and hardware installation and upgrades, training costs for personnel, and vulnerability to hard drive crashes and viruses. A complete list of vendors is available through the ACR web site (see Table 1a, Table 1b).
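Whichever type of system is chosen, the value of the audit depends on capturing a consistent set of data elements for every examination so that imaging findings can later be linked to pathology and follow-up. As a purely illustrative sketch, a minimal audit record might look like the following in Python; the field names are hypothetical and are not drawn from any particular vendor product or from the ACR audit forms.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AuditRecord:
    """One examination in a breast imaging audit database (illustrative fields only)."""
    patient_id: str
    exam_date: date
    exam_type: str                                # "screening" or "diagnostic"
    birads_category: int                          # BIRADS final assessment, 0-5
    recalled: bool                                # called back for additional work-up
    biopsy_recommended: bool
    biopsy_performed: bool
    pathology_malignant: Optional[bool] = None    # None if no biopsy was done
    tumor_size_cm: Optional[float] = None         # invasive tumor size, if cancer
    node_positive: Optional[bool] = None          # axillary node status, if cancer

# Example entry: a screen-detected, node-negative 0.8-cm cancer. The pathology
# and staging fields are filled in once the diagnostic work-up and surgery
# are complete.
record = AuditRecord(
    patient_id="000123",
    exam_date=date(2002, 8, 15),
    exam_type="screening",
    birads_category=0,
    recalled=True,
    biopsy_recommended=True,
    biopsy_performed=True,
    pathology_malignant=True,
    tumor_size_cm=0.8,
    node_positive=False,
)
```

Each completed examination becomes one row in such a database, and the audit statistics discussed in the next section are derived by aggregating those rows.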

In a breast imaging center, the data collected about the screening population can also be used as a springboard to develop and monitor clinical benchmarks by other members of the interdisciplinary cancer care or women’s health team. In a hospital setting, the cancer committee should be consulted in the development process of medical or patient satisfaction audits. Collaboration with other relevant departments and medical staff leadership will also foster support. For example, the lumpectomy vs mastectomy rate can be readily monitored with tumor registry data. Some authors advocate a lumpectomy rate over 60%. 37,45 The rate of specimen mammograms received for lesions surgically excised after preoperative needle localization can be evaluated. Some authors advocate that specimen mammography be performed after 100% of preoperative needle localization procedures. 41,46,47 This one specific correlative activity has major implications for patient care and risk management as well as medical education. 26,41,45,47

Table 4. This sample radiology clinical practice ‘Balanced Report Card’ from Kaiser Permanente-Denver, Colorado, was provided on August 15, 2002, courtesy of Kim A. Adcock, MD. Adcock is a radiologist and associate medical director, business management, in the Colorado Permanente Medical Group. He can be contacted at [email protected].

IMPLEMENTING THE AUDIT

Regular auditing of quality indicators, performance, and outcomes in a breast imaging center enables comparisons with both internal and external benchmarks and can be used as a teaching tool. 11,21,48-50 Some health care organizations also use the data for marketing, contracting, and business development. Table 1a and Table 1b feature an inclusive list of reference organizations committed to quality, education, and performance improvement. The ACR web site provides one of the most detailed, step-by-step educational guides for understanding and initiating a standard breast imaging audit.

Fundamental data elements that should be collected and analyzed include:

  • cancer detection rates
  • recall rates
  • positive predictive value (PPV)
  • tumor size
  • axillary node involvement

It needs to be emphasized that screening data should not be confused with diagnostic statistics. Recall rates and recommendations for biopsy will naturally be higher in the diagnostic population than in the screening population. (A computational sketch of the screening statistics follows the list below.)

  • Cancer detection rate. The overall cancer detection rate (cancers detected per 1,000 screened patients) ranges from 2 to 10 per 1,000. The prevalent cancer rate on first-time screens ranges from 6 to 10 per 1,000, and the incident cancer rate on subsequent (follow-up) screens ranges from 2 to 4 per 1,000. 50-51 If fewer than 2 cancers per 1,000 are found, the sensitivity of the program is suspect. 49
  • Recall rates. According to North American studies, recall rates from screening should be 10% or lower. 20,49,50 These can be higher in facilities that serve populations with a higher prevalence of breast cancer, eg, an older population, a previous history of breast cancer, or strong genetic risk. Aberrantly high recall rates reduce the cost-effectiveness and credibility of a screening program. 52 The individual audited recall rate should be compared with the group rate and used as an objective method to educate outliers. In Europe, the desirable recall rate for initial screening is less than 5%, and for subsequent regular screening less than 3%. 53 Not incidentally, in the United Kingdom, a radiologist must read 5,000 mammograms annually to be deemed proficient in interpretation. 7,53
  • Positive predictive value. The PPV when biopsy is recommended on the basis of mammographic findings should be documented; the accepted range is 25% to 40%. 20,54 In addition, with the use of core biopsy in a breast care program, a goal for the true-positive surgical biopsy rate is 50%, with two centers reporting open surgical biopsy-to-carcinoma ratios of 1.5:1 and 1.6:1. 22
  • Tumor size. Breast cancer mortality is directly related to tumor size at presentation. 2,3,5,55-56 Therefore, a primary goal of screening is to detect smaller, nonpalpable tumors. In several series, more than 50% of cancers detected by mammography were stage 0 or I. 2,5,8,29,37 Many series also have shown that more than 30% of screening-detected cancers were minimal cancers (ie, invasive cancers 1 cm or smaller, or ductal carcinoma in situ). 21,49,51,52 If a malignant tumor of less than 1 cm is found, a woman has a more than 90% chance of long-term survival. 3,5
  • MemorialCare of Long Beach, Calif, has used this particular quality indicator to create a report card showcasing its breast care centers to both internal and external audiences, including payors and the media. “Our guidelines also direct two board-certified doctors to check and recheck each screening mammogram. That’s what helps us achieve excellent detection rates. Altogether, our centers detected more than six cancers for every 1,000 mammograms performed. Of the cancers detected, 26.2% are less than 1 cm in size. Early and accurate diagnosis is what makes the MemorialCare Breast Center at Long Beach one of the nation’s 10 best.” 34 (See Table 1.) Of both interest and importance, the role of double-reading of screening mammograms has recently emerged as a major issue in the medical literature and lay press. 23,24,57-59 Double-reading can be carried out effectively by two independent radiologists or with computer-assisted detection. 22,34,37,45 The authors believe that leading breast centers will benefit from adopting this extra measure of care sooner rather than later.
  • Axillary node involvement. Breast cancer mortality is also proportional to axillary node involvement at the time of surgical staging. 49 Therefore, a goal of screening is to detect node-negative nonpalpable breast cancers. In several series, the rate of lymph node involvement has been less than 25% of screen-detected cancers. 21,49,52,55
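As noted above, these statistics reduce to simple arithmetic once the counts have been tallied from the audit database. The sketch below uses hypothetical counts, not data from any center cited in this article, to show how the screening metrics might be computed and compared with the benchmarks.

```python
def screening_audit_metrics(
    screens: int,               # total screening examinations in the audit period
    recalls: int,               # patients recalled for additional imaging
    biopsies_recommended: int,  # biopsies recommended on the basis of imaging
    cancers: int,               # biopsy-proven cancers among those recommended
    minimal_cancers: int,       # invasive cancers <= 1 cm or ductal carcinoma in situ
    node_negative_cancers: int, # cancers with negative axillary nodes
) -> dict:
    """Fundamental screening audit statistics, as defined in the benchmarks above."""
    return {
        "cancer detection rate (per 1,000 screens)": 1000 * cancers / screens,
        "recall rate (%)": 100 * recalls / screens,
        "PPV, biopsy recommended (%)": 100 * cancers / biopsies_recommended,
        "minimal cancers (% of cancers)": 100 * minimal_cancers / cancers,
        "node-negative (% of cancers)": 100 * node_negative_cancers / cancers,
    }

# Hypothetical year of screening data at a mid-sized center.
metrics = screening_audit_metrics(
    screens=10_000,
    recalls=800,                # 8.0% recall rate, within the <=10% benchmark
    biopsies_recommended=150,
    cancers=45,                 # 4.5 per 1,000; PPV 30%, within the 25%-40% range
    minimal_cancers=18,         # 40% minimal cancers, above the >30% goal
    node_negative_cancers=36,   # 80% node-negative, ie, 20% node involvement (<25%)
)
for name, value in metrics.items():
    print(f"{name}: {value:.1f}")
```

Tracking these figures for each radiologist as well as for the group supports the outlier education described above.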
Table 5. Sutter Health Breast Cancer Report Card. Information gathered through the Internet at www.sutterhealth.org on August 15, 2002 and in a telephone interview with Krystin Dozier on September 12, 2002.

LEADING-EDGE EFFORTS

In addition to MemorialCare, several exemplary health care leaders and organizations have proven track records of success in defining, improving, and publishing report cards that address the spectrum of breast health and breast cancer care. Three different case examples are discussed to illustrate that dedication, focus, and hard work lead to real progress and real rewards. For more detailed information, please refer to Tables 3-5.

  • Johns Hopkins Breast Center: The Academic/Research and Patient-Driven Model.
    This high-level academic institution has earned designation as a comprehensive cancer center by the National Cancer Institute. The breast center director of outreach and education is a nurse whose background was in QI/performance improvement before she became a breast cancer survivor. Her energy and passion are mirrored in the center’s commitment to excellence in patient care. Johns Hopkins Breast Center has created report cards for virtually every component of breast care, including psychosocial care and survivorship retreats. The center has also won numerous awards, but two unique report cards get the most hits on its web site: a “patient-developed” Bill of Rights and a “patient-developed” satisfaction report card. (See Table 3a, Table 3b, Table 3c, Table 3d, Table 3e, Table 3f, Table 3g, Table 3h.)
  • Kaiser Permanente: The Physician-led HMO Model.
    Kaiser Permanente deserves its reputation as one of the leading managed care organizations in the country. In at least three regions, including Colorado, Northern California, and Southern California, the breast care staff and services within this HMO model demonstrate a sincere and measurable dedication to quality. 39 Northern California has a regional breast care coordinating committee that is physician-led but truly interdisciplinary in composition, including policy makers, researchers, and public health experts. Southern California has focused on complex clinical and operational problems, resulting in new clinical pathways to increase access, patient satisfaction, and cost-effectiveness.
    Kaiser Denver is highlighted here for the extraordinary efforts of Kim A. Adcock, MD, which led to individual and departmental changes, including a unique self-assessment tool for ongoing physician education and evaluation related to interpretive skills and patient management. He also implemented a Radiology Practice Report Card for Performance that is reproduced in its entirety as an example of how breast imaging can be prioritized within an overall radiology practice (Table 4). Kaiser Permanente, the Colorado Permanente Medical Group, and Adcock epitomize the characteristics described by futurist Paul Ellwood, MD, in his mnemonic “HEROIC,” which pinpoints the changes necessary to reform the American health care system. 1 Adcock’s attention to solving the problems associated with unacceptable variability in individual and group breast imaging practice is unprecedented, or at least unpublished if it exists anywhere else. An article in the New York Times recently cited Kaiser Denver as having the best program in the country for addressing medical education and the prevention of medical errors in mammography. Adcock was interviewed for this article on August 15, 2002.
  • Sutter Health Care Breast Cancer Project: The Community Hospital and Independent Physician Network Model.
    What began as a pioneering and somewhat courageous effort in 1996 at the Breast Center of Mills Peninsula Hospital in Burlingame, Calif, has now become a system-wide, multiyear program within Sutter Health Care, a community-based, nonprofit network of 5,000 physicians and 25-plus hospital affiliates in Northern California. The late Gale Katterhagen, MD, was determined to introduce evidence-based guidelines and to raise the bar in breast cancer care during his tenure as medical director of the cancer program and breast center. In 1998, he and colleagues published the first clinical results 29 and he was subsequently invited to showcase the program as a model for both the National Cancer Policy Board of the Institute of Medicine and the Health Care Advisory Board in Washington, DC. Early progress from 1996 to 2000 included a 25% increase in the number of patients diagnosed with Stage 0 breast cancer, and an increase in the use of needle biopsies vs open surgical biopsies from 35% to 75%. Sutter Breast Care is renowned for the excellent physician, nurse, and radiologic technologist role models in its network of breast and cancer centers (see Table 5).
    Thomas Edison said, “If there’s a way to do it better… find it”… and these organizations did! 44

Interview with Kim A. Adcock, MD

An article by Michael Moss in the New York Times 63 cited Kaiser Denver and Kim A. Adcock, MD, as having one of the exemplary radiology departments in the United States with respect to medical education and the prevention of errors in mammography. For this article in Decisions in Axis Imaging News, Adcock was interviewed on August 15, 2002, and asked about quality and leadership.

Adcock: “The rubber-road interface in mammography screening is radiologist interpretive skills. One of the many measures we follow is the proportion of new ‘probably benign’ diagnoses. If a radiologist shows a trend toward a high or low proportion of new ‘probably benign’ cases over time, he or she has drifting diagnostic criteria and could probably benefit from some remedial training or other intervention. This one finding is an early, powerful surrogate indicator for radiologist confidence in mammography interpretation skills. Constructive, timely feedback and measurable quality improvement, not quality ‘assurance,’ are necessary to evaluate individual physicians as well as collective group performance and patient care outcomes.

“In 1999, we implemented an independently administered self-assessment and educational tool, which is now required on a quarterly basis. Each doctor reading mammography visits a designated alternator preloaded with a range of screening cases including both normal examinations and cancers, usually with subtle findings. Prior films from the previous 2 years are also included for comparison. These teaching/testing cases are chosen annually from the previous year’s workload by one rotating radiologist from the breast group and the chief mammography technologist. Answer sheets are tallied quarterly; results are distributed and discussed so each member of the team can review individual and collective performance profiles. Two and one-half Category 1 CME credits are awarded for each of the four cycles. The results of this quality improvement exercise have been well received and contribute to a variety of metrics related to mammography interpretation that are calculated into both individual and departmental clinical practice report cards. The performance scores reflect a balanced evaluation and compensation process (see Table 4).

“I am not sure what my leadership style is…depending on the issue, sometimes you have to be more authoritative and other times more collaborative. Fortunately, Kaiser Permanente has created an organizational culture that supports physicians in challenging the status quo to improve quality and patient care. I remain concerned that my gravestone will probably read: ‘the price of a tyrant’s victory is eternal vigilance.’”

Kim A. Adcock, MD, is a radiologist and associate medical director, business management, in the Colorado Permanente Medical Group. He can be reached at [email protected]

INSPIRING ACCOUNTABILITY

“In the middle of difficulty lies opportunity.” -Albert Einstein 44

As stated earlier, benchmarking and publishing outcome results will become increasingly important for enhancing credibility and accountability in American health care, as they have been for decades in American business. Ellwood, 1 Kizer, 17 and other leaders in the fields of quality improvement, health policy, and strategy make a compelling case, based on cost alone, for addressing the variation in outcomes, medical errors, fragmentation, dysfunction, and inefficiency. According to a study by the Midwest Business Group on Health, poor quality in health care costs the typical employer an estimated $1,700 to $2,000 per covered employee per year, roughly one third of the total health care cost for each employee. 60

There are other reasons to focus on individual and organizational outcomes: patient safety, service excellence, public health, continuing education, marketing, participation in research studies, preferential contracting, and risk management. Two non-health care tools have application here. The federal government has a web site devoted to sharing best practices related to contracting with vendor partners to enhance accountability, productivity, and economy (see DOD, Table 1a, Table 1b). Another simple and comprehensive tool, a report card to measure the health of business alliances, has been eloquently described in a recent report by Bamford and Ernst (see Table 6). This report card defines four dimensions of “performance fitness” that, if applied to health care, might have an immediate impact on business and contracting relationships. Within the Colorado Permanente Medical Group, Adcock has created a very similar and balanced approach to the management of his strategic business unit, the Department of Radiology. In these models, both individual and collective behavior and outcomes contribute to personal and departmental evaluation and compensation (Table 4). In 1995, Kleinke predicted a “medical industrial revolution.” 61 Maybe it is here.

A FEEDBACK MECHANISM

One major challenge in counseling and educating employees relates to providing feedback for improvement. According to a recent paper by Manzoni in the Harvard Business Review, there are better ways of delivering bad news. “Critiquing weak performance is a job nobody likes. But by taking a more open approach, you can be a better boss, and get a lot more from your team.” 62
Manzoni suggests that making feedback more acceptable has three requirements:

  1. The person offering the feedback is reliable and has good intentions.
  2. The feedback development process is fair.
  3. The feedback communication process is fair.

A very real case in point was illustrated recently in a New York Times series by Michael Moss on mammography and medical errors. 24,63,64 An exemplary, multiyear program at Kaiser Permanente in Denver was created in 1998 and led by chief radiologist Adcock. Faced with unacceptable variation among radiologists reading mammograms in the department, he conducted an unprecedented retrospective audit to find false-negative examinations and to quantify fundamental measures of interpretive expertise and patient management. His leadership skills were tested: at the risk of disharmony, defiance, and increased risk exposure, he rigorously evaluated the individual and group performance of the department, including his own. The ultimate outcome was retraining for all and dismissal of some, but also a new respect for the process of individual and organizational self-examination. He credits the open, progressive climate within his managed care organization with giving him the authority and accountability he needed to implement new interventions for better breast imaging services.

Other breast imaging radiologist and technologist leaders also exemplify this commitment to excellence and serve as role models by publishing and actively teaching about the trials, travails, and tricks of the trade for both older and younger generations of practicing physicians and mammography teachers. (See references 3, 8, 10, 11-13, 20, 22, 25, 53.)

ON BECOMING A CENTER LEADER

The following was written by a recently fellowship-trained breast radiologist on becoming a codirector of a breast center.

“Andy Meade, MD, and I are codirectors of the breast center. Andy is conservative, fair, approachable, no hidden agenda, and he is great with clinical/imaging components. He has also been around the block so he is able to keep me from stepping into any institutional landmines. I am fortunate to have him as a partner.

“So, how many progressive, interdisciplinary breast centers have their clinical breast radiologists leading the ship? I advocate the codirectors as team leaders and builders, and do not allow it to be a divisive role. Our role as codirectors is as much to facilitate as it is to lead and foster teamwork. It is important to try to build consensus because all members of the team must have ownership in the operation. I doubt many in our region even realize the level of quality and care we offer to women in terms of early detection, diagnosis, and appropriate (advanced) care. Over time, we only continue to get better and better. We have the right mix with our physicians and administrative support staff. I could not have landed in a better position.”

-Richard L. Ellis, MD, Gundersen Lutheran Medical Center, codirector, Center for Breast Care, La Crosse, Wis (September 2002 via email).

CONCLUSION

The commitment to building quality in a breast imaging center should begin at the birth of the center. Well-respected, credible, and congenial physician leaders in radiology and cancer care can initiate the process of developing truly interdisciplinary and patient-centered care by integrating the new strategies, approaches, and report cards described herein. Ongoing development, incorporation, and monitoring of facility guidelines, metrics, and outcomes are critical to improving quality in a breast imaging center. Internal audits, generated by manual or computerized systems, facilitate data comparisons with national benchmarks to ensure quality and accountability.

Breast imaging centers in 2003 and beyond will then be able to offer better quality of care and a better quality of life for women facing the fear or the reality of breast cancer. 65,66

“Nothing in life is to be feared. It is only to be understood.” 43 -Madame Curie

References:

  1. Ellwood PM. A cure for the common health care system. Decisions in Axis Imaging News. 2002;15(8):6.
  2. Tabar L, Vitak B, Chen HH, et al. Beyond randomized controlled trials: organized mammographic screening substantially reduces breast carcinoma mortality. Cancer. 2001;91:1724-1731.
  3. Tabar L, Dean PB, Kaufman CS, et al. A new era in the diagnosis of breast cancer. Surg Oncol Clin N Am. 2000;9:233-277.
  4. Duffy SW, Tabar L, Chen HH, et al. The impact of organized mammography service screening on breast carcinoma mortality in seven Swedish counties. Cancer. 2002;95:458-469.
  5. Cady B, Michaelson JS. The life-sparing potential of mammographic screening. Cancer. 2001;91:1699-1703.
  6. Parker RG, Leung KM, Rees KS, et al. Mammographic screening downstages breast carcinomas at time of diagnosis: a community-based experience. The Breast Journal. 1999;5(6):359-363.
  7. EUSOMA (European Society of Mastology) Position Paper: The requirements of a specialist breast unit. Eur J Cancer. 2000;36:2288-2293.
  8. Feig SA. Effect of service screening mammography on population mortality from breast carcinoma. Cancer. 2002; 95:451-457.
  9. Nass SJ, Henderson IC, Lashof JC, eds. Mammography and Beyond: Developing Technologies for the Early Detection of Breast Cancer. Washington, DC: Institute of Medicine, National Research Council; April 2001.
  10. Kopans DB. Breast Imaging. 2nd ed. Philadelphia-New York: Lippincott-Raven; 1998:189-191.
  11. Parikh JR. Components of a breast-care center. Decisions in Axis Imaging News. 2002;15(8):39-46.
  12. Piccart M, Cataliotti L, Buchanan M, et al. Brussels statement document. Eur J Cancer. 2001;37:1335-1337.
  13. Sickles EA. Breast imaging: from 1965 to the present. Radiology. 2000;215:1-16.
  14. Ioannidou-Mouzaka L, Hortobagi G, Gros D, et al. Senology-the urgent need for a specialty. The Breast Journal. 1998; 4(4):280-284.
  15. Smith RA. Achieving the fullest potential of breast cancer screening: new challenges for the coming decade. Presented at: 10th Annual National Interdisciplinary Breast Center Conference; March 8-11, 2000; Orlando, Fla. National Consortium of Breast Centers Inc.
  16. McAlindon HR. Commitment to Quality. Lombard, Ill: Great Quotations Inc; 1989.
  17. A National Framework for Healthcare Quality Measurement and Reporting. Consensus Report. Washington, DC: National Quality Forum; 2002.
  18. Hewitt M, Simone J, et al, eds. Ensuring Quality Cancer Care. Washington, DC: National Academy Press; 1999:3.
  19. Parikh JR. Breast center model puts emphasis on patients. Diagnostic Imaging. 2002;24(4 suppl):15-19.
  20. Bassett LW, Hendrick RE, Bassford TL, et al. Quality Determinants of Mammography. Clinical Practice Guideline No. 13. Rockville, Md: Agency for Health Care Policy and Research, Public Health Service, US Department of Health and Human Services; October 1994. AHCPR Publication No. 95-0632.
  21. Sickles EA. Quality assurance. How to audit your own mammography practice. Radiol Clin N Am. 1992;30:265-275.
  22. Roux S, Logan-Young W. Private practice interdisciplinary breast centers: their rationale and impact on patients, physicians, and the health care industry: a bicoastal perspective. Surg Oncol Clin N Am. 2000;9:177-198.
  23. Castleman M. Mammogram controversy-what every woman needs to know. Family Circle. May 21, 2002:87.
  24. Moss M. Spotting breast cancer: doctors are weak link. New York Times. June 27, 2002.
  25. Long SM, Miller LC, Botsco MA, et al. The Handbook of Mammography. 4th ed. Edmonton, Canada: Mammography Consulting Services Ltd. January 2000:81-29. (www.mammography.com)
  26. Breast Cancer Study. 3rd ed. Rockville, Md: Physician Insurers Association of America; 2002.
  27. Van Houten B. Stat Read. Positioning improves detection. Decisions in Axis Imaging News. 2002;15(5):8.
  28. Lee CZ. Oncopolitical issues: obstacles and options for success in a comprehensive breast center. Surg Oncol Clin N Am. 2000;9:279.
  29. Katterhagen G, Borofsky H, Berliner K. The use of evidence based guidelines to improve outcomes and drive down costs in a community based breast center. Cancer Management. 1998;3:8-14.
  30. Bassett LW. Options in breast biopsy. Decisions in Axis Imaging News. 2002; 15(5):26-32.
  31. Guthrie TH. Breast cancer litigation: an update with practice guidelines. The Breast Journal. 1999;5(3):35-339.
  32. Goodson WH III, Moore DH II. Causes of physician delay in the diagnosis of breast cancer. Arch Intern Med. 2002;162:1343-8.
  33. Ibarra J. The pathologist in breast cancer: contemporary issues in the interdisciplinary approach. Surg Oncol Clin N Am. 2000;9:295-317.
  34. Korn P. America’s top ten breast cancer centers. Self. October 1997.
  35. Margolin FR, Radovich N, Jacobs RP, et al. Improving efficiency in a breast imaging practice: a community radiologist’s experience. Seminars in Breast Disease. 2001;4:27-35.
  36. West JG, Sutherland ML, Link JS, et al. A breast cancer care report card: an assessment of performance and a pursuit of value. West J Med. 1997;166:248-252.
  37. The Advisory Board Company. Oncology Roundtable. Innovations in Breast Cancer Care. March 1999. Washington, DC.
  38. Smith RA, von Eschenbach A, Wender R, et al. American Cancer Society guidelines for the early detection of cancer: update of early detection guidelines for prostate, colorectal and endometrial cancers. CA Cancer J Clin. 2001;51:38-75.
  39. Boersma R, Mollen A. Improving breast care at the Kaiser Permanente Bellflower medical center. The Permanente Journal. 2000;4(4):43.
  40. Osuch JR, Bonham V. The timely diagnosis of breast cancer: principles of risk management for primary care providers and surgeons. Cancer. 1994;74 (1 suppl):271-278.
  41. Lagios MD. The contribution of pathology to breast cancer malpractice: common sources of potentially damaging errors in current pathology practice. Seminars in Breast Disease. 1998;1(1):15-21.
  42. Gentry C. Improving quality of care for Californians with breast cancer. White paper prepared for the California Healthcare Foundation. Available at: www.chcf.org. Accessed July 15, 2002.
  43. Great Quotations from Great Women. Lombard, Ill: Quotations Inc; 1992.
  44. The Best of Success. Lombard, Ill: Quotations Inc; 1984.
  45. Coleman C. Building quality into comprehensive breast care: a practical approach. Surg Oncol Clin N Am. 2000;9:319-338.
  46. Bavermeister DE, McClure HH. Specimen radiography: a mandatory adjunct to mammography. Am J Clin Pathol. 1973;59:782.
  47. Anderson R. Getting sued for breast cancer. The Doctor’s Advocate. Third Quarter 2001:4-8. (www.thedoctors.com)
  48. Coleman CM, Lebovic G. Organizing a comprehensive breast center. In: Harris J, Lippman ME, Morrow M, Hellman S, eds. Diseases of the Breast. Philadelphia-New York: Lippincott-Raven Publishers; 1996:963-970.
  49. Linver MN. The medical audit: statistical basis of clinical outcomes analysis. In: Diagnosis of Diseases of the Breast. London: WB Saunders Co; 1997:127-140.
  50. Bird RE, Wallace TW, Yankaskas BC. Analysis of cancers missed at screening mammography. Radiology. 1992;184:613-617.
  51. Burhenne HJ, Burhenne LW, Goldberg F, et al. Interval breast cancers in the screening mammography program of British Columbia: analysis and classification. AJR Am J Roentgenol. 1994;162:1067-1071.
  52. Bird RE. Low-cost screening mammography: report on finances and review of 21,716 cases. Radiology. 1989;171:87-90.
  53. European Commission. European Guidelines for Quality Assurance in Mammography Screening. 3rd ed. 2001:50.
  54. Kolb GR. Disease management is the future: breast cancer is the model. Surg Oncol Clin N Am. 2000;9:217-232.
  55. Burhenne LJW, Hislop TG, Burhenne HJ. The British Columbia mammography screening program: evaluation of the first 15 months. AJR Am J Roentgenol. 1992;158:45-49.
  56. Michaelson JS, Silverstein M, Wyatt J, et al. Predicting the survival of patients with breast carcinoma using tumor size. Cancer. 2002;95:713-23.
  57. Sickles EA, Wolverton DE, Dee KE. Performance parameters for screening and diagnostic mammography: specialist and general radiologists. Radiology. 2002; 224:861-869.
  58. Elmore JG, Miglioretti DL, Reisch LM, et al. Screening mammograms by community radiologists: variability in false-positive rates. J Natl Cancer Inst. 2002;94:1373-1380.
  59. Kessler LG, Andersen MR, Etzioni R. Much ado about mammography variability. J Natl Cancer Inst. 2002;94:1346-1347.
  60. Freudenheim M. Study finds inefficiency in health care. New York Times. Business Section, June 11, 2002. Available at: www.nytimes.com. Accessed June 16, 2002.
  61. Kleinke JD. Medicine’s industrial revolution. Wall Street Journal. August 21, 1995.
  62. Manzoni JF. A better way to deliver bad news. Harvard Business Review. September 2002:114-119.
  63. Moss M. Mammogram team learns from its errors. New York Times. June 28, 2002.
  64. Monsees B. The breast imaging profession: take my job, please! Opinion article/editorial in response to New York Times series on mammography June 27-28, 2002. Available at the web site of the American College of Radiology (www.acr.org/announce). Accessed July 8, 2002.
  65. Thorne SE, Harris SR, Hislop TG, Vestrup JA. The experience of waiting for diagnosis after an abnormal mammogram. The Breast Journal. 1999;5(1):42-51.
  66. Schain WS. Physician-patient communication about breast cancer: a challenge for the 1990’s. Surg Clin North Am. 1990; 70:917-36.

Cathy Coleman, RN, an oncology certified nurse, is principal, Coleman Breast Center Consultation Services, Tiburon, Calif, [email protected], (800) 474-1717, and territory manager in Northern California for a breast care device vendor.

Jay R. Parikh, MD, is medical director, Interventional Breast Imaging, Swedish Breast Care Centers/Women’s Diagnostic Imaging Centers, Swedish Medical Center, Seattle, [email protected].