As customer service and market share become increasingly important in our competitive health care market, we must employ strategies that differentiate us from the competition. At the Massachusetts General Hospital (MGH) Department of Radiology, customer service through improved operational processes has become our primary focus. Through the development of innovative quality management principles, the MGH Radiology Department has initiated efforts to become increasingly competitive in service.

MGH Radiology recognized the need to shift its focus toward customer service for two reasons. First, substantial gains had already been made in cost reduction within radiology, so retaining customer loyalty required shifting our emphasis to our commitment to service. Second, given our reputation for clinical excellence, a new priority on customer service would position us to compete successfully against the community hospitals and freestanding imaging facilities that are typically known for their excellence in service. This new focus can enable us to increase our market share and potentially position our organization to lead in service within the marketplace.

In order to reach our organizational goal of enhancing customer service, the MGH Department of Radiology has developed, implemented, and reaped the initial rewards of a quality management improvement process called Service Metrics, which incorporates operational efficiency efforts. In this article, we highlight our process and approach, provide examples of results, and share future plans.

GETTING STARTED

Historically, the success of every quality management effort has depended on the level of employee involvement, both horizontally and vertically. The Department of Radiology's Division of Quality Management and Education (QME) therefore created an integrated Radiology Improvement Team consisting of physician advisors, operations managers, frontline technologists, and support staff.

The first step in any strategic plan is to define the goal, or mission. In this instance, our initial goal was to create a way to measure our organization's performance along the dimensions that matter most to our patients and our referring physicians. With this goal in mind, the Radiology Improvement Team developed Service Metrics, a process that allowed it to obtain data for organizational measurements focused on patient throughput. Specific principles were then designed to guide the team through the process of measuring service and operations with a focus on performance improvement. The journey of performance improvement begins with five guidelines that shape the approach; this is critical because all levels of the organization must be engaged in the process. Those guidelines are:

  • Maintain the importance of planning before measuring.
  • Do not underestimate the importance of actively involving all levels of the organization.
  • Link service improvement initiatives to the mission, along with the clinical and business objectives of the organization.
  • Do not measure for the sake of measuring.
  • Try, whenever possible, to link service improvement initiatives to revenue-enhancement or cost-reduction efforts.

Once these guiding principles were established, the Radiology Improvement Team engaged in brainstorming sessions to identify key service variables for the initial focus of effort. The original list consisted of more than 70 potential measures, which were then placed into categories that best described the variables identified. The brainstorming exercise allowed for team building and the collaboration necessary to identify all of the different variables that could potentially be measured within the department (Figure 1).

After this extensive list had been formulated, it became apparent that the Radiology Improvement Team needed a tool that would help it determine exactly where to begin. It would be unrealistic to think that an organization could simply start measuring the aforementioned variables independently of each other and of the operation. The next step was crucial to the program; this analysis step allowed the team to determine which variables would be measured and what relevance each measure had to the organization. That tool became known as the Metrics Down Selection Process (see Figure 2).

This tool forces each potential measure through two rigorous levels of screening (a minimal sketch of the filter follows the criteria below). Level One, the Macro Criteria, asks:

  1. Does the measurement reflect the mission, vision, and values of the institution and the department?
  2. Does the measure coincide with the departmental strategic and tactical goals?

Level Two, the Micro Criteria, asks:

  1. Is the measurement relevant to the referring physician and patient?
  2. Is it economically feasible to measure (how easy is it to measure, what will it take to track good data)?
  3. Can the manager and his/her team use the results of the measure to improve operations?

After completing this process of selection and prioritization, the Radiology Improvement Team decided that the starting point for its Service Metrics would be patient throughput, defined as the minutes from patient arrival to patient departure. The second Service Metrics selection was report throughput, the minutes from examination to report finalization. The third was patient satisfaction, defined as how the patient and referring physician perceive and evaluate the service they received.

IMPLEMENTATION — A TRUE STORY

The next step in the process was to determine where to implement the program and then form a team familiar with the specific operational area(s) chosen. Ambulatory Care Radiology (ACC), CT, Inpatient Radiology, Teleradiology, and Ultrasound were chosen for the implementation process because they each had high visibility, high volume, overlapping operational infrastructures (waiting rooms, for instance), revenue growth, and solid leadership. Beginning with the ACC areas, teams were formed consisting of the operations manager, technologists, Image Service Representatives (film librarians), Radiology Service Representatives (schedulers, receptionists), a Quality Management Representative, and a physician advisor. (For the purposes of this article, improvement efforts will be detailed for the area identified as ACC.)

Within each metric (in this case, patient throughput), key milestones were established in order to break down the process into measurable phases. The key milestones established by the team for patient throughput were pre-examination wait time (patient arrival to examination begin time); examination time (examination begin time to examination completion); and post-examination wait time (examination completion to patient departure). Other equally important measures were designed as well, such as appointment availability, which is how long an outpatient must wait for an appointment. Once these milestones had been developed and measured for a specific period of time, the Radiology Improvement Team was ready to operationalize the Service Metrics program.
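
Because patient throughput is simply elapsed time between recorded events, the milestone arithmetic can be shown in a few lines of code. The Python sketch below is illustrative only; the timestamps and field names are hypothetical, and the point is just to show how four recorded times decompose into the three milestone phases described above.

  from datetime import datetime

  def throughput_milestones(arrival, exam_begin, exam_end, departure):
      # Express each milestone phase, and the total, in minutes.
      minutes = lambda delta: delta.total_seconds() / 60
      return {
          "pre_exam_wait":  minutes(exam_begin - arrival),
          "exam_time":      minutes(exam_end - exam_begin),
          "post_exam_wait": minutes(departure - exam_end),
          "total":          minutes(departure - arrival),
      }

  # Hypothetical visit: arrives 9:00, examination runs 9:18-9:40, departs 10:01.
  visit = throughput_milestones(
      datetime(2002, 3, 5, 9, 0),
      datetime(2002, 3, 5, 9, 18),
      datetime(2002, 3, 5, 9, 40),
      datetime(2002, 3, 5, 10, 1),
  )
  print(visit)
  # {'pre_exam_wait': 18.0, 'exam_time': 22.0, 'post_exam_wait': 21.0, 'total': 61.0}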

Before the Service Metrics program was fully operational, however, a strategic plan was created. The following tenets are key in program implementation:

  • Staff education and training in proper service data entry, data analysis, and the identification of improvements;
  • Data input compliance tracking and remedial training;
  • Process flow chart development, complete with failure point analysis for the identification of operational bottlenecks and process flaws (Figure 3);
  • Interim standard generation;
  • Identification of process improvement focus.

Objective data from the operational flowcharting in the ACC division revealed that 42% of the patient experience consisted of waiting for films after the examination, so the Radiology Improvement Team chose to focus on post-examination wait time (Figure 4A). This, by definition, is the time from examination completion to the patient's departure from the department. Typically, it includes the time spent waiting for film to be printed and read before the patient returns to his or her referring physician. It was also determined that the average post-examination wait time was 21 minutes, which helped the team formulate two hypotheses for this lag:

  1. Poor film tracking systems;
  2. Delays in on-call radiologist arrival for providing online reading services.

SERVICE IMPROVEMENT ACTION TAKEN

Once we had zeroed in on our failure points, we initiated two improvement efforts. With regard to the poor film tracking, we instituted a color-coding system that reduced the number of lost films from an average of 20 per month to fewer than five (Figure 5).

In order to improve the response time of the online radiologist, we developed an MD Paging System that reduced wait time from an average of 13 minutes to 5 minutes. Although 13 minutes may not seem like a long time, when it is compounded over 20 or 30 hold-out readings a day (films are taken and the patient is instructed to wait for the film so he or she can take the images to the referring physician), it adds up quickly: cutting each wait from 13 minutes to 5 minutes removes roughly 160 to 240 minutes of cumulative patient waiting per day, eliminating a considerable bottleneck. Even the physicians could not dispute the overwhelming evidence that their process decreased operational efficiency and degraded the patient experience overall. Especially noteworthy is the fact that the project leader for this improvement initiative was a technologist, Tina Knowles. She chaired a team of two physicians, an Image Service Representative, and a Radiology Service Representative, proving that organizational involvement at all levels is very powerful.

REPORT THROUGHPUT PROGRESS

Report throughput in the ACC areas has also been addressed by a Service Improvement Team. By leveraging technology, the ACC division has seen not only a decrease in operating dollars but also a 40% improvement in report throughput.

Service Metrics also supported the Radiology Department's justification for implementing computed radiography (CR) in one of the ACC areas to reduce patient post-examination wait time by 60%, while also assisting a department-wide filmless initiative that has so far saved more than $500,000 in film costs. The team was able to reduce post-examination wait time (Figure 4B) but saw an increase (mean, 30 minutes) in pre-examination wait time due to the implementation of the new CR technology. Using Service Metrics, the team was able to diagnose the situation and implement improvement activities that brought pre-examination wait times back down to an average of 16-17 minutes.

A subsection of the Service Metrics team, along with physician leadership, created a new workflow in which technologists work in teams of two to move patients through the process, supported by a patient care flow coordinator who monitors the inflow and outflow of patients when the area hits peak demand. Radiology Service Representatives also began calling patients at home before their appointments to prepare them with medical information and to collect patient data, so that when patients arrived, the data was already in the information system and they were ready to be seen. These efforts not only brought pre-examination wait times down to the aforementioned 16-17 minutes; they reduced them below the original pre-CR baseline (mean, 19-20 minutes).

Finally, and perhaps most important, is the need to sustain this improvement initiative. Currently, MGH Radiology has five working Service Metrics teams, all in different stages of the improvement process. In order to track and monitor efficiency, the Radiology Improvement Team created storyboards that are posted in the respective areas and depict monthly departmental measures. The posting of throughput numbers also allows the frontline staff to see its performance through run and control charts. This form of communication reinforces the importance of sharing information and fosters collaboration at all levels of the organization. Each team meets at least once a month to determine whether operations and improvement efforts are running smoothly or need tweaking.

Sustainment is the most challenging step in the Service Metrics initiative. Celebrations are typically short-lived, since the Radiology Improvement Team is cognizant that everyone must continue to work just as hard to maintain the improvements as they did during the initial implementation period. Given the process improvement results, it is clear that the Radiology Improvement Team accomplished what it set out to do: it successfully implemented a proven quality management tool that afforded MGH the opportunity to objectively improve patient care and customer service.
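
As a brief technical aside on the run and control charts mentioned above, the sketch below shows one conventional way such chart limits can be computed; the monthly figures are invented, and an actual chart would follow whatever convention (for example, moving-range estimation for an individuals chart) the department's quality staff prefers.

  from statistics import mean, stdev

  def control_limits(monthly_means):
      # Center line and +/- 3-sigma limits for a simple control chart.
      # (Simplified: uses the sample standard deviation rather than a
      # moving-range estimate of process variation.)
      center = mean(monthly_means)
      sigma = stdev(monthly_means)
      return center, center - 3 * sigma, center + 3 * sigma

  # Hypothetical monthly mean post-examination wait times, in minutes.
  waits = [21, 19, 18, 16, 17, 15, 16, 17, 16, 15]
  center, lcl, ucl = control_limits(waits)
  flagged = [w for w in waits if not lcl <= w <= ucl]
  print(f"center={center:.1f} min, limits=({lcl:.1f}, {ucl:.1f}), out of control: {flagged}")

Points falling outside the limits, or sustained runs on one side of the center line, are the signals that prompt a team to investigate whether an improvement is eroding.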

THE NEXT STEP

As previously mentioned, celebrations of success are typically short-lived. The Radiology Improvement Team must now leverage its operational successes and develop a more comprehensive customer- and market-focused program. This new Service Metrics program is termed the Performance Enhancement Program.

The storyboards will soon have a new look, both in content and aesthetics. In addition to our patient throughput measures, the following will be included:

  • Total Patient Throughput, adding to our traditional patient throughput measure the time it takes images to reach the referring physician's desktop;
  • Physician Report Throughput, measuring the time it takes the radiologists to finalize their reports;
  • Financial Indicators, measuring and tracking costs and revenues in relation to volume;
  • Customer Satisfaction, conducting satisfaction surveys not only to discover how services are perceived, but also to learn about patient expectations and preferences;
  • Referring Physician Satisfaction, tapping into our existing annual MGH Radiology Referring Physician Satisfaction Survey program;
  • Local, Regional, and National Benchmarking, exploring and analyzing our market.

By proactively monitoring and reacting to these measures and indicators, MGH will be able not only to manage its clinical operations successfully, but also to make strategic business decisions for the future. Health care is changing, and it is incumbent on every institution, from the departmental level up to the executive level, to focus on service and excellence in quality of care.

Through the implementation of a quality management program like the Massachusetts General Hospital Department of Radiology's Performance Enhancement Program, health care organizations will be better positioned to serve their purpose while capitalizing on market opportunities. This type of program not only increases service levels; it also reinforces collaboration, shared responsibility for the patient, and the ability to become more competitive within the marketplace.

Ron Doncaster is project manager, John Couris is managing director, and Sharon Antiles and Julie Angulo are consultants, Radiology Consulting Group (RCG), of the Massachusetts General Physicians Organization.