Considerable confusion surrounds radiology compliance with the security provisions of the Health Insurance Portability and Accountability Act (HIPAA). There is no such thing as 100% security. Nonetheless, it is possible to clarify the situation and attain compliance by taking a series of concrete steps.

The first of these is to separate reality from hyperbole. What are the genuine risks to radiology systems? A realistic estimate begins with the organization's own history of information-security incidents. In the absence of such incidents, it is worth remembering that radiology images mean little to the average lay intruder.

So how can you determine the real risks to your radiology information system (RIS)? Statistical data on hacker attacks are available from many sources, but these data cover all industries and are not specific to health care. The industries most often targeted, such as banking and government, attract attacks to a degree unlikely to be matched in radiology, where, apart from a targeted hack against a celebrity or other VIP, little motive exists beyond the thrill of gaining unauthorized access to data. In addition, statistics can include only those security breaches that have been reported; in health care, it is probable that at least some incidents have never been disclosed publicly.

How much security does an organization really need to implement in order to comply with HIPAA? The rule itself requires what it calls “reasonable and appropriate” measures to be taken. Unfortunately, it is difficult to attain consensus concerning what constitutes due diligence, so a type of herd mentality has evolved. Each organization attempts to stay with the pack, neither leading nor lagging in adopting security measures. A balanced security approach, in which the organization keeps up with its peers, provides due diligence without impeding health care.

Most radiology practices would probably agree that providing patients with high-quality health care includes protecting their confidential information. The degree to which patients trust a health care provider to maintain confidentiality can affect both the cost and quality of care. For example, a patient may need to undergo additional diagnostic testing if he or she is unwilling to admit a health problem or habit that may cause embarrassment (or affect employment/insurance status). Doing a good job of protecting patient privacy as an industry could engender more widespread trust among patients, perhaps reducing the treatment delays and excess diagnosis costs attributable to withholding important information from clinicians.


Administrative safeguards constitute 55% of the security standards, with 12 required actions and 11 addressable concerns. Physical safeguards are the subject of 24% (four required and six addressable), and only 21% of the standards cover technical safeguards (four required and five addressable).

In meeting standards that contain addressable implementation specifications, a covered entity will ultimately do one of the following: implement one or more of the specifications, implement one or more alternative security measures, implement a combination of both, or implement neither an addressable specification nor an alternative measure. The last option may be feasible only for very small practices (such as one physician with two staff members), in which it would be absurd to ask staff to show identification or fill out a form before being granted computer access. In many cases, though, the paperwork required to forgo an addressable security measure is, in daily practice, more cumbersome than the measure itself, so it is easier simply to comply.

For example, under the “Security Awareness and Training” standard, the implementation specification “§164.308(a)(5)(ii)(C) Log-in monitoring (Addressable)” could be addressed by teaching users to properly log off a system when it is not in use and not to share their passwords with others (see box “Implementation Specification” below).

Implementation Specification

Guidance: §164.306(d)(3)
This provision explains how a covered entity can assess its compliance with the security standard.
   (i) Assess whether each implementation specification is a reasonable and appropriate safeguard in its environment, when analyzed with reference to the likely contribution to protecting the entity’s electronic protected health information; and
   (ii) As applicable to the entity–
      (A) Implement the implementation specification if reasonable and appropriate; or
      (B) If implementing the specification is not reasonable and appropriate–
         (1) Document why it would not be reasonable and appropriate to implement the implementation specification; and
         (2) Implement an equivalent alternative measure if reasonable and appropriate.

Covered entities must assess whether an implementation specification is reasonable and appropriate. The necessary decisions will be affected by size (from solo practice to integrated health care delivery system) and by office configuration or physical setting. No cookie-cutter solutions apply, so compliance is not as easy as buying prefabricated policies and inserting the organization’s name as prompted. Determining the appropriate means of compliance through risk analysis gives organizations great flexibility in HIPAA compliance. It does not imply, however, that they have been given complete discretion to make their own rules. Instead, they must determine their true risks and decide whether to remedy, transfer, or accept them.
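Expressed as a decision procedure, the assessment in §164.306(d)(3) reduces to two judgment calls per addressable specification. The sketch below only encodes the branching; the inputs are determinations the covered entity makes during risk analysis, and the action strings are illustrative paraphrases, not regulatory language.

```python
def addressable_spec_actions(reasonable_and_appropriate: bool,
                             alternative_reasonable: bool) -> list[str]:
    """Return the actions required for one addressable implementation
    specification, per the decision flow of 45 CFR 164.306(d)(3)."""
    if reasonable_and_appropriate:
        # (ii)(A): implement the specification as written
        return ["implement the specification"]
    # (ii)(B)(1): always document the decision not to implement
    actions = ["document why the specification is not reasonable and appropriate"]
    if alternative_reasonable:
        # (ii)(B)(2): adopt an equivalent alternative measure
        actions.append("implement an equivalent alternative measure")
    return actions

print(addressable_spec_actions(False, True))
```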

Under HIPAA, the privacy rule applies to all protected health information, whether in paper, oral, or electronic forms. The security rule applies only to electronic protected health information. Organizations should bear in mind, however, that information is worth protecting, regardless of format. In addition, most information on paper in a health care setting has been generated by computer, so it still constitutes electronic information at some level. The security rule is also likely to be expanded in the future to cover nonelectronic information, so it makes good business sense to plan for all information formats at one time.

Security standards extend to the members of a covered entity’s workforce even if they work from home, as many coders and transcriptionists do. They must be trained to use the same rules that apply to protecting information in the parent organization’s offices.


Risk can be mitigated (by applying controls), transferred (by insuring against a loss), or accepted (by doing nothing, but recognizing the risk). Risk should be handled in a cost-effective manner relative to the value of the asset concerned.

  • What assets need protection?
  • What are the possible threats to them?
  • What are the vulnerabilities that can be exploited by the threats?
  • What is the probability or likelihood of a threat exploiting a vulnerability?
  • What would the impact of that exploitation be?

Key questions to be answered during risk analysis.

Under HIPAA, each organization assesses its own security risks; determines its risk tolerance or risk aversion; devises, implements, and maintains appropriate security to address its business requirements; and documents its security decisions. The key questions to be answered during risk analysis are listed above.
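One way to make those questions concrete is to score each threat as likelihood times impact and compare the resulting exposure with the cost of a control. The sketch below is a hypothetical illustration: the thresholds and dollar figures are invented for the example, not drawn from HIPAA.

```python
def risk_decision(asset_value: float, likelihood: float,
                  impact: float, control_cost: float) -> str:
    """Suggest how to handle a risk: mitigate, transfer, or accept.
    likelihood and impact are fractions in [0, 1]; values are dollars.
    The decision thresholds here are illustrative assumptions."""
    exposure = likelihood * impact * asset_value   # expected loss
    if control_cost < exposure:
        return "mitigate"   # the control costs less than the expected loss
    if exposure > asset_value * 0.5:
        return "transfer"   # control too costly; insure against the loss
    return "accept"         # document the risk and do nothing further

# Example: a RIS server worth $50,000, a 10% annual chance of a breach
# causing an 80% loss, and a $2,000 control available.
print(risk_decision(50_000, 0.10, 0.80, 2_000))  # mitigate
```

In each branch, the output is only a starting point for the documented decision the rule requires; the numbers feeding it come from the five questions above.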

The purposes of security controls are prevention, assurance, and recovery. Controls themselves can be technical or nontechnical. Examples of technical measures are access control, audit control, and encryption; nontechnical measures consist of such activities as creating sound policies, procedures, and plans, and training users. When selecting security controls, you must consider the technology environment, the organization's culture with respect to heightened security, and the balance between security measures and ease of use. The budget for technical and nontechnical steps should be based on the cost of each security control relative to the value of the information being protected.


Although passwords are not actually required by HIPAA, they are the most common method of user authentication today. Passwords are changed in order to prevent unauthorized access to systems, but if they are changed too often, security may actually be compromised. The user whose password changes frequently is likely, sooner or later, to write it on an adhesive note and put it on the monitor (or under the keyboard, under the mouse pad, or in the top desk drawer).

Good, strong business practices must accompany password use. When an employee leaves, his or her password must be removed from the system. Rules must be designed (and enforced) that prevent users from choosing easy-to-guess passwords or from changing them merely by adding a single character. It is better to have one good, strong password for a year or more than to rotate among three or four weak or sloppy passwords every 45 days. Organizations should make sure that passwords are six to seven characters long, difficult to guess (and not in the dictionary), and easy to remember. The use of both numbers and letters can be required for the system to accept a password, for example.
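Rules like these are typically enforced in software at the moment a password is chosen. A minimal sketch of the length, composition, and dictionary checks described above; the word list here is a hypothetical placeholder for a real dictionary file.

```python
import re

# Stand-in for a real dictionary of forbidden words (assumption for
# illustration; production systems would load a much larger list).
COMMON_WORDS = {"password", "letmein", "radiology"}

def acceptable(password: str) -> bool:
    """Check the rules described above: at least six characters,
    a mix of letters and digits, and not a dictionary word."""
    if len(password) < 6:
        return False
    if not re.search(r"[A-Za-z]", password):   # must contain a letter
        return False
    if not re.search(r"\d", password):         # must contain a digit
        return False
    if password.lower() in COMMON_WORDS:       # reject dictionary words
        return False
    return True

print(acceptable("x9k2mq7"))   # True
print(acceptable("password"))  # False: dictionary word, no digit
```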

One method used to create a complex-but-memorable password is to choose a favorite song, use the first letters of the first seven words of its lyrics, and add special characters (!@#$%&*), numbers, and/or case changes. For example, the lyric “Oh, when the saints go marching in” yields owtsgmi. Replacing vowels with numbers, adding a special character, and using uppercase letters to frame the center character produces 0wT$Gm1. The user should be able to remember this, but it would be very difficult to guess (unless the user develops the habit of whistling the song when logging in daily).
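The transformation described above can be sketched programmatically. The particular vowel-to-digit mapping, the choice of dollar sign as the special character, and which letters get uppercased are assumptions chosen to reproduce the article's example.

```python
def lyric_password(lyric: str) -> str:
    """Build a password from a song lyric, per the technique above:
    take the first letter of each of the first seven words, swap
    vowels for look-alike digits, put a special character in the
    middle, and uppercase the letters framing it."""
    words = lyric.replace(",", "").split()[:7]
    chars = [w[0].lower() for w in words]            # e.g. "owtsgmi"
    digit_for = {"a": "4", "e": "3", "i": "1", "o": "0"}
    chars = [digit_for.get(c, c) for c in chars]     # vowels -> digits
    mid = len(chars) // 2
    chars[mid] = "$"                                 # special char in the center
    chars[mid - 1] = chars[mid - 1].upper()          # frame the center...
    chars[mid + 1] = chars[mid + 1].upper()          # ...with uppercase letters
    return "".join(chars)

print(lyric_password("Oh, when the saints go marching in"))  # 0wT$Gm1
```

The same routine applied to a different lyric gives a different but equally memorable result, which is the point: the user remembers the song and the recipe, not the string itself.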

For passwords as for other security concerns, a policy alone is not sufficient to protect patient information. Some technology that holds users accountable must be implemented if security policies are to be enforced. Determining how much technology is needed, and when it should be deployed, calls for thorough analysis.

The core steps in creating a good compliance program for information security are:

  • assessing and analyzing risks;
  • developing policies and procedures to address those risks;
  • selecting and implementing cost-effective controls, countermeasures, and safeguards;
  • training workers to understand their security responsibilities;
  • managing the computing environment; and
  • auditing, monitoring, and responding to incidents.

An organization that accomplishes this much deserves to congratulate itself. Most facilities have little time and few resources available for security activities, but by focusing on these few critical tasks instead of many trivial ones, they will be able to do what is necessary.

Tom Walsh, CHS, CISSP, is president, Tom Walsh Consulting, LLC, Overland Park, Kan, and coauthor of The Handbook for HIPAA Security Implementation. This article has been adapted from HIPAA Security: Technology Challenges, which he presented at the Radiology Business Management Association’s 2004 Radiology Summit, June 7, 2004, in San Diego.