HIPAA relies heavily on risk analysis in multiple contexts. For example, risk analysis plays a major role in the Breach Notification Rule under the regulations issued by the U.S. Department of Health and Human Services on January 25, 2013. This alert explains how risk analysis fits into the Security Rule.
The Security Rule requires an organizational risk analysis. The covered entity must conduct a thorough assessment of the potential risks to, and vulnerabilities of, electronic protected health information (ePHI). In a July 14, 2010 guidance document, the Office for Civil Rights explained the elements of a risk analysis, which include identifying potential threats and vulnerabilities, assessing current security measures, determining the likelihood and potential impact of each threat, assigning levels of risk, and documenting the results.
While risk analysis is a stand-alone requirement for all covered entities, it is also part of the criteria that a covered entity must use when determining specifically how to implement the Security Rule’s other requirements. A covered entity must adopt “addressable” security measures that are “reasonable and appropriate.” So how does an organization decide what is reasonable and appropriate? The Security Rule provides four factors that must be considered: (1) the size, complexity, and capabilities of the covered entity; (2) the covered entity’s technical infrastructure, hardware, and software security capabilities; (3) the costs of security measures; and (4) the probability and criticality of potential risks to ePHI.
The risk analysis approach described above fulfills the fourth factor. Therefore, every time a covered entity is presented with an addressable implementation specification, the organization may use the results of the organizational risk analysis it has already conducted. It is critical to keep in mind, however, that the other three factors are important parts of the analysis; risk analysis alone is insufficient.
Consider how a small company’s organizational risk analysis would come into play when the company is deciding whether it must implement encryption, an addressable implementation specification. Generally speaking, the company will develop a comprehensive list of risks to the ePHI it holds (e.g., theft by a workforce member, natural disaster, lost laptop, data entry error). Assume that two items on that list are “risk of lost portable device” and “risk of misdirected electronic message,” both common problems. If the company determined that these two risks were fairly likely to occur and that their impact would be fairly harmful, those determinations would move the risks up the remediation priority list. The company would rank all of its risks the same way (some high, others lower) and document the analysis and results. The organization-wide risk analysis requirement would then be complete (though the analysis would need to be updated periodically).
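The ranking process described above can be sketched in a few lines of code. This is an illustrative sketch only: the specific risk items, the 1–5 scoring scales, and the likelihood-times-impact formula are assumptions for demonstration, not anything prescribed by the Security Rule or OCR guidance.

```python
# Illustrative risk ranking: score each risk as likelihood x impact,
# then sort from highest to lowest remediation priority.
# Risk items and 1-5 scales are hypothetical examples.

risks = [
    # (risk description, likelihood 1-5, impact 1-5)
    ("Lost portable device", 4, 4),
    ("Misdirected electronic message", 4, 3),
    ("Theft by a workforce member", 2, 4),
    ("Data entry error", 3, 2),
    ("Natural disaster", 1, 5),
]

# Compute a score for each risk and rank in descending order.
ranked = sorted(
    ((desc, likelihood * impact) for desc, likelihood, impact in risks),
    key=lambda item: item[1],
    reverse=True,
)

for desc, score in ranked:
    print(f"{score:2d}  {desc}")
```

Under these assumed scores, the lost-device and misdirected-message risks land at the top of the list, which is the situation the example above contemplates. A real analysis would, of course, document the reasoning behind each likelihood and impact rating, not just the numbers.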
That risk analysis can then be used to determine whether it is reasonable and appropriate to implement encryption. Assume that, due to its size, the company has limited technical infrastructure, hardware, and software security capabilities, and that the cost of encryption would be high. This is where the risk analysis comes into play. Recall that the two risks above were identified as high priority. A very small company might still have to encrypt if members of its workforce store large amounts of sensitive ePHI on laptops and then drive around town with those laptops in their cars. In other words, the risk of losing unencrypted ePHI is high, and the impact would also be high because the ePHI is sensitive. The company’s small size alone is not dispositive. If the company instead prohibits saving ePHI on laptops, the risk declines, and that reduction, together with the company’s small size, might support a decision not to implement encryption. The covered entity would then likely have satisfied its obligation to determine whether encryption is reasonable and appropriate and, if not, to document that determination and adopt an appropriate alternative security measure.