Designing systems instead of assigning blame

February 28, 2018

In the immediate aftermath of any newsworthy accident, we often hear the phrases "human error" or "user error".

Consider the recent missile alert scare in Hawaii. The idea that a single human was responsible for the accident has inherent appeal. However, this person-focused approach to error causation is at odds with a systems-based approach, in which several factors combine to create the situation that leads to an accident, and the human is just one part of that chain, usually at its end.

The appeal of the person-based approach lies in its simplicity. A sentiment like "someone was not doing their job" is beautifully simple to visualize and has crime and punishment encoded into it. Most of the time, this sentiment is also wrong.

A systems-based approach considers not only the task that a user has to perform but also the user’s capabilities, the environment, the tools available, and the organizational policies that can affect task performance. It tries to analyze not only the how but also the why behind an error, and to address the underlying issues.

James Reason, psychologist and safety expert, wrote informatively and exhaustively on the person versus system approaches to error attribution in his 2000 paper titled Human Error: Models and Management. One line in that paper has stuck with me: “...we cannot change the human condition, but we can change the conditions under which humans work.”

This statement is not an indictment of human beings, but more of a recognition that errors are to be expected in the best of systems with the most well-intentioned individuals. Being cognizant of such a possibility helps in developing safeguards that can avoid error-generating situations in the first place. 


The larger trend has been toward a systems-based approach. "User error" has been replaced by "use error". Investigating a use error helps get to the root of the problem instead of assigning blame to the user.

The ISO definition of usability is: “The effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments.” When dealing with medical devices or other high-risk systems, safety needs to be explicitly added to the usability criteria. 

One of the more common issues when considering safety for medical devices is the reliance on probability as an indicator of risk instead of the severity of the outcome. Getting long-run probabilities of use errors while developing a new medical device is not feasible, and subjective probabilities of use errors are, well, subjective. What matters when designing a medical device is not how often it will harm someone but how much it will harm someone.
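To make that concrete, here is a minimal sketch in Python. Every hazard name, severity score, and probability estimate below is invented for illustration; the point is only the ordering. Ranking the same hazard list by estimated probability pushes a rare but catastrophic use error to the bottom, while ranking by severity puts it first:

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    severity: int       # 1 = negligible ... 5 = catastrophic (hypothetical scale)
    probability: float  # subjective estimate of occurrence, 0..1

# Hypothetical hazards with made-up scores, purely for illustration.
hazards = [
    Hazard("misread dose unit", severity=5, probability=0.001),
    Hazard("confusing menu layout", severity=2, probability=0.200),
    Hazard("alarm volume set too low", severity=4, probability=0.010),
]

# Probability-first ranking buries the catastrophic hazard at the bottom.
by_probability = sorted(hazards, key=lambda h: h.probability, reverse=True)

# Severity-first ranking surfaces it immediately, however shaky the
# probability estimates are.
by_severity = sorted(hazards, key=lambda h: h.severity, reverse=True)

print("By probability:", [h.name for h in by_probability])
print("By severity:   ", [h.name for h in by_severity])
```

A probability-driven ranking depends on numbers we just admitted we cannot estimate well; a severity-driven ranking depends only on the outcome we are trying to prevent.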

I don’t think we will ever be able to completely eliminate risk from a system or a medical device. However, understanding the consequences of various risks can guide design decisions, resulting in systems and devices that are safe and effective. Systems and devices that do not cause “human error”.