​"Human error" is frequently misused by organizations to pin a major muck up onto someone responsible. But are punitive measures the only way to ensure safety and quality?

Many of us joined the rest of the world in watching the US elections, and the ensuing drama, unfold. For a moment, some wanted to "stop counting", while others demanded a recount. In certain states and counties, the leading presidential candidate was ahead by less than 1% of the votes cast. The margins were so thin that a single human error could make all the difference.

Such was the concern in Michigan, where a county briefly "flipped from red to blue" due to "a failure in properly updating the software" used for counting and tabulating the votes. While this conclusion (along with certain political voices) hinted at badly designed software, malfunctioning technology, or an election conspiracy, the culprit was really simple human error.

As local officials prepared for the elections weeks prior, a minor correction was made to reflect the latest updates to two local races. While the machines required to count these local races were updated, the rest of the machines in the county were not. Every machine counted flawlessly, but when the results were combined and tabulated, the data was misaligned.
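Purely as an illustration (the names and data structures here are hypothetical, not the actual election software), a short sketch shows how results merged by ballot position can go wrong when one machine's candidate ordering is out of date:

    # Hypothetical sketch: two machines, one updated and one not.
    # Each reports its tallies in its own candidate order.
    updated_machine = {"order": ["Candidate A", "Candidate B"], "tallies": [120, 80]}
    stale_machine = {"order": ["Candidate B", "Candidate A"], "tallies": [150, 200]}

    # Naive merge by position: every machine counted flawlessly,
    # yet the combined totals credit votes to the wrong candidates.
    combined = [u + s for u, s in zip(updated_machine["tallies"], stale_machine["tallies"])]
    print(combined)  # [270, 280] -- Candidate A is credited with B's votes

    # Merging by candidate name instead of position stays correct.
    totals = {name: 0 for name in updated_machine["order"]}
    for machine in (updated_machine, stale_machine):
        for name, count in zip(machine["order"], machine["tallies"]):
            totals[name] += count
    print(totals)  # {'Candidate A': 320, 'Candidate B': 230}

Both machines count perfectly; the error lives entirely in how their outputs are combined.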

"Human error" is frequently misused by organizations to pin a major muck up onto someone responsible, allowing for a straightforward resolution by holding him/her "accountable" (e.g. You're fired). The messier the disaster, the more legitimate the justification, or so it seems. It attempts to shift the spotlight onto a replaceable defect (the employee) and away from inner deficiencies, just as how Citigroup recently made an unnecessary $900-million transfer to Revlon's creditors and blamed it on her employee.

Incidentally, Citi's established safeguards for such errors were "manual checks" by other employees.

Given that human error is only natural, especially in error-prone environments where stakes are high, are punitive measures the only way to ensure safety and quality? Do staff feel safe knowing that bad outcomes will always carry severe consequences for their careers? Would this make an organization safer, or would such a climate discourage bad news from surfacing? Would you feel safe being a staff member, or a patient, at a hospital with a culture of fear?

Just Culture offers a framework to guide organizations, leaders, and managers in responding to incidents. While many variants of the theory exist, they all fundamentally aim to establish a fair and just work culture: one that promotes the psychological safety needed to discuss errors and mistakes openly, and that facilitates appropriate responses to protocol deviations and even reckless violations. Through a Just Culture, organizations can proactively learn and take steps to improve problematic situations.

An effective Just Culture requires the organization to evaluate a safety incident objectively, without being biased by the incident's outcome. After all, the same medication-administration error can mean life or death depending on a patient's allergies. You could even say that the more severe the aftermath, the greater the test of an organization's Just Culture.

Safety investigations should be objective, comprehensive, and transparent. No staff should ever have to undergo trial by media. Indeed, the subsequent internal review in Michigan showed that errors in the vote-counting process would have been obvious, and would have been picked up downstream by the Board of Canvassers. Even though the mistake briefly made it into the unofficial count, the error was genuine and unexpected, and the system was robust enough to prevent harm from reaching the final result.

As this incident rippled across the nation, how do you think the clerical staff involved should be made to feel? How might your organization react if you were put in a similar spot? Would a Just Culture framework have helped in your hypothetical scenario?

For more information about Just Culture, check out these online resources:
Just Culture by Outcome Engenuity - https://www.outcome-eng.com/david-marx-introduces-just-culture/
Just Culture by NHS UK - https://improvement.nhs.uk/resources/just-culture-guide/
Just Culture by AHRQ - https://www.ahrq.gov/hai/cusp/videos/07a-just-culture/index.html


With over 13 years of human factors exposure, Dr Yin Shanqing, Assistant Director, Human Factors & Systems Design, KKH, has rich practical experience applying human factors principles and methodologies to healthcare research and solutions planning. Prior to KKH, he served as the nation's first healthcare human factors specialist at CGH.


We love mail! Drop us a note at lighternotes@sgh.com.sg to tell us what you like or didn’t like about this story, and what you would like to see more of in LighterNotes.