
LEARNING OBJECTIVES

Upon completion of this article, the reader will be able to:

1. Describe the Swiss Cheese model and the factors involved in errors.

2. Discuss human performance and safety using the GEMS model and describe common errors in the modes of this model.

3. Differentiate the key principles of Just Culture as it pertains to laboratory safety.

4. List and describe the factors involved in gaining trust in the reporting of safety issues.

In 1999, the healthcare world was shocked into assessing patient safety by the Institute of Medicine’s (IOM) transformative report “To Err Is Human.”1 Estimating that as many as 98,000 people die annually due to hospital medical errors, authors Kohn, Corrigan, and Donaldson started a revolutionary approach to identifying, analyzing, and resolving safety risks in healthcare. We are, after all, humans and not perfect. Gone were the days of finger-pointing and highly accusatory, severely punitive error assessment. Instead, the IOM report brought a new paradigm of thinking, focusing not on “bad people” in healthcare but on “bad systems” that need improvement so good people can make better decisions.

This concept brought forth improved models for assessing why errors happen, such as the “Swiss cheese model” defined by James T. Reason.2 Complex systems, such as laboratory medicine, often include numerous safeguards or defense mechanisms to avoid errors. However, many of these defensive layers contain weaknesses or holes that, when stacked up like slices of Swiss cheese, can unexpectedly line up to create a pathway for error. Reason’s work described both active and latent causal factors that lead to accidents, or, in healthcare, adverse patient events.

Active factors include unsafe actions that can be directly linked to an error, while latent factors can lie dormant for long periods of time before contributing. Plebani, in his article “The detection and prevention of errors in laboratory medicine,” describes the laboratory testing defense layers as well-designed procedures and processes, simplification and automation, training, supervision, and an effective lab/clinical interface. He goes on to describe the holes in the Swiss cheese as the complexity of the total testing process, behavioral and skill differences among professionals, actions outside of the laboratory’s control, staffing shortages, and the increasing complexity of test ordering and result interpretation.3

Reason’s theory points laboratory leaders to exactly where they should begin the investigation of all safety incidents: at the system. In a laboratory, that means not placing the blame for an incident on the staff. Investigations should always look first at written procedures, the physical environment where the incident occurred, and anything else that was directly involved in the event. That could mean examining engineering controls (such as a biological safety cabinet), the availability of personal protective equipment (PPE), and whether the equipment involved was functioning properly. If these steps do not reveal obvious causes of the incident, perform a risk assessment of the task or equipment involved and look again for the location of those “holes in the cheese.”

Human performance and safety

James T. Reason also studied errors by categorizing them. Through this research he developed the Generic Error Modeling System (GEMS), which describes three performance modes within which errors can occur. These modes are determined by the individual’s familiarity with the task, which in turn dictates how much attention the individual naturally pays. If an individual is very familiar with a task, their attention level is naturally low. Conversely, if their familiarity with that task is low, they naturally pay more attention to the performance of the task. In the laboratory, that sounds dangerous: the more routine the work staff perform while handling dangerous chemicals and biohazards, the less focus they bring to the tasks involving those hazards.

The GEMS model labels these three modes skill-based, rule-based, and knowledge-based. In the skill-based performance mode, a person does not consciously think about the actions being performed; they are acting from memory. Errors in this performance mode result from slips or lapses in execution due mainly to a lack of attention. Many lab injuries and exposures fit into this category.

Errors in the rule-based performance mode result from misinterpretation. The worker fails to recognize changes in a routine task and therefore does not apply the correct rule to complete the task successfully. For example, when lab employees hear an overhead fire alarm, they close the doors and listen to determine whether the fire is nearby in case evacuation is necessary. A mistake here could be deciding not to evacuate soon enough and becoming trapped in the department.

Errors in the knowledge-based mode are the result of misdiagnosis. In unfamiliar situations, laboratorians do not have, or do not recognize, all of the information needed to make an informed decision. People rely on assumptions to guide the decision-making process here, and with missing information and assumptions the chance for error is very high. Think back to the COVID-19 pandemic: many had never encountered anything like it before. What decisions were made by labs, and what incidents occurred as a direct result of incorrect assumptions?

Humans make mistakes, but that doesn’t mean there is nothing that can be done. By performing risk assessments and focusing on the lab systems first, you can create safety barriers that positively shape how the humans of the lab interact with the hazards handled every day. Putting roadblocks to errors in place (engineering controls, administrative controls, PPE, and so on) will make the lab a safer place to work.

Staffing and safety

As laboratory professionals, we have long been aware of the impact of staffing shortages within our profession. While efforts continue among our supportive professional organizations to increase awareness of laboratory medicine and recognition of the importance of our work, one of the greatest tools we have for our own survival is retention. This means that the environment in which we work and leadership’s accountability for a positive culture are paramount.

As identified in a study completed by Gallup, Inc., employees who receive recognition and praise that are authentic, personalized, and equitable have a positive impact on their teams and their organizations. The study goes on to address workplace safety, finding that if 2 out of 4 employees received recognition or praise for doing a good job in the last week, the organization could see a 22% drop in safety incidents along with a 22% decrease in absenteeism.4

Why does recognition influence safety in the workplace? People who feel appreciated behave differently and form stronger social bonds. They care about one another and want to perform their tasks safely, not because the rules mandate it but because they care about the friends they work with.4 Receiving recognition also reduces the risk of employees cutting corners. When employees feel like no one cares, they believe no one will notice if they make an unsafe decision. This is one of the holes in the Swiss cheese that can lie dormant if it is permitted to continue. As an example, who would notice if a scientist were to document controls that were not actually run? They always work, correct? No, not correct. Allowing systems to continue without a quality control check can lead to inaccurate patient values and potential patient harm.

How does a lab find these latent errors? Do you have to wait until all the holes line up and something terrible happens, or can laboratories be more proactive? In response to the IOM report, the U.S. Congress charged the Agency for Healthcare Research and Quality (AHRQ) with implementing strategies and tools to improve patient safety. One of the tools identified, adopted from military and aviation safety, was the “Just Culture.”5

Just culture in the laboratory

Recognizing that placing blame and penalizing individuals who make mistakes only led to underreporting and an inability to fix systemic failures, the AHRQ acknowledged the benefit of creating a Just Culture. According to Outcome Engenuity, a nationally recognized Just Culture training organization, a “Just Culture refers to a values-supportive system of shared accountability where organizations are accountable for the systems they have designed and for responding to the behaviors of their employees in a fair and just manner. Employees, in turn, are accountable for the quality of their choices and for reporting both their errors and system vulnerabilities.”6 This shared accountability for recognizing and acting on safety events depends on a non-punitive approach to error reporting and requires creating a workplace environment where individuals are encouraged to report mistakes and near-misses without fear of punishment.

The key principles of a Just Culture include accountability without blame, encouragement of reporting, a focus on learning and improvement, and distinguishing among behaviors.7 Distinguishing among behaviors helps sort the causes of errors into three categories: human error (an unintentional mistake); at-risk behavior (a choice or action that increases the risk for error); and reckless behavior (a conscious disregard or willful act that deviates from safety practices or procedures).8

You cannot fix what you don’t know about. Reporting systems in a Just Culture are the key to understanding the breadth of errors and identifying system malfunctions. In today’s healthcare world, there are many software applications that can help ease the burden of error data management, but if you can’t get your employees to report, there is no data to evaluate.

One of the best indicators of a strong lab safety culture is the regular reporting of “near miss” safety events. If that is happening in the lab, you can discern that real events are also being reported transparently and that there is no punishment or blame when incidents are reported to leadership. This type of culture also means that the laboratory staff are actively looking for safety issues on a routine basis and that they care about their own safety and that of their co-workers. This part of a Just Culture mindset is something for which all lab leaders should strive.

Trust in reporting safety issues

What causes barriers to error reporting in healthcare? Often, trust is an issue. It takes time for team members to understand the value of the data for improving patient care and the commitment from leadership to assess errors without blame. Trust within an organization can be examined through three types of factors that can either enhance or inhibit error reporting: organizational, team, and experience factors.9

Organizational factors include the ease and anonymity of the error reporting system as well as leadership style. Every employee can feel the tension created by an inauthentic leader who wishes to punish more than understand. Expectations for leadership include self-awareness training, relationship building, and open and honest communication.

Team factors focus on building a positive culture where the team cares for one another (as described previously) and on creating mutual understanding of the challenges faced within the team. This understanding makes reporting more acceptable. Team members might feel like they are “tattling” or that reporting others’ mistakes will get their friends in trouble. Continuous communication regarding the non-punitive nature of a Just Culture is necessary and must come from the top down.

Finally, the experience factor involves training and confidence. New graduates or team members who lack confidence in their skills often make more errors and, unfortunately, are sometimes bullied by more experienced co-workers. There should be zero tolerance for bullying in the workplace. In fact, bullying behavior is one of the latent factors identified as leading to errors, and it must be reported.

The perceived risk of team retaliation for error reporting must also be addressed by leadership, and additional training should be provided for struggling team members. Documenting issues outside of the lab is important too. There are many non-lab professionals in the healthcare setting who don’t understand the laboratory, our processes, and our regulatory requirements. Often, the lab is seen as an obstacle to patient care. We have all received the occasional phone call from someone who is frustrated and may be unprofessional toward the lab. Remember that this is usually a response from someone who is focused on their patient and under significant stress from outside forces you are unaware of. Remaining kind and professional when dealing with these calls will go a long way toward building that relationship of shared understanding. Placing a report in the error management software will also help the organization set expectations of behavior for all employees.

Conclusion

There are multiple factors that affect the laboratory’s safety culture, just as there are several system circumstances that can lead to lab accidents. Understanding human behavior and having complete knowledge of the systems in place in the department offer major advantages in establishing a culture where safety is maximized. A focus on leadership training in these areas would benefit any laboratory manager or safety professional.

Leaders who are present in their labs also make a difference. That visibility can affect behaviors and help staff understand that their employer cares about them. Creating a culture where continuous and open communication is the norm, and moving to a stage where even near-miss incidents are reported, are the hallmarks of a strong safety culture in which fewer incidents occur.

References

1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. National Academies Press; 1999.

2. Reason J. Human Error. Cambridge University Press; 1990. doi:10.1017/CBO9781139062367.

3. Plebani M. The detection and prevention of errors in laboratory medicine. Ann Clin Biochem. 2010;47(Pt 2):101-110. doi:10.1258/acb.2009.009222.

4. Gallup, Workhuman. From Praise to Profits: The Business Case for Recognition at Work. Gallup, Inc.; 2023.

5. Boysen PG 2nd. Just culture: a foundation for balanced accountability and patient safety. Ochsner J. 2013;13(3):400-406.

6. Just culture in health care. Justculture.healthcare. Accessed February 18, 2025. http://www.justculture.healthcare/.

7. Mattew M. A Just Culture: Understanding Its Roots and Role in Modern Organizations. Safety Inc. Published online 2024.

8. American Society of Health-System Pharmacists. Just Culture Toolkit. Accessed February 18, 2025. https://www.ashp.org/-/media/assets/pharmacy-practice/resource-centers/patient-safety/Just-Culture-Toolkit_-Final.pdf.

9. van Marum S, Verhoeven D, de Rooy D. The barriers and enhancers to trust in a just culture in hospital settings: a systematic review. J Patient Saf. 2022;18(7):e1067-e1075. doi:10.1097/PTS.0000000000001012.

