Brian Gestring Teaches Workshop on Building Reliability in Forensic Science with National Leader in Collaborative High Reliability®

Brian Gestring has not been afraid to look outside of forensic science for solutions to the problems he has witnessed firsthand. This is how he first met K. Scott Griffith. Griffith, a world-class expert in Collaborative High Reliability® and Collaborative Just Culture®, began his career as a commercial airline pilot and rose to become Chief Safety Officer for American Airlines. It was in that role that Griffith became the principal architect of the Federal Aviation Administration’s Aviation Safety Action Program, a program that has dramatically increased commercial aviation safety in the United States since it was introduced.

Gestring had long been interested in how industries like aviation and nuclear power prevent failures, and he first met Griffith after hearing him present to another industry. They soon became fast friends and collaborators. Gestring convinced Griffith to give the keynote address at the 75th annual meeting of the American Academy of Forensic Sciences, and, along with past AAFS president Laura Fulginiti, they taught a workshop for forensic professionals titled “Moving Toward High Reliability in Forensic Science.”

A Field Evolving Under Pressure

Like aviation or nuclear power, forensic science is a field that can’t afford to make mistakes.  But unlike those other fields, forensic science does not have any external oversight or regulation, making it difficult to ensure that needed standards are met and reliable work is being performed.  While there have been significant advances in forensic science over the past decades, efforts to increase reliability still depend on providers voluntarily adopting more rigorous standards.

Recently, headlines across the national news have highlighted forensic failures, raising serious questions about the reliability of the work being done. According to Gestring, these failures are rarely the result of one careless act but of a cascade of smaller problems that allowed the failure to occur.

Gestring described forensic practice as a “socio-technical system,” meaning it depends on the interaction of people, technology, and organizational culture. Understanding those interactions is the foundation for reliability.

At the workshop, Gestring, Griffith, and Fulginiti emphasized that a reliable forensic system does not depend only on new technology or stricter rules. It depends on awareness, communication, and the ability to recognize weak points before they lead to failure. 

Learning to See Risk

One of the first steps toward reliability is learning to recognize risk clearly. Every forensic scientist manages risk, whether in sample handling, data interpretation, or report writing. But not all risks are visible. Some hide in routine tasks, assumptions, or the way a team communicates.

Gestring explained that many professionals believe they are managing risk effectively because their processes have worked well so far. The problem with this reasoning is that success can mask danger. If you text while driving and never have an accident, the unsafe behavior is reinforced, yet the risk of the action never changes; it simply goes unaddressed until it results in failure.

Even a failure may not change behavior if its consequences are minor. If someone texting behind the wheel narrowly avoids a crash, the close call rarely alters the underlying habit or deepens their appreciation of the risk. Gestring described both situations as expressions of something known as outcome bias: judging the riskiness of a behavior by its result rather than by the behavior itself.

This type of reasoning is all around us. Gestring pointed to how our society reacts to drunk driving. Two people might both drive under the influence, but if one is caught in a routine stop and the other causes a fatal crash, the responses are completely different, even though the behavior carried the same risk in both cases. The point, he said, is that risk must be addressed before an outcome exposes it.

Understanding Risk in Practice

Recognizing that a risk exists is only the beginning. Understanding why it exists and how it could unfold is just as important. The workshop discussed how perception influences risk management. What one person sees as a small problem, another may view as serious, depending on their experience and role. A technician who has seen equipment malfunctions may worry about maintenance schedules, while a manager may focus on broader organizational policies. When these viewpoints are not aligned, small risks can grow unnoticed.

They also reminded the audience that culture influences risk. In some forensic organizations, staff might hesitate to raise concerns because they fear being blamed. In others, repeated exposure to routine hazards can make teams numb to potential danger. Both situations increase the likelihood of mistakes.

To make their point, the presenters drew parallels to the early stages of the COVID-19 pandemic. Authorities were aware of the new virus but did not yet understand its behavior compared to past outbreaks. Awareness alone was not enough; understanding came only after careful study and the passage of time. Forensic organizations, they said, face the same challenge when dealing with risk. They must go beyond awareness to grasp the full scope of what could go wrong and how.

Managing Risk Through Reliability and Resilience

Once risk is recognized and understood, it must be managed. In forensic science, eliminating risk completely is not realistic. The goal is to create systems that reduce the likelihood of failure and recover quickly when problems occur.

Gestring and his colleagues introduced two key ideas: reliability and resilience. Reliability ensures that processes consistently produce accurate results. Resilience allows a system to recover from an error without long-term harm. The combination of these two qualities creates an organization capable of withstanding unexpected challenges.

To illustrate this, the presenters used a medical example. Administering medication carries an unavoidable risk. To reduce errors, hospitals rely on multiple safeguards, such as electronic prescribing, barcodes, and nurse double-checks. These steps make the system reliable. But if the wrong medication is still given, having an antidote available makes the system resilient.

The same principles apply to forensic science. Safeguards like blind verification can detect errors before results are reported, but systems must also be in place to recover quickly when an error does occur. Here, Gestring believes a customer working group is essential: forensic providers must maintain regular points of contact with every agency that uses their services. If something does go wrong, those channels allow immediate, clear communication with anyone who could have been affected.

Building a Culture of Reliability

Technology and procedures matter, but the foundation of reliability is culture.

Forensic professionals work under intense pressure where accuracy is non-negotiable. In such environments, mistakes can have professional and legal consequences. As a result, staff often avoid discussing errors for fear of blame, allowing problems to hide beneath the surface. Griffith addressed this in aviation by creating the Aviation Safety Action Program, a unique collaboration between airlines, unions, and regulators that allowed blame-free, consequence-free reporting of issues before a failure occurred.

Griffith added that industries known for safety have reporting systems built on openness. Pilots routinely report near-misses, and these reports are used to strengthen procedures. Forensic science, he said, can adopt similar methods to encourage continuous learning and accountability. Silence prevents organizations from learning and improving.

The workshop emphasized the importance of leaders making it easy for individuals to talk about problems. Supervisors and directors need to be open and honest and respond constructively when problems come up. When team members know they can report problems without fear of punishment, they become active participants in risk management instead of passive observers.

This shift requires leaders to change how they view mistakes. Rather than treating them as personal failures, they should be treated as opportunities to improve systems. Every error reveals something about the process that can be corrected.

Practical Approaches to Reliability

The workshop also provided practical steps for participants to take back to their organizations.

The speakers urged labs to conduct thorough risk assessments at every level of their work. This includes examining how technical procedures function, how staff are trained, how information flows, and how accountability is perceived within the organizational culture. By mapping these areas, organizations can better identify where problems are likely to occur.

Another important suggestion was to gather and study information on close calls and small mistakes. Many incidents never become public but can still teach important lessons. By tracking them, labs can identify patterns, such as recurring equipment issues or communication breakdowns, and address them before they worsen.

The presenters also emphasized the importance of redundancy in critical processes. Having multiple layers of review, or independent verification, can significantly reduce the likelihood of undetected errors. However, Gestring noted that these systems must be supported by people who understand their purpose. Reliability comes not from paperwork alone but from shared commitment and awareness among the entire team.

Why Reliability Matters

Forensic science’s reliability directly impacts the criminal justice system. Any single test, analysis, or report can change the outcome of a case. Breakdowns in procedure can lead to wrongful convictions, reversed verdicts, and a loss of public trust in the entire system.

Gestring reminded attendees that errors are inevitable in any human endeavor, but the way organizations respond to them determines their strength. He said that there is a pattern to how bad things happen, and there is also a science to preventing them. That idea captured the spirit of the workshop.

Looking Ahead

The workshop in Denver was more than just a conversation. The room was full of forensic professionals from every discipline, and the session was recorded so that those who could not attend in person could still benefit.

Brian Gestring’s willingness to look outside his field for solutions to forensic science problems made this unique collaboration possible. He sought out Griffith, a world-renowned expert on high-reliability systems, and has been working with him to customize solutions for forensic professionals. Gestring and Griffith continue to collaborate on ways to increase both the reliability of the work forensic providers do and the resilience of the systems they operate within.
