
The knowledge gap within utilities, construction, and related industries is a greater concern than ever, especially when it comes to serious injuries and exposures.  I often speak with clients about their need for trainable or experienced workers in carpentry, electrical, plumbing, welding, or HVAC work.  And if we look further, this gap certainly extends to leaders who are expected to support workers with these skills, especially leaders who have limited experience of their own.  Leaders who lack hands-on experience often lack the competencies to fully understand various jobs and the situational hazards that could lead to errors and harm-producing incidents.

We now have more tools at our disposal than ever before.  One evolving field that is attracting more interest and application is cognitive science, which addresses mental errors, cognitive slips, attentional control, and situational awareness.  There are many models and ideas floating around, but I believe our efforts should be directed toward applied work that is readily understandable for both front-line leaders and workers.  The aerospace industry, along with medical researchers, has taken the lead in this arena, especially over the past 20 years or more.  So, please let me throw some mud against the wall and see what may stick for you and your organizations.

Having a taxonomy of causal factors to categorize errors and appropriate interventions is a good starting point.  And situational awareness seems to be one causal factor that often becomes a major contributor to errors and subsequent accidents (Endsley, 1995).  I believe in parsimony when creating a practical model for use by front-line supervisors and workers.  When we do this, we can gain greater buy-in and use, though we may lose some of the enhancements and comprehensiveness of a more sophisticated model.  But I'll take buy-in and practical use any day of the week.

Poor Planning at any level will cause a host of safety-related problems.  Good planning includes materials, people, tools, equipment, and processes.  Each of these is a big part of getting a job done efficiently, effectively, and safely.

Failure to Recognize Hazards must be addressed through training and follow-up.  Pre-job briefings, task hazard analyses, and related learnings require attention and abatement of the concerns identified.

Failure to Incorporate Hazards.  When work operations become busy, workers may recognize various hazards but forget about them over the course of the day, which may lead to unabated hazards and injuries.  Mental overload may be an underlying concern that should be addressed.

Incomplete Information.  Some leaders make decisions on the floor or in the field with limited knowledge and proceed somewhat blindly.  Some hazards on a start-up job may not have been well identified; a line or job is then launched, and a serious accident follows.  The cause may be something as simple as a guard or new tool that wasn't properly installed, and appropriate training may be part of the solution.

Procedural Deviations, such as omitting an important step or task, reflect an oversight, knowledge, or execution gap that requires significant attention.  Salience in communications and a thorough understanding of procedures become increasingly important.

Failure to Predict Consequences. In some cases, workers recognize a given hazard but fail to attach meaning to what they've seen. Individuals may continue working but fail to understand how that hazard could eventually impose harm.  Examples include the movement of people and related line-of-fire risks, or an open container of low flash-point solvents that could be ignited through the use of nearby tools or equipment.

Fatigue may play a big part in physical and mental miscues that underlie many near-misses and accidents.  Time-outs, appropriate breaks, diet and exercise are all part of the solution. 

I’m fully aware of other contemporary taxonomies that are more comprehensive, but some organizations, along with their front-line leaders and workers, may not benefit from such complexity (Zhang et al., 2004).  Even more, today’s data-driven solutions provide similar models that may need to be enhanced. Our challenge, as professionals, is to provide meaningful categorizations of situational or mental errors, along with corresponding interventions and training that can lessen their impact.

Finally, today’s complex organizations, with intricate interactions among people, processes, and human-factors interfaces, may require more robust but practical strategies.  However, having a starting point to limit cognitive errors and issues, through front-line leaders and workers, is a key factor in diminishing unwanted consequences.

References

Endsley, M.R. A taxonomy of situation awareness errors. In R. Fuller, N. Johnson, and N. McDonald (Eds.), Human Factors in Aviation Operations. Aldershot, England: Avebury Ashgate Publishing Ltd.; 1995.

Zhang, J., Patel, V.L., Johnson, T.R., and Shortliffe, E.H. A cognitive taxonomy of medical errors. Journal of Biomedical Informatics 2004; 37(3): 193-204.

David J. Sarkus, MS, CSP is a speaker, consultant, author, and coach with over 30 years of experience. He has written five books and more than 100 evidence-based articles. He is president and founder of David Sarkus International, Inc., which provides a full menu of safety leadership and culture-driven services for some of the biggest and best-run organizations in the world. Please visit www.DavidSarkus.com for more information. David can also be reached at 1-800-240-4601.
