Web Hunter: Free resources for dealing with human error

July 12, 2006
Human error is a built-in feature we share, so let’s use something besides punishment to foster improvements. Russ Kratowicz peruses the Web and offers up free resources in his monthly column.

We all make mistakes. That’s how we learn. Mistakes, sometimes major, but always corrected after the fact, underlie all the science we exploit every day in keeping our plants operating. A dog learns the truth about its environment by experimenting with behavior and repeating the moves that produce some sort of physical or psychic reward.

Mistakes are a part of life and, often, it’s easier to get forgiveness than it is to get permission. It’s a truism that holds fairly well, except, of course, when you’re up against Mother Nature, who can be a most indifferent, unforgiving, nasty old hag. So, let’s find ways to stay out of trouble. Join me for another hop into that digital morass we call the Web in search of practical, zero-cost, noncommercial, registration-free resources focused on minimizing the effects of human error. Remember, we search the Web so you don’t have to.

It’s inevitable

A case study posted on the Web is telling. The subjects who were tested made errors, even though they had been trained in the intricacies of the simple experimental task and had at their disposal a full set of written instructions covering the proper way to perform the specific actions involved. If the conclusions of that study are credible, they suggest that we shouldn’t be too obsessive about disciplining workers for the errors they commit. If you misdirect your mouse to http://acmqueue.com and enter the phrase “human error” in the search box at the upper right of the page, you’ll be rewarded with “Coping with Human Error” by Aaron B. Brown at IBM Research. This is a four-page, somewhat philosophical article that talks about the inevitability of human error and the perfectly understandable reasons that we sometimes make a mess of things. The second page addresses schemes for preventing errors when humans are responsible for developing software, and the section on asynchronous replicas describes techniques IT professionals can use to recover from the errors that slip through anyway.
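
If you’re curious what that replica idea might look like in practice, here’s a minimal Python sketch of a time-delayed copy of a data store, in the spirit of the temporal replication Brown describes. The class name and parameters are my own illustration, not code from the article.

    import copy
    import time
    from collections import deque

    class DelayedReplica:
        """A copy of a primary store that lags by a fixed delay, so a
        human error applied to the primary can be undone before the
        replica replays it. Illustrative sketch only."""

        def __init__(self, delay_seconds=600):
            self.delay = delay_seconds
            self.primary = {}
            self.replica = {}
            self.pending = deque()  # (timestamp, key, value) not yet replayed

        def write(self, key, value):
            self.primary[key] = value
            self.pending.append((time.time(), key, value))

        def replay_due(self):
            # Apply to the replica only those writes older than the delay.
            now = time.time()
            while self.pending and now - self.pending[0][0] >= self.delay:
                _, key, value = self.pending.popleft()
                self.replica[key] = value

        def undo_recent(self):
            # Someone fat-fingered a write: discard everything not yet
            # replayed and restore the primary from the still-clean replica.
            self.pending.clear()
            self.primary = copy.deepcopy(self.replica)

The lag is the whole point: it buys a human time to notice a mistake before it becomes permanent.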

How bad is it?

It shouldn’t surprise you to learn that people can make a living out of studying our errors. Fortunately, we don’t make too many monumental errors that have fatal results. The rapid, flexible responses that human cognition permits served our early ancestors well. It was certainly one of the factors that helped us develop into the dominant species around these parts. Dr. Raymond R. Panko at the University of Hawaii believes that the study of human cognition can provide insights into ways to eliminate or mitigate the effects of human error. If he can figure out how people think, he can figure out a way to prevent us from thinking ourselves into a big, bad hole. He reports on error rates he’s culled from many studies in this field. Pay a visit to http://panko.cba.hawaii.edu/HumanErr/ for Panko’s “Human Error Website.” Be sure to read the Basic Error Rates page, which covers a range of human activity. I found the results of the Dreman and Berry study to be the most distressing, given our capitalistic structure and the narrow range of error rates listed for the other studies.
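
The numbers compound faster than intuition suggests. Assuming, purely for illustration, that each step of a procedure fails independently, a few lines of Python show how even modest per-step rates like those in Panko’s tables add up over a long job:

    # Chance of at least one error in an n-step procedure, assuming
    # (simplistically) that steps fail independently at rate p.
    def p_at_least_one_error(p_per_step, n_steps):
        return 1.0 - (1.0 - p_per_step) ** n_steps

    for p in (0.005, 0.01, 0.05):  # 0.5%, 1% and 5% per step, invented here
        print(f"per-step rate {p:.1%}: "
              f"{p_at_least_one_error(p, 100):.0%} chance of an error in 100 steps")

At a mere half-percent per step, a 100-step job goes wrong about 39% of the time.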

Alibis galore

If they were planned events, we wouldn’t call them accidents. It’s human nature to assign blame when something bad happens, even if it’s only a close call. But the plant floor is somewhat askew, as evidenced by the fact that blame has a propensity for rolling downhill in any organization. When it finally stops at the lowest practical level, the excuses you hear from the victim could be born of low self-esteem or some other psychological factor. But studies show that two-thirds of the at-risk behaviors you see in the plant are the result of organizational issues, not examples of volitional behavior on the part of the participant. Dave Johnson, the editor of Industrial Safety & Hygiene News, explains the details and offers five tips for responding to what might appear to be lame excuses. Use your digital prowess to find your way to www.ishn.com and click on “E-Newsletter” at the left side of the screen. When that page loads, scroll down almost to the bottom for the entry that reads “Excuses, Excuses.” It might change the way you respond.

From the Ivory Tower

Although the errors that we make are sometimes inconvenient and embarrassing, most of them can be undone and the world is restored to its pre-error condition. The last place you want to see errors is in your local medical clinic, especially if you’re the unfortunate victim of someone’s erroneous action. The British medical establishment is concerned about such events, and clinics there seem to subscribe to one of two schools of thought regarding human error. The person approach involves flogging the people involved when they commit the moral transgression of making an error. The system approach, on the other hand, expects people to make errors, and responds by building a surrounding framework with a sufficiently robust set of error traps. You can find the details about the advantages and disadvantages of each approach in “Human error: models and management,” by James Reason at the University of Manchester. Send your mouse to http://bmj.bmjjournals.com/ and enter james+reason in the search box at the upper right corner of the page. When the search results appear, click on the “Full Text” option for the most effective presentation of the material. I’d be surprised if some of it couldn’t be applied to repairing machinery as well as repairing bodies.
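
Reason’s system approach is often drawn as slices of Swiss cheese: harm occurs only when the holes in every defensive layer happen to line up. Treating the layers as independent, which is a simplification, the arithmetic is a simple product; the failure rates below are invented for illustration.

    # Swiss-cheese arithmetic: an error does damage only if it slips
    # through every layer of defense. With independent layers, multiply
    # the per-layer failure rates. All numbers here are made up.
    def breach_probability(layer_failure_rates):
        prob = 1.0
        for rate in layer_failure_rates:
            prob *= rate
        return prob

    layers = [0.10, 0.20, 0.05]  # e.g. checklist, interlock, final inspection
    print(f"chance an error gets all the way through: "
          f"{breach_probability(layers):.2%}")  # prints 0.10%

Three leaky defenses, each failing 5% to 20% of the time, still stop all but one error in a thousand, which is the argument for trapping errors rather than flogging people.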

Focus on systemic sources of error

Pop across the pond to Human Reliability Associates Ltd., a British consultancy, and you’ll be able to access a collection of nine articles by Dr. David Embrey. These are housed at www.humanreliability.com/resource1.html, and they explain why involving the operations staff in risk assessment can lead to a best-practices culture. A weakness of the traditional model of investigation is its focus on the “what” and the “how,” and rarely on the “why” that must be identified if you want to get at the systemic factors behind the human error we see. Another article discusses the importance of data collection in incident investigation and continuous improvement initiatives. Read about performance influencing factors (PIFs), the factors that combine with our tendency to make mistakes to produce the common error-likely situation. Add in an unforgiving environment and you’ve got a disaster waiting to happen. Predictive error analysis is a tool that you might find useful for minimizing the disaster potential. This site has some good reading material. By the way, the acronym SPAD, which you’ll encounter there, means “signal passed at danger,” a type of railway error.
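
To give a flavor of how PIFs get used, here’s a toy weighted index loosely modeled on the Success Likelihood Index Method (SLIM) associated with Embrey’s work; the factors, weights and ratings are all invented for illustration.

    # A PIF-weighted index, loosely after SLIM: weights say how much each
    # factor matters for this task; ratings (0 = worst, 1 = best) say how
    # good conditions actually are. All numbers here are invented.
    pifs = {
        # factor:            (weight, rating)
        "time pressure":      (0.30, 0.4),
        "procedure quality":  (0.25, 0.7),
        "training":           (0.25, 0.8),
        "distractions":       (0.20, 0.5),
    }

    sli = sum(w * r for w, r in pifs.values())  # success likelihood index
    print(f"SLI = {sli:.2f} (closer to 1.0 means less error-likely)")

A low index flags the error-likely situation before anyone gets hurt, which is exactly where predictive error analysis wants your attention.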

Fixing maintenance

Did you know that at least 18 human factors can prevent you from achieving safe and reliable maintenance? So says Steve Mason at Health, Safety & Engineering Consultants Ltd., a British firm. His article, “Improving maintenance by reducing human error,” also classifies human error into one of three types. This document rambles a bit, but finally gets to the point: you ought to be selecting machinery that exhibits the lowest demand for maintenance if you want to build a maintenance-friendly environment that won’t suffer from a bad case of unplanned downtime. Mason offers several suggestions that might help the human factor side of your maintenance world. The suggestions and other methodologies he presents are solution-driven and customized for people who have better things to do than become human factors specialists. Point your browser at www.hf.faa.gov/docs/508/docs/mason15.pdf, and you’ll find Mason’s 12-page report.

Down Under

Among its other duties, the Australian State of Queensland’s Department of Industrial Relations provides information on managing industrial and business risks. This governmental body publishes a series of “Codes of Practice” covering a wide variety of human endeavors – abrasive blasting, cash in transit, equestrian schools and rural chemicals, to name a few. The relevant entry for our purposes is found at www.dir.qld.gov.au/workplace/law/codes/plant/appendix1/. That’s where you’ll find the 2005 edition of the Plant Code of Practice. The page explains the nine typical sources of human error that designers, manufacturers and employers should seek to minimize in the workplace.

Halls of justice

If someone gets hurt on the plant floor, your company will likely find itself in court. There, a plaintiff’s attorney will probably argue that your company is guilty by virtue of the fact that you didn’t protect the injured party from their own stupidity. And the other side will provide expert witnesses who will tell the world how badly you run your operations and how it’s a wonder that the other workers don’t suffer a mass die-off each and every workday. The expert witnesses will be discussing a discipline now known as human factors. According to Dr. Richard J. Hornick, human error is involved in every accident, and human factors expert witnesses show how that error is the root cause that justifies a large jury award. Hornick’s article, “Human Factors as a Field,” suggests that you’re not going to win that court battle. Use your ergonomically correct mouse to very carefully explore, using proper personal protective gear and adequate lighting, the article, which is attached very firmly at http://expertpages.com/news/humanfac.htm.

See the light

Marc Green, Ph.D., has a deep interest in the concept of human factors, which is the study of how mere mortals interact with the world at large. For example, one of Green’s Web pages claims that the cause of most aviation accidents is a mismatch between human abilities and predispositions and the design of the plane’s controls. No longer are such mishaps attributed solely to mechanical fault, weather or human error. Green’s firm, Toronto-based Visual Expert, offers in-depth knowledge about vision, perception, attention, human error and human factors. He cites studies that point to human error as the cause, but argues that the real problem is a poorly designed system, product or environment. I point you to “An Attorney’s Guide to Perception & Human Factors,” found at www.visualexpert.com/why.html, not so much for his ruminations on these studies as for the links at the bottom of the page. These offer examples that illustrate the connection between our perception and the errors we make. It’s worth a look.

Careful balance required

Jens Rasmussen at the Accident Research Centre of Monash University in Victoria, Australia, posted an interesting article. In it, he mentions that analysts who investigate mishaps tend to keep searching backwards to identify a root cause. They stop searching only when they run into a contributing event they know how to circumvent. This approach leaves too much unexplained, Rasmussen argues. The competitive business environment we inhabit drives us to take unnecessary risks, and sometimes equipment gets destroyed or people get hurt. If you like living on the edge, you ought to learn exactly where that edge is located and find ways to cope with its proximity. You can read “The concept of human error: Is it useful for the design of safe systems?” at www.monash.edu.au/muarc/ipso/vol3/ps1.pdf.
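
Rasmussen’s stopping-rule critique is easy to demonstrate. In the toy backward search below, the investigation halts at the first event anyone knows how to fix, and everything upstream goes unexamined; the causal chain itself is invented.

    # A toy version of the backward search Rasmussen describes: walk the
    # causal chain upstream and stop at the first familiar, fixable event.
    # The deeper systemic causes are never reached. Events are invented.
    causal_chain = [
        "bearing seized",             # the visible failure
        "lubrication round skipped",  # familiar and fixable: search stops here
        "PM crew understaffed",
        "maintenance budget cut",     # systemic causes, left unexplained
    ]
    fixable = {"lubrication round skipped"}

    for event in causal_chain:
        if event in fixable:
            print(f"root cause declared: {event}")
            break
    else:
        print("no familiar cause found; keep digging")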

Caution for root-cause searchers

The hard part of the modern maintenance game is digging down far enough, and analyzing with sufficient rigor, to identify an unambiguous root cause of a problem or failure. The easy part is making the fix. But how should you respond when one of your brilliant in-house thinkers announces with overbearing certainty that the root cause is human error? Fixing people is rather difficult, assuming, of course, that you want to do it in a relatively bloodless manner. No, my friends, you’d be well served to strike human error from your palette of possible root causes. At least, that’s the advice from Erik Hollnagel at Sweden’s Linköping University in his paper, “The Elusiveness of ‘Human Error,’” which your error-prone mouse will find at www.ida.liu.se/~eriho/HumanError_M.htm. Hollnagel makes his argument from four perspectives – semantics, philosophy, logic and practice. Human performance always varies; you’d be better off modifying your systems to make them error-proof and finding ways to detect and mitigate the effects of bad performance.

Without comment

http://en.wikipedia.org/wiki/Shoot_the_messenger
http://madtbone.tripod.com/
www.wetrainindustry.com/boilersafety.htm
http://smi-web.stanford.edu/people/felciano/research/humanerror/
www.plant-maintenance.com/maintenance_articles_maintainability.shtml
