
Before the Red Light: What does that amber light really stand for?

David G Broadbent examines some of the “stuff” behind hazard and near miss reporting.

When I was approached to write a piece on the nature of safety reporting, it came at a time when I was reflecting on what seems to be a difficult period in the mining industry. Within Australia particularly there seems to have been a procession of mining fatalities over the last 12 months or so. Even in the week prior to my putting fingers to the keyboard there have been two fatalities, numerous serious accidents, and a roof collapse in New Zealand.

I had also been spending a bit more time contemplating the true nature of ourselves, and some of the neuro-biology (brain function) underneath some pretty important aspects of decision-making. As an example, I would like you to place yourself in the following situation:

  • You are travelling home in your car after a day’s work.
  • You are less than five minutes from your home.
  • As you approach a very open, well-lit intersection, the traffic light turns from green to amber.
  • What do you do?

Hang on. Before you answer that question I want you to honestly ask yourself: what was the very first “thought” that went through your mind?

For almost ninety per cent of people who contemplate this question, the answer is something like: Can I make it?

It is only later that something else “kicks in”, and starts to try to regulate that initial question. By the way, if the answer to that question is “yes”, what is the consequent behaviour?

In almost every case it is “speed up”. Having said that, the amber light has the same required behavioural response in every jurisdiction around the world. Slow down.

So we now know that for nearly all of us, when confronted with an amber light, our very first thought is to do something potentially very dangerous, and against the “training” that has been provided. This is not as unique an example as you might think. I could quite easily present you with a number of similar scenarios, many of which underlie serious safety system failures.

There are also some pretty forceful neuro-biological pre-determinants that contribute to this hypocrisy, and almost make it inevitable. Suffice it to say, maybe we are inherently hard-wired to seek the “easy way”. Some might even say (including myself) that we might be hard-wired toward that thing we call in safety a “shortcut”. If I really wanted to get into the neuro-biology I might say that there is a constant battle going on here between two very distinct components of brain function. These bits then directly and powerfully influence our decision-making and ultimately our behaviour. They are called the amygdala and the frontal lobe. In terms of how this works, it has a lot to do with the speed of information processing within the brain.

Consider the amygdala as functioning at the speed of a bright red Ferrari. The frontal lobe has the pace of an ’83 white Volvo. You can see which one is always going to win that race. Now here’s the kicker. The amygdala is the part of our brain that handles emotional responses, while the frontal lobe is more caught up in the way we “rationally” process the information we are presented with, e.g. risk assessment.

So what does this mean? It means whenever we are presented with information, especially novel information, our initial response is almost always triggered by our emotion (amygdala). If we are under pressure of any sort or have an emotional desire to achieve, then it is quite likely our behaviour ends up being primarily directed by those “emotions”. That in itself is dangerous and can well lead to quite toxic outcomes.

Speaking of toxicity, we now need to come back to the earlier observation referencing the number of fatalities within Australian mining over the last year or so. Now here’s where we have to be very careful. This does not mean there is a crisis within the Australian minerals extraction industry – there could be, we just don’t know. When we are dealing with such small numbers, a 50 per cent increase, from, for example, four to six, does not necessarily a crisis make.
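A rough, back-of-the-envelope illustration of that small-numbers point (mine alone, and resting entirely on an assumed model): if we suppose, purely for argument’s sake, that annual fatality counts behave like a Poisson distribution with a long-run average of four per year, then a year with six is not a rare event at all.

```python
from math import exp, factorial

# Illustration only: assume annual fatality counts follow a Poisson
# distribution with a long-run average of 4 per year (the "four" above).
mean_rate = 4      # assumed long-run annual average
observed = 6       # the year showing a "50 per cent increase"

# P(X >= 6) = 1 - P(X <= 5) under Poisson(4)
p_at_most_5 = sum(mean_rate**k * exp(-mean_rate) / factorial(k) for k in range(observed))
p_at_least_6 = 1 - p_at_most_5

print(f"Chance of 6 or more in a year, given an average of 4: {p_at_least_6:.2f}")
# Prints roughly 0.21 - about one year in five, by chance alone.
```

Roughly one year in five would look that bad through nothing more than random variation, which is precisely why a single jump from four to six cannot, on its own, tell us whether anything has actually changed.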

What I am compelled to highlight, though, is that any such increase is the most significant event in the lives of the families and work colleagues of those who have been killed or seriously injured. It does not get any worse than having to walk up a driveway, knock on somebody’s front door, and tell them that their husband/wife, son/daughter, mother/father shall never be taking that same walk again. They have been killed while in your employ. I know this to be true from some surveys conducted by TransformationalSafety.Com. When senior managers are given a range of scenarios, and asked to rank them in order of which they pray they shall never have to confront, it is that long walk up the driveway that wins the prize. Not by a small margin either. Ninety-seven per cent of respondents (n=1687) place this at the top of their list of workplace nightmares.

Regretfully this universal (and cross-cultural) fear does not appear to have been translated into senior leadership behaviour within many environments.

Consider for a moment the Upper Big Branch explosion in the United States, which killed 29 miners. There had been a number of issues reported to supervision which, it turns out, were ignored. Local site leaders were required to report tonnages to the mine owners in Charleston every 30 minutes. Yes, you read that correctly. What message does that send to the operational workforce? Do you think workers would have any confidence that any safety hazards they reported would be acted upon? I think not.

Let me not single out the good old US of A. What about the Pike River explosion in New Zealand, which also killed over two dozen miners? Again we find that the conclusions of the Commission of Enquiry demonstrated that there were numerous warning signs of things not being right, and these were consistently ignored (despite being reported) – not only by local management, but by the regulatory authorities as well.

Now let me come back to a disaster that is well known throughout the world: the Deepwater Horizon (Gulf of Mexico) incident. Just prior to the explosion there was a discussion – described by witnesses as heated – between some drilling contractors (Transocean) and a senior client representative (BP). At that time the contractors were reporting safety concerns with the process. These guys had many years of offshore drilling experience between them. The BP guy was heard to say “Well this is how it’s going to be…” and the Transocean guys were seen to reluctantly agree. Those guys are now dead.

Douglas Brown (Chief Mechanic on the Deepwater Horizon) said the senior Transocean official on the rig was heard to mumble, “Well, I guess that’s what we have those pinchers for” — which he took to be a reference to devices on the blowout preventer, the five-story piece of equipment that can slam a well shut in an emergency. Guess what, the blowout preventer failed.

It is always a dangerous gamble to place reliance upon the final barrier within a perceived serial sequence of accident/disaster prevention. By the way, many disasters are the product of an apparently random correlation of contributory factors, so to continue to see accident causation as a sequential process is flawed in itself.

This draws us back to the very real need to get “inside” what is happening within a process, and to try to “get ahead” of all this apparent entropy. While, at the end of the day, the accident/disaster may appear random, it actually is not.

How often have you investigated an accident that has occurred and a significant number of your people say words to the effect of “I knew that was going to happen, just wasn’t sure when”?

The fact that you have even heard that shows that even these major disasters, and not just the smaller incidents, are often predictable. And if they are predictable, then they are almost always preventable.

So the people within your business who know the most about what is going on, both on the surface and beneath it, are the operational workers. Yes, the technical guys may know how something has been designed, its functional parameters and so on, but it is the operational workers who are exposed every single day, and who can become sensitive to even the slightest deviations in the process.

I am drawn back to my past life as a metallurgist with BHP during the early 1980s. We had a special machine that you could place against a piece of steel to determine its composition. This was used if there was any concern that the process may have been contaminated and the wrong grade of steel could be sent to the customer. It was a time-consuming and cumbersome process. Well, we had guys on that plant, who had been around a while, who could perform the same function by sight. What did they do? They hit the steel with an angle grinder and could determine the grade by the colour of the spark given off. The operational worker was far more efficient, and less prone to breakdown, than the latest and greatest technology of the day.

Where this leads us is to the necessity to significantly increase the reporting of potential hazards, near misses, process deviations and more. Failure to do so shall result in the continuation of unnecessary death and disaster. No less significant are the life-changing injuries and occupational disease that also continue to occur.

What we do know is that operational workers are the key to gaining this level of information. What we also know is that the majority of operational workers keep this stuff to themselves. Why?

Now take yourself back to those rather confronting examples. We have to assume that the supervisors or managers involved in the decisions to functionally ignore these “warnings” did not deliberately set out to kill their employees. Yet the consequences were exactly that. The other significant consequence is that the workforces in these businesses become conditioned to not report. After all, why bother? The information they provide is “ignored”. In the language of psychology we might say that the desire to “report” has been extinguished. In my own language I would suggest the desire to report has come up against the “Iceberg of Indifference”. We all know the damage that icebergs can do. The Iceberg of Indifference is no exception.

It becomes glaringly obvious, does it not, that one of the first steps an organisation must take, if it genuinely wishes to see an increase in hazard and near-miss reporting, is to treat any such reports with due respect, and to respond to them in an appropriate and timely manner. It must also ensure that the reporter is kept fully informed of, and included in, any actions emanating from that report. This is such a simple and obvious process, and yet the bulk of organisations cannot even get it right. I have lost count of the number of organisations I have visited who were concerned that their Take-5, Step-Back and similar programs were not working. In every case where this has been a concern, the underlying reason was as just described.
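For those who like to see the mechanics of that “closed loop”, here is a minimal sketch, entirely hypothetical (the names, statuses and details are invented for illustration, not drawn from any particular reporting system), of a hazard-report record in which the status simply cannot move forward unless feedback goes back to the person who raised it.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class HazardReport:
    """Hypothetical closed-loop hazard report: no status change without feedback."""
    reporter: str
    description: str
    raised_at: datetime = field(default_factory=datetime.now)
    status: str = "RECEIVED"              # RECEIVED -> ASSESSED -> ACTIONED -> CLOSED
    feedback_log: list = field(default_factory=list)

    def update(self, new_status: str, note_to_reporter: str) -> None:
        # The report can only advance together with a note back to the reporter.
        if not note_to_reporter.strip():
            raise ValueError("No status change without telling the reporter why.")
        self.status = new_status
        self.feedback_log.append(f"{datetime.now():%Y-%m-%d} [{new_status}] {note_to_reporter}")

# Example (all details invented): the reporter sees every step, so the loop
# never runs in one direction only.
report = HazardReport("J. Citizen", "Loose mesh above the conveyor transfer point")
report.update("ASSESSED", "Inspected by shift supervisor; area barricaded pending repair.")
report.update("ACTIONED", "Mesh re-secured on night shift; please check it on your next pass.")
```

The detail does not matter; the point is that the feedback step is built into the structure rather than left to goodwill.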

The organisations had these programs essentially running in one direction. They were trying to get information, and giving very little, if anything, back. It does not take long for people to realise these potentially very valuable interventions are being used as an administrative and political tool within the business, without any visible recognition at all. Thus the failure of reporting does not necessarily lie at the feet of operational workers. It lies very clearly at the feet, and up to the armpits, of the organisational culture itself. Where does that “culture” come from? In large multinational organisations it often comes from “above” and is then filtered down from those lofty locations, via multiple levels of management. Even in these types of organisations where reporting is a problem, there are always locations where far better reporting is happening. When we look at why this is so (as I have been asked to do), the answer is often again glaringly obvious.

The local management within these locations have acted as buffers against some of the less helpful messaging coming from above and created what I refer to as an “Island of Influence”. They have implemented the very basic model of two-way communication described previously. In addition, they have very overtly made it clear that they embrace, encourage, and value all reporting of potential hazard and near-miss information. Not only that, they celebrate the organisational value of the reporters and they immediately act on the “report”.

In the next edition I shall highlight what we actually need to be reporting, if we want to avoid a continuation of the toxicity we are seeing in our workplaces. You may well be surprised. It is not what you see in an annual report.

David G Broadbent, B.A. (Hons), M.A.P.S., M.S.P.E., F.I.S.Q.E.M.
Safety Psychologist

David G Broadbent is the director of global safety consulting firm TransformationalSafety.com. He is recognised as a world leader in the areas of safety culture and safety leadership and the impacts that these constructs have upon accident causation. David was a metallurgist with BHP prior to changing career paths to the field of applied psychology.

Find out more about David and his work at www.transformationalsafety.com.


AMSJ April 2022