SAFETY The Accidental Organisation – Lessons from the transport sector

Sometimes the most significant human error in the transport system is the continuing failure to apply, effectively and efficiently, the bitter lessons of past mishaps in order to prevent future accidents.
According to James Reason, an emeritus professor of psychology at the University of Manchester in the United Kingdom, organisations can benefit from examining organisational issues which contribute to accidents. Reason advises organisations to look for what he calls latent pathogens, or “disease-causing agents”, which already lie dormant in any given organisation.
All organisations share many common processes. These may include, for example: goal setting, policymaking, planning, managing safety, budgeting, communicating, information handling, checking and controlling.
Organisational factors are the macro forces that affect safety in any organisation, especially in higher-risk sectors such as the aviation or maritime industries.
In aviation, for example, every cockpit, maintenance hangar, dispatch office and control tower is a microcosm where action is shaped by the immediate decision-making of those present – the dispatchers, shift supervisors and controllers. The microcosm, though, has already been shaped by decisions made elsewhere in the organisation. These decision-makers – remote in time, space or organisational linkage – set the stage.
In a similar vein, Ron Westrum, professor of science, technology and society at Eastern Michigan University, has devised a scale for measuring organisational climate, and distinguishes three levels of organisational safety culture that vary in how information flow is handled. These are termed the pathological, bureaucratic and generative schemes.
In a pathological organisation:
• information is hidden
• messengers are ‘shot’
• responsibilities are shirked
• bridging is discouraged
• failure is covered up
• new ideas are crushed.
A bureaucratic organisation is one where:
• information may be ignored
• messengers are tolerated
• responsibility is compartmented
• bridging is allowed but neglected
• the organisation is just and merciful
• new ideas create problems.
In a generative organisation:
• information is actively sought
• messengers are trained
• responsibilities are shared
• bridging is rewarded
• failure causes inquiry
• new ideas are welcomed.
Reason suggests that latent pathogens – unseen, unsafe conditions – tend to build up before an accident or unsafe event occurs. These dangerous pathogens lie dormant in any organisation, and information flow is the means by which these unsafe conditions are spotted and acted upon. When information flow is brisk (as in a generative organisation) the latent pathogens can quickly be spotted and corrected. In a pathological organisation, however, the person who spots the pathogen is suppressed and the problem is not resolved. We can go further and characterise organisations according to how they respond to abnormal conditions.
There is a spectrum of responses to anomalies. At one end of the spectrum we see denial responses: those who notice irregular or abnormal conditions are suppressed or isolated, unable to do anything.
In the middle is the more typical response – the immediate problem or condition is either explained away or perhaps remedied, but no deeper inquiry is undertaken. This ‘quick fix’ can take place when problems are first spotted, but where no disaster has yet occurred to raise public awareness and compel action. Global fixes are often used to correct problems common to all units in a specific area, such as on a particular aircraft type.
At the other end of the spectrum, ‘inquiry’ takes place. This occurs in those rare organisational climates where a leader decides to correct not only the immediate problem but to attack the underlying conditions as well.
In the maritime world, Justice Sheen’s judgment on the causes of the capsize of the Herald of Free Enterprise off the Belgian coast in March 1987, in which 193 people lost their lives, concluded that “… a full investigation into the disaster leads inexorably to the conclusion that the underlying or cardinal faults lay higher up in the company … From top to bottom the body corporate was infected with the disease of sloppiness.”
Just as the corporate climate is shaped by day-to-day actions, it is also shaped by major organisational changes. As an organisation grows and changes, new people and equipment are added. Here problems can occur even when management tries to foster a positive climate. Growth and change can leave managers overloaded with responsibilities, which can have negative results. Information flow, vital to the correct functioning of any organisation, can again be suppressed.
Organisational processes – decisions taken in the higher echelons of the system – can seed organisational pathogens into the system at large. These pathogens can take many forms such as:
• managerial oversights
• ill-defined policies
• lack of foresight or awareness of risks
• inadequate budgets
• lack of legal control over contractors
• poor design, specification and construction
• deficient maintenance management
• excessive cost cutting
• poor training and selection of personnel
• blurred responsibilities
• unsuitable tools and equipment
• commercial pressures, or
• missing or flawed defences.
The adverse consequences of these pathogens are transported to the various workplaces where they can act upon the defences and local working conditions to bring about an adverse event.
Safety costs money, but having accidents can cost a lot more. When economic pressures increase, managers try to decide where they can cut costs, and safety is unfortunately often one of the target areas for cost-cutting. In these situations a generative organisation that encourages a free flow of information is vital to ensuring that the cost-cutting has not gone too far.
Risk analysis and review is about understanding where failures are likely to occur, why they occur, how to minimise the potential for them to occur, and how to limit the effect of any failure.
The investigation into the tragic loss of the Space Shuttle Columbia showed how an organisation came to normalise what were actually abnormal events. Over time, the perceived risks associated with the ‘loss of foam’ diminished and, as more events occurred, the good fortune of surviving them became a justification for a lack of concern.
Associated with this shift towards minimising the perceived significance of incidents is a defensive avoidance of blame within an organisation. As incidents stop being seen as having learning value, they can start to be seen as criticism of the organisation’s failings. The natural reaction of the organisation is then to avoid blame by passing responsibility for the failings ‘down the chain’. In turn this can lead to a culture of discipline and a failure to recognise the human factors involved in any error or omission: every system and every individual is prone to failure.
While we must recognise the importance of individual accountability, we must also recognise that personnel do not act in a manner independent of company norms. A failure pinned on an individual can be viewed in isolation, rather than being seen as an opportunity to learn. In these situations information flow can be stifled (as in a pathological organisation) because individuals may not report incidents out of fear or embarrassment, and the free lessons for effective risk analysis and review can be lost. The result is a steepening downward spiral until a more significant and costly event occurs.
By accepting that incidents are the precursors to more serious events, they can be analysed to reduce risks so that the potential for failure to lead to disaster is minimised. However, as the significant event or accident rate declines, as it has done in the areas of public transport, an impression of corporate immunity can be created.
