By: Mark H. Goodrich – Copyright © 2011
It has long been understood that accidents usually result from a chain of individual and system errors, many relatively innocuous when viewed alone, but deadly when allowed to combine and progress. Indeed, it is often said that ‘breaking the error chain’ is a key to accident prevention. Yet, in the mind of the public, many regulators and far too many airline managers, accident rates remain the signposts by which safety in aviation is measured, even though they reflect a reactive, rather than a proactive, process.
We hear and read it all the time – “The low accident rate made this the safest year ever for aviation”. Too often, these words are spoken by industry leaders or regulatory officials who either do or should know better, but are unable to resist claiming that a low accident rate measured over a short period of time is irrefutable evidence of how their policies and leadership have made aviation safer. But industry professionals cannot afford to be so self-serving. From company responsibilities to customer and employee safety, to higher insurance premiums, self-insured retentions and deductibles, regulatory fines, reduced productivity, and negative publicity, accidents in common carriage reflect an ultimate breach of the public trust, and directly affect the bottom line for operators.
Often unacknowledged is that breaking a chain of errors amounts to ‘incident prevention’, and is the key to ‘accident prevention’. Thus, a focus on incidents – rather than accidents – is the best way to foster a culture of safety. One problem with focusing on incidents is that they are difficult to track. From the ramp to the shop floor, and from the flight deck to the offices of management, a variety of stone walls – some purposefully erected and some merely the product of human nature – work to downplay, rationalize or hide incidents.
Imagine that maintenance corrects a reported oil leak, and discovers the cause to be the prior installation of an improper seal. The part numbers for the correct seal and for the one improperly installed are nearly identical. A simple transposition of digits in the part number results in a certain loss of oil in operation. Does maintenance report the incident, or merely write it up as repair of an oil leak? Failing to report the incident makes it a near-certainty that others will repeat the mistake, but the desire to avoid admitting an error will likely prevent the incident from becoming a valuable lesson for employees of the company, and for other operators, as well.
As a second example, imagine that a flight crew, running a non-normal checklist, finds they have inadvertently exacerbated the situation by selecting the wrong checklist with a title close to, but different from, the intended procedure. Do they report their error, or remain silent and set the stage to trap other crews in the future? It is all too likely that a desire to avoid admitting the error to managers and regulatory personnel will result in the confusing title remaining uncorrected until it later becomes the focus of an accident investigation.
The current emphasis on Safety Management Systems (SMS), including new regulations in many jurisdictions that require formal submission and approval of an SMS Program, often brings into specific relief what has always been the primary difficulty in addressing incident prevention – that is, conflict with the safety culture of the airline and its people. In working with carriers to meet the new regulatory requirements, we regularly see that companies with an already well-developed internal focus on incident prevention find the pro forma sort of ‘one-size-fits-all’ approach in SMS regulations interferes with existing company systems, and decreases the effectiveness of incident reporting.
Regulators want to see a manual in the prescribed form and with the prescribed content, too often placing conformity between all carriers above effectiveness for any. In reciprocal form, those companies for which safety management has always been form over substance anyway immediately figure out how to functionally work around the mandates of a required SMS Program, while creating the appearance of using it as intended. If management does not believe in the long-term value of a safety culture, a mandated SMS Program does little beyond legitimizing a less-than-safe operation with the imprimatur of SMS.
Although most regulatory agencies and airlines have self-disclosure policies under which employees receive some assurance that voluntarily disclosing an incident will not result in formal punishment, the truth is that most programs work far less well than intended. Regulators too often use exceptions to justify punishment anyway, thereby crushing the original intent of the policy. Managers too often attach the stigma of an incident report to an employee, even though no formal punishment has been assessed. Regardless of how it is titled, regulatory or managerial pressure that forces people to choose between job security and professional responsibility is a cancer that spreads throughout an organization, and impedes the discovery and correction of faults likely to become part of an incident chain.
The most insidious suppression of incidents often occurs in the carpeted hallways of senior management. Although the laws of evidence in most countries provide that remedial actions voluntarily undertaken to correct defects cannot be used as evidence in litigation, many managers believe that admitting an error creates a basis for liability, when the reverse may be true. In addition, deregulation worldwide has elevated low ticket prices as a top priority, and budget cuts by management often fail to discriminate between areas critical to safety, and others that are not. Reduced oversight, inexperienced senior managers, and a blurring of the lines that once delineated ethical conduct in business create an environment where the new breed of manager often sees no downside to the indiscriminate slashing of budgets. Quality is shunned. Safety is assumed. Reporting incidents to a regulator – even under qualified immunity – is seen as unnecessarily ‘teasing the bear’. Too often, a decision about incident disclosure rests on whether it is likely to be discovered by regulators anyway, and not whether safety will be enhanced by its revelation and incorporation into training, or potential liability curtailed by correcting its cause.
For managers in regulatory agencies, the audit of records is rationalized to be as effective as observing training or operations – that is, actual oversight. Was safety enhanced when industry realized that the paper trail was more important to avoiding regulatory fines than the actual conduct of flight, training or maintenance operations? Was the press release announcing installation of a ‘hot line’ to report safety problems more important to agency managers than actually staffing the line with qualified people? Those who try to lodge reports on such lines often confront the reality that – for all the bureaucratic press releases to the contrary – incidents are simply below the radar for regulatory officials.
Few managers want to acknowledge the effects that budget cuts have on safety, and those who seek to establish a higher level of ethical behavior often find their careers adversely affected. For all of the current rhetoric about resource management and ethics, the reality remains that ‘team players’ are still rewarded while ‘whistle blowers’ are penalized.
Legislators sailing on the political wind created by deregulation never considered the adverse effects on safety that over-competitive greed creates. Many assumed, uncritically, that quality – often and easily defined as meals, blankets and comfortable seats – is expendable. But quality in aviation goes much deeper than superficial passenger comforts. The ways in which crews are trained, maintenance is conducted, and incidents are tracked, are fundamental parts of the quality and safety equation. Industry vernacular is rife with ‘new management paradigms’, ‘new economic models’, and ‘new realities’. We are told that what was once ‘good operating practice’ is now ‘old thinking’. But, those with experience know fundamentals do matter, and rationalized reductions in quality will show up as, at the least, increases in incidents. Unlike products at a clothing or home improvement store, ‘quality’ in aviation equals ‘safety’. Over the short term, cutting quality jury-rigs financial reports to increase executive bonuses that are stupidly tied to share price. Over the long term, cutting quality saves no money, but progressively decreases safety in operations.
The history of aviation is rife with this recurring scenario. Good operating practices based on years of experience are sacrificed to reduce costs. Incidents increase, but are either unknown to management or ignored by it. An accident results. Corporate managers and regulators say, “How could we have known?” And, lessons already purchased in money and blood – but trampled in the rush to reduce costs and lower ticket prices – are paid for again.
How often have we read reports of accidents like the recent B737-800 fire in Japan, or the A320 runway overrun in Brazil, only to learn in the following months that a number of incidents bearing directly on the same causes had gone either unreported or ignored – sometimes for years? It is too often only an accident that suddenly brings the preceding history of incidents into the harsh light of reality, and even then only temporarily.
Using accident rates as evidence of safety misses the point that multiple incidents usually precede an accident, and are inherent in its cause. Accident statistics reflect accidents – not safety. Indeed, when viewed in the abstract, or over periods of time measured in less than decades, accident rates are usually nothing more or less than a statistical aberration, whether in absence or occurrence. That is, the absence of accidents does not mean that an airline is being operated safely, and the occurrence of an accident does not necessarily mean that an airline is being operated unsafely. Incidents are the key to analyzing safety.
If two accidents occur in December, and two more 13 months later, does the absence of any during the 12-month calendar year between them mean it was a safe year? In one sense, there have been four accidents in 14 months, and in the other, a ‘zero accident rate’ for the calendar year. If incidents at an airline are up several hundred percent, does the fact that no accidents have yet resulted mean it is operating safely?
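The windowing effect can be sketched in a few lines of code. This is purely illustrative; the dates and the helper function below are hypothetical assumptions, not data from any actual operator:

```python
from datetime import date

# Hypothetical accident dates: two in December, two more 13 months later.
accidents = [date(2009, 12, 5), date(2009, 12, 20),
             date(2011, 1, 8), date(2011, 1, 25)]

def accidents_in_window(events, start, end):
    """Count accidents falling within [start, end] inclusive."""
    return sum(1 for d in events if start <= d <= end)

# The calendar year between the two clusters shows a 'zero accident rate'.
calendar_year = accidents_in_window(
    accidents, date(2010, 1, 1), date(2010, 12, 31))

# A 14-month window spanning both clusters shows four accidents.
fourteen_months = accidents_in_window(
    accidents, date(2009, 12, 1), date(2011, 1, 31))

print(calendar_year)    # 0
print(fourteen_months)  # 4
```

The same operation, over the same history, yields either ‘zero accidents’ or ‘four accidents’ depending solely on where the measurement window is drawn – which is the point: short-window accident rates measure the window, not the safety of the operation.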
It has been said that safety is not the destination, but the journey itself. Incidents are what we see every day on the journey, and creating a culture of safety requires a focus on those incidents, and their prevention. That only works if the reporting of incidents is encouraged – even rewarded – at every level. Managers must lead by example. One cannot hide evidence of incidents in the carpeted hallway, and expect incidents on the ramp, shop floor or flight deck to be reported faithfully. Incidents not reported are not available as an educational tool to create increased awareness, and the breaking of a future incident chain.
Finding ways to encourage the reporting of incidents and their incorporation into training for a culture of safety should be the goal, and the first step is to stop pretending that accident rates are the key. Accident rates are the wrong target for safety – instead, targeting incidents is the way to break the chain of errors before an accident occurs.
Accident Rates: The Wrong Target for Safety was first published in the May 2011 Issue (Vol 8 No 2) of Position Report magazine.