Aviation safety and the human element

By Capt. G A Fernando
gafplane@sltnet.lk
RCyAF/SLAF, Air Ceylon, Air Lanka, SIA and SriLankan Airlines.
Former Crew Resource Management (CRM) Facilitator, Singapore Airlines Ltd.
Member, Independent Air Accident Investigation Pool 

If aviation is so safe, why do airplanes still crash?

Aviation is considered by many to be a well-defended hazardous technology. Yet airplanes are still involved in accidents and incidents. Experts attribute air crashes to one or more of the following factors: mechanical failure, inclement weather, sabotage and pilot error, among other rare, intangible and unique circumstances. That is why, these days, a designated chief investigator of air accidents will offer only a ‘most probable cause’ in his or her final accident report. These final findings are monitored, published and circulated by the International Civil Aviation Organisation (ICAO).

By far the most dominant factor is ‘human error’. It is estimated that 70 to 100% of air accidents occur as a result of choices made by human decision-makers inside and outside the flight deck. The so-called ‘pilot error’ of the past has now been taken out of the air accident investigator’s vocabulary and replaced with ‘human error’.

Consider this. When a judge delivers an arguably wrong verdict, it is labelled a ‘miscarriage of justice’. When a patient dies on the operating table, it is called ‘surgical misadventure’. When there is an air accident, however, it is put down to ‘pilot error’.

‘Pilot error’ was a convenient way to find a culprit in the early days of air accident investigation and reporting, as the dead can never defend themselves. But punishment was merely self-satisfying, and did nothing to prevent the recurrence of a similar accident. Instead, air accident investigators now attempt to determine the ‘cause behind the cause’ in a non-punitive atmosphere, striving toward a ‘just culture’ in contrast to a ‘blame culture’. Accountability is still maintained. The ultimate aim is to prevent future accidents by learning from one’s mistakes and making the system error-proof. ‘Culture’ quite simply means the way people do things in an organisation.

The human condition is the final frontier

It is now acknowledged that intelligence and the ability to err are two sides of the same coin. Human error can and does exist in the general scheme of things. This is not surprising, because it is the human being who designs, builds, operates, maintains, manages and regulates these systems. Experts have assigned various labels to human error, such as ‘slips’, ‘lapses’, ‘mistakes’ of omission or commission, ‘latent failures’, ‘active failures’ and, lastly, ‘violations’.

To give readers a quick and simple explanation, I shall use an example from the motoring context. If one is supposed to maintain 100 kph on an expressway and the speed slowly creeps up to 120 kph, that is a ‘slip’. If, on the other hand, one forgets to wear the seatbelt, that is a ‘lapse’. If one begins to overtake on a two-way road after checking for oncoming traffic, but is surprised by an oncoming vehicle at the last moment and is forced to abandon the manoeuvre and pull back behind the vehicle ahead, it becomes a ‘mistake’, even though the action was based on a set of rules. A driver who deliberately crosses a double line to overtake, however, has committed a ‘violation’ and is liable for punitive action.

Although mechanical (hardware) reliability has improved by leaps and bounds over aviation’s 120-year history, the human condition has remained the same: fallible regardless of culture, race or background, and subject to individuality, reasoning, fear, anger, love, joy, fulfilment, pride, grief, optical illusions, confusion, suffering, health, pain, mortality and so on.

More than 2,000 years ago, the Roman philosopher Cicero said: “To err is human”; long afterwards, Alexander Pope (1688-1744) declared: “To forgive, divine.” Interestingly, a signboard in a Royal Air Force (RAF) hangar somewhere in the UK states: “To err is human. To forgive is not RAF policy.”

Then there is the matter of ‘situational awareness’, which is essentially knowing what is going on around you and being able to project what is likely to happen next: being ahead of the aircraft. The priorities for the pilot are to aviate (fly), navigate and communicate, in that order. Under distraction and a high workload, a crew might lose situational awareness, resulting in confusion. The experts then recommend climbing away from the ground, if at low altitude, and re-establishing situational awareness.

As an aid to understanding the human factors in any task undertaken, Elwyn Edwards promulgated the SHEL concept in 1972. It was subsequently modified into the SHELL model by Frank Hawkins, who presented it at a landmark International Air Transport Association (IATA) technical conference in Istanbul in 1975. In the acronym, ‘S’ stands for Software, ‘H’ for Hardware and ‘E’ for the working Environment, all interacting with the central ‘L’ for Liveware: the human element, i.e. the human operator. The operator, in turn, does not work alone but with the second ‘L’, the other Liveware elements inside and outside the flight deck who form the team (other crew members, air traffic control, flight despatchers, engineers/mechanics, ground staff, etc.).

Software: Refers to all the rules and regulations, company manuals and instructions pertaining to aviation, navigation, flying training and maintenance. It also covers the data and paperwork governing the operation of aircraft, such as weather forecasts, air traffic control (ATC) clearances, flight dispatch documents, notices to airmen (NOTAMs), aircraft performance data, and crew licensing and limitations.

In 1974, a Boeing 727 approaching Dulles Airport, in Washington, crashed into a mountain, with a loss of 92 lives. The accident was caused by a lack of clarity and inadequacies in air traffic control procedures and regulations. The absence of timely action by the regulatory body to resolve a known problem in air traffic terminology was also listed as a factor (Accident Report NTSB/AAR 75-16).

In 1979, a McDonnell Douglas DC-10 of Air New Zealand crashed into Mount Erebus in Antarctica. Information transfer and company data entry errors played a definite role in the accident (Accident Report No. 79/139, New Zealand).

Hardware: Refers to the aircraft’s mechanical components, computers and automatic systems. Airbus company instructors sometimes referred to the autopilots as the ‘brains’ of the aircraft and the Flight Management System (FMS) as its ‘heart’, through which the pilots interact with the aircraft.

Hardware could be at the root of a mechanical problem or a design fault. In 1974, a Turkish Airlines DC-10 crashed after take-off because a cargo door failed (it opened and blew out due to cabin pressurisation). The force applied by a cargo handler to close the cargo door, the door design and an incomplete application of a service bulletin were cited as factors (ICAO Circular 132-AN/93).

Environment: Refers to the temperature, noise, stress and vibration within the flight deck. The crew’s working environment also comprises elements such as weather; loss of visibility due to fog, mist or haze; rain, ice, snow and hail; and wind velocity, wind direction and turbulence.

In 1977, two Boeing 747s, belonging to Pan Am and KLM, collided in bad visibility while on the runway at Tenerife, Canary Islands, with a loss of 583 lives. Breakdowns in normal communication procedures and misinterpretation of verbal messages were considered factors (ICAO Circular 153-AN/98).

Liveware: Refers to the human element, which is the weakest link in the system; it is a truism that a chain is only as strong as its weakest link. Much needs to be done to keep the human element safe, not by punishment but by participation. Good communication is, by far, a key element.

In 1974, a Pan Am Boeing 707 crashed during approach at Pago Pago in American Samoa, with the loss of 96 lives. A visual illusion related to the ‘black-hole’ phenomenon was a causative factor (NTSB/AAR 74-15). With no lights in the foreground, pilots carrying out a visual approach at night may descend not in a straight line but in a curve, compromising terrain clearance.

(To be continued)
