The Human Factors Behind Aviation Accidents: A Systems Approach
Authored by: Dean Rotchin
Aviation today is remarkably safe, yet 70-80 percent of accidents are attributed to human error. Reducing these accidents to "pilot error" is a shortsighted view of the picture. It ignores how the modern aviation system functions, and how it ultimately fails.
Beyond the Individual: Systems Thinking
This principle became clear to me through a close call during my early years as an instructor. My captain was flying his fourth consecutive early-morning departure. With little rest beforehand, he nearly landed on a taxiway instead of a runway. Management's immediate reaction was to fault the pilot's performance. A systems analysis, however, showed that the scheduling left no room for adequate rest periods, that fatigue threats were weakly monitored, and that the culture discouraged crew members from reporting fatigue.
This was not simply a bad decision by a weary pilot. Organizational decisions had created an environment in which mistakes were bound to occur.
The SHELL Model in Practice
The human factors approach investigates how individuals interact with everything around them. One useful model is the SHELL model (Software, Hardware, Environment, Liveware). Accidents rarely occur for a single reason; they are typically a combination of failures. As in James Reason's "Swiss cheese" model, holes in the defensive layers align, and hazards pass through.
Take the case of Air France 447 in 2009.
The crew's reaction to the failure of the pitot tubes played an important role. However, a systems analysis reveals a number of contributing factors:
- The sensor design left the pitot tubes vulnerable.
- The crew had no training in high-altitude stall recovery.
- The crew was overly dependent on automation.
- Stress degraded crew resource management.
- Knowledge of the fly-by-wire safeguards was lacking.
Each layer of defense was weak on its own; combined, the weaknesses proved fatal.
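The layered-failure idea above can be sketched in a few lines of Python. This is purely illustrative: the layer names and their pass/fail states are assumptions made for the example, not findings from the AF447 investigation.

```python
# Illustrative "Swiss cheese" sketch: a hazard reaches the outcome only
# when every defensive layer has failed (every hole is aligned).

def hazard_passes(layers: dict) -> bool:
    """Return True if the hazard penetrates every defensive layer.

    Each value is True when that layer FAILED (a 'hole' is open).
    """
    return all(layers.values())

# Hypothetical layer states loosely mirroring the contributing factors
# listed above; names and booleans are assumptions for illustration only.
af447_layers = {
    "sensor_design": True,        # pitot tubes vulnerable
    "stall_training": True,       # no high-altitude stall recovery training
    "automation_reliance": True,  # over-dependence on automation
    "crm_under_stress": True,     # degraded crew resource management
    "fbw_knowledge": True,        # gaps in fly-by-wire knowledge
}

print(hazard_passes(af447_layers))  # every hole aligned -> True
# If even one layer holds, the hazard is trapped:
print(hazard_passes({**af447_layers, "stall_training": False}))  # -> False
```

The design point is that no single `True` value causes the accident; only the conjunction does, which is why fixing any one layer can break the chain.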
Organizational Culture as a Safety Factor
This has become evident during my safety audits of local carriers. An organization's safety culture indicates whether it takes a proactive or a reactive approach to human factors. Airlines with strong safety management systems (SMS) encourage the reporting of mistakes and near misses without punishing anyone. This approach collects valuable information and helps identify the weak points in their systems that could lead to accidents.
One airline I worked for had a confidential reporting system. It revealed a disturbing pattern of unstabilized approaches at one airport. ATC frequently issued late runway changes during rush hour, which added pressure to stick to the schedule. The airline likely avoided a controlled flight into terrain (CFIT) accident. They resolved the problem by improving coordination with ATC and easing schedules at those critical times.
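The value of such a reporting system can be shown with a minimal sketch. The report records and field names below are hypothetical; the point is only that aggregating confidential reports makes a cluster at one airport stand out, prompting a systems review rather than blame for individual crews.

```python
# Minimal sketch of trend detection in a confidential reporting system.
# Report records and identifiers here are invented for illustration.
from collections import Counter

reports = [
    {"airport": "XYZ", "event": "unstabilized_approach"},
    {"airport": "XYZ", "event": "unstabilized_approach"},
    {"airport": "ABC", "event": "altitude_deviation"},
    {"airport": "XYZ", "event": "unstabilized_approach"},
]

# Count occurrences of each (airport, event) pair.
trend = Counter((r["airport"], r["event"]) for r in reports)

# The cluster at one airport surfaces immediately.
print(trend.most_common(1))  # [(('XYZ', 'unstabilized_approach'), 3)]
```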
Practical Applications for Safety Enhancement
The systems approach offers practical steps for accident prevention.
- First, companies should invest in strong crew resource management (CRM) training, focused on communication, decision-making, and workload management. Modern CRM recognizes that technical skill counts for little when a crew cannot work together effectively under pressure.
- Second, replace rigid duty-time rules with a fatigue risk management system (FRMS). Scheduling that accounts for circadian rhythms, cumulative fatigue, and individual differences reduces error.
- Third, adopt a just culture. Such a culture distinguishes three categories of behavior and matches the response to each:
  - Honest errors call for system fixes.
  - At-risk behaviors call for coaching.
  - Reckless actions warrant discipline.
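The three-way triage above can be sketched as a simple lookup. The category names and response wordings are assumptions chosen to mirror the bullets; this is an illustration, not an operational policy engine.

```python
# Hedged sketch of a just-culture triage: each behavior category maps to
# an organizational response. Names and wordings are illustrative only.

RESPONSES = {
    "honest_error": "fix the system that allowed the error",
    "at_risk_behavior": "coach the individual and remove incentives for shortcuts",
    "reckless_action": "apply disciplinary action",
}

def just_culture_response(category: str) -> str:
    """Map a classified behavior to the organizational response."""
    try:
        return RESPONSES[category]
    except KeyError:
        raise ValueError(f"unknown category: {category}") from None

print(just_culture_response("honest_error"))
# -> fix the system that allowed the error
```

The deliberate design choice is that "who responds how" is decided by the category of the act, not by its outcome: the same honest error gets a system fix whether it led to a near miss or to nothing at all.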
Conclusion
The human factors approach shifts the focus. Rather than asking who made the error, it examines the system and asks: why did our system allow this error to occur? This does not remove individual accountability. It recognizes that systems designed around real human performance deliver safety improvements; systems that count on flawless human performance do not.
Authored by: Dean Rotchin, CEO and Founder, Blackjet