
This month, I've been a bit obsessed with the topic of air safety. Well, why wouldn't I be? Safety must always be our top priority. Data pulled from the Aviation Safety Network for the month of November shows more than 200 crashes, and that number doesn't include other incidents such as near misses. Statistics show that an overwhelming majority of air accidents are the result of what are known as "human factors," and in my opinion, the aviation community urgently needs to do whatever it takes to bring that number down.

A crashed aircraft

Human factors defined

So, just what are human factors anyway? "Human factors" is a blanket term for the causes of a mishap that cannot be attributed to structural faults, system malfunctions, or "acts of God." In other words, the things that go wrong because humans are human.

Human factors can be the sole contributor to an accident, or they can occur simultaneously with system malfunctions or structural faults. The term can be applied to accidents caused by pilots, mechanics, engineers, cabin crew, ground crew, passengers, and any combination of these options.

Even people far removed from the flight deck, such as airline managers, can contribute to an accident in the air.

The point I hope to make here is that everyone in the aviation community, including passengers, has a role to play in helping to keep aviation as safe as it possibly can be.


Other accident causes

Human factors are a leading cause of aviation accidents, but there are many other potential causes, so it is only fair to list some of them.

Other causes can include:

  • Structural problems
  • System malfunctions
  • Serious weather events
  • Natural disasters (e.g., volcanic eruptions)
  • Unpredictable, non-preventable events in flight (e.g., poison gas emission)

Some of these are far-fetched and unlikely to occur often in reality. In fact, accident causes usually overlap, which is the insight behind the "Swiss cheese model" of accident causation developed by the aptly named James Reason.

This model demonstrates that single-cause accidents are rare. Most of the time it is a combination of many factors that leads to the ultimate failure. In aviation investigations, we refer to this as "the chain of events." If each event in the timeline is represented by a slice of cheese, then looking back we can see how the holes align to create the disaster. Ultimately, many accidents can be traced back to an originating cause far from the airport, one that has nothing to do with what the investigation report identifies as the primary cause.
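
To make the idea a little more concrete, here is a minimal toy simulation of the model in Python. The layer names and "hole" probabilities are invented purely for illustration (they are not real aviation statistics); the point is simply that each layer on its own stops the vast majority of errors, yet every so often the holes line up.

```python
import random

# Toy illustration of the Swiss cheese model. The layer names and the
# probabilities of a "hole" are invented for demonstration only; they are
# not real aviation statistics.
LAYERS = {
    "airline management": 0.10,
    "air traffic control": 0.05,
    "crew procedures (CRM)": 0.05,
    "cockpit warnings (GPWS, etc.)": 0.04,
}

def holes_align(layers):
    """An error only becomes an accident when every layer fails at once."""
    return all(random.random() < p for p in layers.values())

flights = 1_000_000
accidents = sum(holes_align(LAYERS) for _ in range(flights))
print(f"{accidents} accidents in {flights:,} simulated flights")
# Each layer alone stops 90-96% of errors, yet the holes still align a
# handful of times per million flights - the essence of Reason's model.
```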

For example, in the fatal crash of AA965, pilot error was identified as the primary cause. While pilot error certainly played a prominent role in the outcome, unraveling the chain of events reveals that the originating cause was the sabotage of the Cali ground radar station by FARC forces.

Despite being the originating cause (meaning the crash would likely not have occurred at all had the approach radar been functioning correctly), it is barely mentioned in the investigation report.

If you have never been to Cali, it may not be obvious why a functioning approach control radar is so important. This is the radar system used at terminal radar approach control (TRACON) facilities to identify and locate aircraft as they approach and depart from airport terminal areas.

When there is no radar, or when it is not functioning, the controller has no way to visually confirm an aircraft's position. Even worse, although not relevant to this case, the controller receives no warning of potential separation conflicts (which occur when aircraft fly too close to each other). Alternative methods are used to try to keep aircraft separated, but they are far from ideal or even reliable.
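
To see in concrete terms what a radar picture lets a controller (or an automated system) do, here is a minimal sketch of a separation check in Python. The 5 NM lateral and 1,000 ft vertical figures are typical radar separation minima used purely as assumptions here, not the specific rules that applied at Cali, and the coordinates are made up.

```python
import math

# Illustrative only: typical radar separation minima, used as assumptions.
LATERAL_MIN_NM = 5.0
VERTICAL_MIN_FT = 1000.0

def lateral_distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two positions, in nautical miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    cos_d = (math.sin(lat1) * math.sin(lat2)
             + math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return math.acos(max(-1.0, min(1.0, cos_d))) * 3440.065  # Earth radius in NM

def in_conflict(a, b):
    """Flag a conflict when BOTH the lateral and vertical minima are violated."""
    lateral = lateral_distance_nm(a["lat"], a["lon"], b["lat"], b["lon"])
    vertical = abs(a["alt_ft"] - b["alt_ft"])
    return lateral < LATERAL_MIN_NM and vertical < VERTICAL_MIN_FT

# Two made-up aircraft positions near a terminal area.
a = {"lat": 3.54, "lon": -76.38, "alt_ft": 9000}
b = {"lat": 3.58, "lon": -76.40, "alt_ft": 9400}
print(in_conflict(a, b))  # True: roughly 2.7 NM apart and only 400 ft vertically
```

Without radar, the controller has neither the positions to feed into a check like this nor any independent way to verify the positions the pilots report.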

Cali, like many other South American cities, is located in a wide valley between two very steep mountain ranges. This type of terrain can create unique weather phenomena, which adds to the potential problems for pilots and controllers.

Personally, I feel that traffic should have been diverted away from Cali until the damaged radar had been fixed. This would have been the "safety first" approach; however, there are several reasons why the authorities would have chosen not to do so:

  • It was Christmas time.
  • The most convenient alternative airport was at Medellín, 400 km (249 miles) away, a journey of several hours by road.
  • Pilots, passengers, and locals would have been severely inconvenienced by a diversion.

So, while I believe a diversion alone could have been enough to prevent the tragedy, there were legitimate reasons why it was not done. Indeed, once the plane was suspected to have crashed, the authorities reportedly told those waiting at the airport that the plane was circling overhead because the airport had been closed.

With the originating cause secreted away, the other contributing factors must come under scrutiny. And it is here that the investigators unanimously concluded that human factors were the primary cause of the crash.


How human factors create risk

Humans, for all their wondrous achievements, are still incredibly vulnerable creatures. Driven by a combination of hormones and sheer will, they can be weakened by fatigue, hunger, thirst, injury, illness, medications, and illicit substances.

Any of these conditions being present can cause the incredible human machine to malfunction. When multiple conditions are present simultaneously, serious errors can result.

As mentioned earlier, everyone connected to the aviation industry, including passengers, is subject to human factors. But those with the most responsibility are pilots.

As a pilot, you need to be as aware of your own internal state as you are of what's happening inside and outside the cockpit.

The most important time to perform this self-evaluation is just before flying, but you should also make it a habit to check your internal state at regular intervals during the flight.

Another task that should not be neglected is establishing good crew resource management (CRM). If possible, developing a rapport with your fellow pilot and the other crew members will also contribute to a positive environment in the cockpit.

What gives the tragic fate of AA965 an extra edge is that both pilots were very experienced and well-trained. They should have been aware of the human factors influencing their poor decisions and should have taken corrective action once they realized they were not functioning optimally.

One thing that has become apparent to me as an instructor is that when an inexperienced student is at the controls of an airplane, they will be paying keen attention to everything and will try to do everything "by the book."

It is when pilots have more experience that they can start to become complacent. With that in mind, let's take a look at the chain of events affecting AA965 from a human factors perspective and see if these factors could have been mitigated. For the sake of brevity, we will ignore factors that don't apply to the situation.

Actual chain of (relevant) events

infographic showing the timeline of the incident

With this timeline established, we can now analyze the human factors affecting the flight crew and determine how they could have been mitigated.

  1. Sabotage: if the FARC guerrillas had not disabled the radar, the controller would have noticed that the aircraft was not in the position reported by the pilots and could have attempted to warn them.
  2. Taking the path of least resistance: there is a tendency in aviation management, which has spread to other authorities, to prioritize passenger convenience over safety. This unfortunate tendency has been a factor in many crashes. Instead of closing the airport to inbound traffic, which was an option (in my opinion, the only appropriate option), the authorities chose to remain open in order to cause the least possible disruption. To be fair, only one plane crashed that night. But perhaps that was one too many.
  3. Language: the controller later stated that he wanted to warn the crew that the plan to fly direct to Rozo did not make any sense, but that his English language skills were not sufficient to explain why. Furthermore, the Captain of AA965 exhibited poor use of Aviation English when he checked in with "American 965, leaving Flight Level two four zero, descending to two zero zero." This is non-standard phraseology similar to the error that had caused a fatal crash in Malaysia just 7 years earlier. While this slip didn't play a role in the crash of AA965, it was an unacceptable lapse of radio discipline when communicating in Aviation English with a non-native English speaker.
  4. Temporal anxiety: the flight left Miami 2 hours late. This created severe anxiety in the Captain because of his concerns about the legal problems it would cause for the crew. FAA rules require crew members to take a mandatory 10-hour break once their total flight time within 24 hours exceeds 8 hours, and there are also weekly, monthly, and annual limits on the number of hours that can be flown. The Captain's concern was that any additional delay in the arrival time at Cali risked pushing the crew over the flight duty limit, which would have prevented them from returning to Miami on the next flight (see the sketch after this list). This appears to have induced a state of "get-there-itis" in the Captain, causing him to prioritize the arrival time.
  5. Poorly executed CRM: on multiple occasions, the crew exhibited technical errors in CRM procedure, including the Captain's fateful incorrect data input. If the pilots had been monitoring the CDU and EHSI as they were supposed to, the error that caused the plane to go off course would have been noticed and corrected.
  6. Inattention / willful ignorance: on more than one occasion, the Captain gave incorrect readbacks to communications from the controller, and keyed in a direct approach to Cali when ordered to proceed via Tulua. When reminded of the importance of reporting in after crossing Tulua, the Captain failed to check and update the FMS, leaving it set direct to Cali.
  7. Poor planning: during the later stages of the flight, the pilots exhibited a lack of familiarity with the planned route, and gave clear indications of not knowing key information from the approach chart for the Rozo One arrival. As the flight was delayed for 2 hours before taking off, there was plenty of time to study the route and prepare properly for the flight.
  8. Inability to recognize and accept fault: once it became evident that the pilots were lost, they indicated a lack of faith in the navigational data displayed by the instruments and avionics, spending precious moments willing the waypoints to be where they expected them to be. When they finally conceded and entered the correct code for ULQ, they still did not take the correct action, which would have been to retract the flaps and speed brake, accelerate and climb to the previously assigned altitude, and recommence the Rozo One approach from Tulua (ULQ). Instead, they continued to decelerate and descend, having forgotten to fly the plane.
  9. Memory lapse under stress: the First Officer forgot the speed brake was deployed, and even after the first stall warning, neither pilot checked the speed brake or attempted to retract it. A second stall warning followed moments before the aircraft struck the terrain.
  10. Over-reliance on automation: both pilots trusted the FMS to guide them safely to their destination, and so did not perform the proper checks, apparently unaware that the FMS is just a computer that will unemotionally execute any order, even when the order is illogical and potentially dangerous.
  11. Over-reliance on Captain's experience: the report authors also decided that the First Officer had too much faith in the Captain's experience flying into Cali, and so had "relaxed vigilance" during the flight.
  12. Lack of situational awareness: the pilots failed to respond to repeated indications that they were lost and remained unaware of it until almost the last moment. They also apparently had no idea that they had crossed the mountain range, and did not realize how close they were to terrain until the GPWS activated.
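
As a back-of-the-envelope illustration of the pressure described in point 4, here is a small Python sketch of the arithmetic. It uses the 8-hours-of-flight-in-24 and 10-hour-rest figures exactly as described above, and the block times are invented round numbers, not the actual AA965 schedule.

```python
# Rough illustration of the duty-limit arithmetic behind "get-there-itis".
# The 8 h / 10 h figures are taken from the description above; the block
# times below are invented round numbers, not the actual AA965 schedule.
FLIGHT_HOUR_LIMIT_24H = 8.0   # max flight hours in a rolling 24-hour window
MANDATORY_REST_H = 10.0       # rest required once the limit is exceeded

def can_fly_return_leg(outbound_h, return_h, extra_airborne_h=0.0):
    """True if the crew can still fit the return leg inside the 24-hour limit."""
    total = outbound_h + return_h + extra_airborne_h
    return total <= FLIGHT_HOUR_LIMIT_24H

# Hypothetical ~3.5 h block time each way between Miami and Cali.
print(can_fly_return_leg(3.5, 3.5))        # True: 7.0 h, within the limit
print(can_fly_return_leg(3.5, 3.5, 1.5))   # False: holding or a diversion pushes past
                                           # 8 h and triggers the 10-hour mandatory rest
```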

Concluding remarks

In the dynamic landscape of aviation safety, we've witnessed commendable advances over the decades, particularly in recognizing the importance of CRM training and human factors education. Despite these advances, the aviation community still faces a critical challenge: the number of air crashes, predominantly attributed to human factors, has remained essentially flat over the past decade. We would expect safety to improve as knowledge expands, so stagnant crash numbers are a disappointing result.

This is not a call for despair but an opportunity for improvement. We've come a long way, and we can go further! As you embark on your flights, remain vigilant to the Human Factors influencing both you and your crew. Remember that air traffic controllers are human too, susceptible to errors. Communicate with politeness and accuracy, and don't hesitate to address potential mistakes.

Plan each flight meticulously, committing critical information about your route to memory. Maintain unwavering situational awareness throughout the journey. Most importantly, cherish and enjoy the flight. Your mood is not just a personal matter but a key factor in a safe and successful outcome.

Let's build on our achievements, learn from our experiences, and collectively contribute to a safer future in aviation.