HF has evolved from the early Cockpit Resource Management (CRM) of the 1980s, through Crew Resource Management in the 1990s, acknowledging that more than just pilots are involved in the safe operation of an aircraft. The evolution has continued through to today's requirement to have HF/NTS integrated into an SMS.
Personnel in aviation are highly trained in technical skills. It is, however, the cognitive ability associated with successful HF/NTS that complements technical skills to contribute to safe and efficient task performance. HF/NTS includes an understanding and awareness of:
- Leadership (and ‘followership’)
- Alcohol and Other Drugs
- Situational Awareness
- Decision making
- Threat and Error Management
- …and others (this is not an exhaustive list).
Human Factors can be considered as ‘smudges’ on a virtual window that separates reality from perception as detected through cognitive and physical senses.
Human Factors is about human performance, behaviour, error, and an understanding of how humans interact with procedures, technology, environment and other humans.
Ultimately it is unsafe behaviours and actions that lead to accidents.
Behaviours are influenced by multiple inputs, commencing in the formative years when our personalities are being formed.
The SHELL Model
The SHELL model is used by ICAO to identify where HF breakdowns might occur.
An encompassing view of culture needs to be considered to expand the scope, and to understand the full spectrum of HF influence, both positive and negative.
Breakdowns between the human (the Liveware component in the middle) and any of the surrounding components can give rise to HF issues and possible error.
Consider the following table:
The above Breakdowns/Failures are examples with possible outcomes.
Consider the following table with the same Breakdowns/Failures but with different outcomes.
Clearly the consequences, cultural influences, and potential causes can be greatly varied. They may be the result of active or latent failures which may only become evident through the investigation of the event.
Other components of the SHELL model may also interact. For example, a weather event (Environment) causes a ground equipment failure (Hardware) that has consequences for pilots (central Liveware). However, this is an E-H problem rather than an HF issue, as it does not involve the L-H or L-E interfaces.
The Reason Model
The Reason model of accident causation describes how breaches of multiple system defences can result in an accident. Reason argues that in complex, well-defended systems like aviation, single-point failures are rarely consequential. Defence failures (breaches) can be either active or latent.
An Active failure is an action or decision, regardless of the motivation (slip, lapse, mistake or violation), that results in a defence layer being breached.
For example, maintenance crews using workarounds to achieve operational efficiency when they know the procedure may be contrary to SOPs.
A Latent failure is more insidious: it lies in wait, unknown until discovered.
For example, an organisational manual that details a company procedure which contradicts the OEM manual, perhaps one that prohibits that very action.
The PEAR model – an HF model for Engineers
Although appropriate and applicable to maintenance, both the SHELL and Reason models were conceived and fitted for flying operations. Their use in the maintenance environment tended to be a ‘bolt-on’.
The PEAR model was developed specifically with maintenance HF in mind. It has been widely used in both EASA and FAA HF training for maintenance personnel, and has now been adopted by CASA for HF training in CASR Part 145 organisations.
The Human Factors Analysis and Classification System (HFACS) can be used as an alternative to the SHELL model. It is based on James Reason’s accident causation model (Swiss Cheese) and was developed by behavioural scientists in the United States Navy.
The HFACS framework provides a tool to assist in the investigation process and target training and prevention efforts. Investigators are able to systematically identify active and latent failures within an organisation that culminated in an accident. The goal of HFACS is not to attribute blame, it is to understand the underlying causal factors that lead to an accident.
The aim of the models is to generate thoughts and discussion about the HF breakdowns involved in an incident, not necessarily to neatly categorise where each breakdown belongs. This will become apparent in Exercise 1.
CASA Drama – Crossed Wires (From the CASA YouTube channel)
CASA Drama – The Right Connections (From the CASA YouTube channel)
What the Experts say: Human Factors within an Organisation (From the CASA YouTube channel)
CRM the old way!
A different perspective – the Elaine Bromiley case – UK 2005
‘What you and I do is remarkably similar; we take people from a place of safety – we dangle them in the jaws of death for several hours, completely unbeknownst to them – and deliver them to another safe place where they are really none the wiser of just how frightening their circumstances have just been.’ (RFDS Anaesthetist to Pilot conversation, 2014)
Elaine Bromiley was a 37 year old UK mother of two who was admitted for routine elective sinus surgery on the recommendation of her GP. She did not survive.
At 0835 she was put under general anaesthetic and the anaesthetist set about trying to administer oxygen through a laryngeal mask. He was unable to fit the mask and tried several sizes. He proceeded to try to intubate but encountered obstruction from the soft palate. Realising the seriousness of the situation, he called another anaesthetist and an ENT surgeon. The three doctors worked feverishly to try to get oxygen to Elaine. At 0847 her heart rate and oxygen saturation were both down to 40 and deep cyanosis was setting in. The senior theatre nurse observed Elaine’s colour, retrieved a tracheostomy kit and announced this to the doctors. The doctors looked at her but none spoke. Another nurse noted the symptoms and immediately notified ICU of an inbound patient. Again the doctors ignored her, and the ICU bed was cancelled. They continued a further 10 minutes before they managed to secure an airway – over 20 minutes without oxygen for Elaine. She had been in a coma since the operation, and died 13 days later when her husband made the decision to turn off her life support.
It was a tragic outcome for a team of very experienced, well-meaning professionals doing their level best to deal with an unanticipated complication. The doctors, when interviewed, were collectively flabbergasted when it was pointed out that 22 minutes had elapsed without Elaine having oxygen – they simply couldn’t believe it. This was the human element acting under stress: time compression and a loss of situational awareness. The doctors simply lost track of time.
The other crucial aspect was that others in the team had the critical insight that could have averted tragedy, but for whatever reason were unable to bring that information to the key decision makers in a way that would change the trajectory of the occurrence. Elaine Bromiley’s husband Martin was an airline pilot who happened to specialise in HF. In the sad aftermath of being told, with his two young children, ‘Martin, we are terribly sorry…there were complications…it is simply one of those things…’, he naturally asked when the investigation would take place, so that the lessons learnt could be used and other families would not have to suffer as they had. He was told there was never an investigation unless litigation was involved.
As a group – let’s discuss the HF failures of this case study
Human Factors – Antarctica
Track maintenance using GPS