
Christchurch International Airport Limited

What is an SMS?

A Safety Management System (SMS) is a set of integrated elements that work together to promote safety, create a safety culture and thereby enhance efficiency. It protects workers, helps maintain quality, and promotes the reputation of CIAL.

The NZ government follows the recommendation of the International Civil Aviation Organisation (ICAO) that SMS be integrated into all commercial aviation operations. New Zealand legislation to enforce this commenced in February 2016 and will be complete by February 2021. CIAL was part of the first group to achieve compliance with CAA NZ, in July 2017.

Compliance does not mean safety. The SMS will continue to evolve from ‘present and suitable’ to ‘operational and effective’. Part of this progression is the training and awareness of all staff.

SMS consists of:

  • Safety Policy and Accountability
  • Coordinated Emergency Response Planning
  • Development, Control and Maintenance of Safety Management Documentation
  • Hazard Identification
  • Risk Management
  • Safety Investigation
  • Monitoring and Measuring Safety Performance
  • Management of Change
  • Continuous Improvement of the SMS
  • Internal Audit Programme
  • Management Review
  • Safety Training and Competency
  • Communication of Safety-Critical Information

How does this work as a system?  Which ones really affect me?

From CAA NZ Booklet 2

What is Safety?

“Safety is the state in which the possibility of harm to persons or of property damage is reduced to, and maintained at or below, an acceptable level through a continuing process of hazard identification and safety risk management.” (ICAO SMM)

Safety is relative, and acceptance of risk is a management decision that balances risk with profit.

In aviation, safety is achieved by taking deliberate actions to achieve predictable results, and by maintaining a sense of ‘chronic unease’.

Can we eliminate all accidents and serious incidents?

Can we reduce risks to negligible levels?

Can we eliminate all errors?

Can we regulate our way to safety?

Practical drift and normalisation of deviation

Practical drift is where a system’s baseline performance ‘drifts away’ from its design parameters. This could be due to a system being utilised more effectively than expected, where performance exceeds the intention as the operator uses initiative and innovation to maximise efficiency. Unfortunately, it could also result in less-than-expected performance due to misuse, misunderstanding, or a lack of appropriate training or supervision.

It is termed ‘drift’ because it is the inevitable result of daily use, where the movement away from baseline performance is barely detectable and is driven by external circumstances outside the design criteria of the system.

Normalisation of deviation is the intentional violation of procedure that occurs so regularly that it becomes the norm. The lack of a negative outcome produces the illusion that deviation from normal procedures is acceptable. Many accidents have occurred as a consequence. The saying ‘…but we’ve been doing it this way for years…’ is often cited in defence of the indefensible.

SMS vs QMS vs WHS

What is the difference?


Safety Culture

What is it?

 ‘…the responsibility you bear me, I bear you…’

From Professor James Reason
Without a safety culture, you don’t have an SMS – it’s that simple. If people don’t feel empowered to contribute to it, they won’t feel inclined to engage with it, and therefore won’t be protected by it.
The safety culture in an organisation is integral to the influence of Human Factors, both positive and negative.
Safety culture needs to be resilient. It must survive breaches of confidence, lapses by individuals and inevitable mistakes. However, it will never survive management indifference.

Preservation of the safety culture comes with management commitment. When management ‘walks the walk’, all workers feel empowered to contribute to a positive safety culture. Damage to the safety culture comes with management inflexibility, inconsistency or hypocrisy over safety standards.

Human Factors

Humans make mistakes. They are inherently susceptible to errors, lapses, slips and even violations. A complex safety system seeks to contain and control errors, not eliminate them.

Human Factors is the understanding of the influences that create errors, as a means of defending against their consequences. These influences are:

  • Communication
  • Leadership
  • Teamwork
  • Stress
  • Fatigue
  • Alcohol and Other Drugs
  • Situational Awareness
  • Decision making
  • Airmanship
  • Assertiveness
  • …(others – this is not an exhaustive list)

Aviation personnel receive Human Factors training. It is the understanding and awareness of these cognitive, non-technical skills that lead to greater efficiency and safety in the employment of technical skills.

Human Factors is the skill set that allows personnel to participate in a Safety Culture.

  • Will I tell the boss bad news?
  • Do I have the courage for safety?
  • Can I speak up if required?

Management Commitment and Accountability

As part of the SMS, management will write a Safety Policy and set KPIs and Safety Objectives. Key personnel will be employed to manage safety. Managers will be held accountable for the safety of personnel and the reputation of CIAL. Most importantly, management will commit to a Just Culture.

A fair and Just Culture means protection of reporters, so long as there is a clear distinction between acceptable and unacceptable behaviour.
“If you report something – even if it is a mistake that you made – we will not take punitive action against you.  However, if you have willfully disregarded known procedures, acted negligently, criminally or flagrantly disregarded clear rules, you must bear the consequences of that decision…”
The Just Culture flowchart used at CIAL is derived from the Decision Making Tree by Professor James Reason.
This flowchart tracks the decision-making route taken in the event of an error made by an employee or contractor at CIAL.

Reporting of Accidents, Incidents, Near misses and Hazards

The Heinrich Pyramid (SKYbrary)

Aviation is a high-reliability industry. It is interested in what has gone wrong, what could have gone wrong and what might go wrong. This is how data is gained to defend against low-frequency, high-consequence events.

Reporting is the engine room for data gathering.
‘Experience is the cruelest teacher of all. It gives the results first, and the lessons after.’
Oscar Wilde

The identification of hazards is the responsibility of every person at CIAL. Hazards can be identified through many sources;

  • Brainstorming – getting groups together in the workplace to nominate hazards
  • Formal review of standards, procedures and systems
  • Staff surveys and questionnaires
  • Critically reviewing operations
  • Hazard and incident reports
  • ‘War-gaming’ risk scenarios
  • Trend analysis
  • State investigations of incidents and accidents
  • Collaboration with similar operations
  • Flight Data Analysis Programs

Adaptation of the Heinrich pyramid (Safety Culture Blog)

Your contribution to the SMS is critical to its success and progression. The most meaningful and important contributions you can make are to report what you see, identify hazards as part of your team, and have the courage to speak up when required.

What sorts of things could we report?

  • A physical hazard that could injure a person
    • Example – fuel source located near an ignition source
  • A Procedure that compromises safety
    • Example – Baggage containers not being checked for serviceability
  • A Practice that has become Normalisation of Deviance
    • Example – Not setting park brake when stopped after towing an aircraft
  • A Process that is inadequate
    • Example – No system to monitor expiration dates of security passes

CIAL uses SERF for the management of safety reporting.

  1. Person reports an event at CIAL
  2. One-up manager will initially investigate/review the event
  3. Safety Team Admin review will determine the level of investigation
  4. There will be a simple or ICAM investigation depending on the risk level
  5. Corrective actions will be determined and recommended
  6. The investigation will be finalised with delegated actions to be carried out

What does this look like?

It is important that personnel understand an investigation is not a blame-allocation exercise. It is about understanding what went wrong, what lessons can be learned, and how to prevent a recurrence.

CIAL personnel can expect a ‘closing of the loop’ – feedback – on reports submitted.

Reasons to report – and why people do not…

Yes – I should report it           |  No – I will hide it
-----------------------------------|---------------------------------------
I don’t want an accident to occur  |  I might get fired
I know I’m supposed to             |  Nothing will change
It’s easy to fill out a report     |  I’ll look foolish
It’s the law                       |  I might lose my licence
The system will support me         |  They will look for someone to blame
It might happen again              |  It’s too hard
We can learn from this             |  It’s unimportant
Somebody saw me…                   |  Nobody saw me

Safety Leadership

Good managers contain complexity
Good leaders influence people
It is possible to be both
It is possible to be neither
Leaders are not necessarily managers
Everyone is a safety leader
Safety is a shared responsibility