NGO Security Incident Reporting: Towards a “Just Culture” (Part I)


Written by: Ebe Brons, Director, Centre for Safety and Development, ebe@centreforsafety.org

 

“The single greatest impediment to error prevention … is that we punish people for making mistakes.” Dr. Lucian Leape, Professor, Harvard School of Public Health

 

Getting security incidents reported within NGOs is incredibly difficult. Many NGO managers are aware that only a small percentage of security incidents are reported. They just do not know the exact numbers and are left with the feeling that they are missing something. They are probably right.

In this article (part I) we review how other industries deal with this challenge. In a follow-up article (part II) we will look at practical NGO cases and what we can learn from them. The intention of both articles is to develop a practical way of working for incident reporting that can be used throughout the NGO community.

 

Improving security

“People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.” Don Norman, author of The Design of Everyday Things

 

The problem of a no-reporting culture is more than just not being “in the know”; it is about not being able to take responsibility for the safety and security of your colleagues, and about not being able to improve your security system. Without the input from incident reports you never really know what works and what does not when you try to improve organisational security. Knowing what goes wrong inside your organisation enables you to learn and adapt, and to take appropriate action to prevent the incident from happening again.

The idea is that security systems are not flawless and have to be adapted all the time; they are never in an end-state. A security system is an organic system that continually has to adapt to a changing environment, and it needs to be fed with information. That information comes from aid workers who work in environments where security incidents can occur. The aid workers, in turn, benefit from the security system because it reduces the risk of security incidents.

 

Improving adaptation in a Just Culture

Why are security incidents underreported by our staff? Simply put: fear and ignorance. First, fear: incidents are embarrassing and can lead to punitive action, so not reporting them saves your staff shame and punishment. If you do not acknowledge these factors, you will never get the right culture, one in which reporting is considered professional. You have to address the fears of your staff and build a system that allows them to report without fear of punishment or humiliation.

Secondly, ignorance. Most NGOs have been dealing with security at a more professional level, and with higher standards, for several years now. These newer security standards are not always commonly known in the field. As a result, staff members on the ground do not always recognise an incident as a reportable situation. When an unsafe situation ends well, staff are relieved and go back to work. Reporting the incident does not always seem logical to them, as it is not common practice.

For incident reporting to work, staff need to know when to report, must feel empowered to report, and must know that management has their back because they are on the same side. In the aviation and medical industries this culture is called a Just Culture.

Dr. James Reason (Department of Psychology, University of Manchester) described a Just Culture as an atmosphere of trust in which people are encouraged, and even rewarded, for providing essential safety-related information, but in which they are also clear about where the line must be drawn between acceptable and unacceptable behaviour.

 

Acceptable and unacceptable behaviour

Just Culture is not the same as a blame-free culture. Staff are accountable for their actions, but there is a distinction between acceptable and unacceptable behaviour. Drawing this line between acceptable and unacceptable behaviour is key to establishing a Just Culture. The Institute for Safe Medication Practices (ISMP) describes the different types of behaviour that lead to incidents on its website:

 

How does the organisation respond to human error, at-risk behaviour, and reckless behaviour?

Three types of behaviour should be anticipated in an organization. Each type of behaviour has a different cause, so a different response is required.

 

Human error involves unintentional and unpredictable behaviour that causes or could cause an undesirable outcome; it is not a behavioural choice—we don’t choose to make errors. Since most human errors arise from weaknesses in the system, they are managed within a Just Culture through system redesigns that reduce the risk of errors.

Discipline is not warranted or productive because the worker did not intend the action or any undesirable outcome that resulted. In a Just Culture, the only just option is to console the worker who made the error and to redesign systems to prevent further errors.

 

At-risk behaviours are different than human errors. Behavioural research shows that we are programmed to drift into unsafe habits, to lose perception of the risk attached to everyday behaviours, or mistakenly believe the risk to be justified.

 

Our decisions about what is important are typically based on the immediate desired outcomes, not delayed and uncertain consequences. Over time, as perceptions of risk fade away and workers try to do more with less, they take shortcuts, violate policies, and drift away from behaviours they once knew were safer.

 

These at-risk behaviours, often the norm among groups, are considered to be “the way we do things around here.” In a Just Culture, the solution is not to punish those who engage in at-risk behaviours, but to uncover and remedy the system-based reasons for their behaviour and decrease staff tolerance for taking these risks through coaching. 

 

Reckless behaviour, in comparison to at-risk behaviours, means that workers who behave recklessly always perceive the risk they are taking and understand that it is substantial. They behave intentionally and are unable to justify the behaviour (i.e., do not mistakenly believe the risk is justified). They know others are not engaging in the behaviour (i.e., it is not the norm).

 

The behaviour represents a conscious choice to disregard what they know to be a substantial and unjustifiable risk. In a Just Culture, reckless behaviour is blameworthy behaviour. As such, it should be managed through remedial or disciplinary actions according to the organization’s human resources policies.

Source: http://www.ismp.org

 

This description of acceptable and unacceptable behaviour from the Institute for Safe Medication Practices makes clear that you have to differentiate between responses. Although many types of behaviour fall in a grey zone, the diagram below, from David Marx (www.justculture.org), shows that they can be categorised. Depending on whether the behaviour was a human error, at-risk behaviour or recklessness, an appropriate response can be chosen.

 

Diagram: accountability of behaviour

Human error
• Behaviour: inadvertent action; a slip, lapse or mistake
• Response: manage through changes in processes, procedures, training and redesign of the system
• Type of response: console

At-risk behaviour
• Behaviour: a choice; the risk is not recognised or is believed to be justified
• Response: manage by removing incentives for at-risk behaviour, creating incentives for healthy behaviour and increasing situational awareness
• Type of response: coach

Reckless behaviour
• Behaviour: conscious disregard of unreasonable risk
• Response: manage through remedial or punitive action
• Type of response: punish
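For organisations that track incidents in a simple database or script, the diagram above can also be written down as a small decision table. The sketch below is a hypothetical Python illustration (the names BehaviourType, RESPONSES and recommended_response are invented for this example, not part of any existing tool); it only encodes the console/coach/punish logic described above and assumes the behaviour has already been classified by a manager.

```python
from enum import Enum

class BehaviourType(Enum):
    HUMAN_ERROR = "human error"        # inadvertent slip, lapse or mistake
    AT_RISK = "at-risk behaviour"      # risk not recognised or believed to be justified
    RECKLESS = "reckless behaviour"    # conscious disregard of unreasonable risk

# The decision table from the diagram above: behaviour type -> (response, typical actions).
RESPONSES = {
    BehaviourType.HUMAN_ERROR: ("console",
        "redesign processes, procedures, training and the wider system"),
    BehaviourType.AT_RISK: ("coach",
        "remove incentives for at-risk behaviour, reward healthy behaviour, raise situational awareness"),
    BehaviourType.RECKLESS: ("punish",
        "remedial or punitive action under the organisation's HR policy"),
}

def recommended_response(behaviour: BehaviourType) -> str:
    """Return the Just Culture response for a behaviour that has already been classified."""
    response, actions = RESPONSES[behaviour]
    return f"{response}: {actions}"

print(recommended_response(BehaviourType.AT_RISK))
# coach: remove incentives for at-risk behaviour, reward healthy behaviour, raise situational awareness
```

The point of writing it down this way is that the response follows from the type of behaviour, not from the severity of the outcome or the seniority of the person reporting.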

 

Case study

In 2002 the Danish government passed a law on non-punitive, confidential reporting at Naviair, the Danish air traffic control service provider. The case is shortened here to its highlights (the full case description is available on flightsafety.org). Although the aviation industry is not the same as the NGO world, we can learn from this case and use it for our own benefit.

 

The implementation of a Just Culture was highly successful within Naviair. The number of reports in Danish air traffic control rose from approximately 15 per year to more than 900 in the first year alone.

 

The Danish system includes the following:

• Mandatory: Air Traffic Controllers must submit reports of events. It is punishable not to report an incident in aviation.

• Non-punitive: Reporters are guaranteed indemnity against prosecution or disciplinary action for any event they have reported, based on the information contained in the submitted reports.

• Confidential: The reporter’s identity may not be revealed outside the agency dealing with occurrence reports. Investigators are obliged to keep information from the reports undisclosed.

 

After implementation the following lessons were learned:

 

• Trust/confidentiality – one break in this trust can damage a reporting system; all reports must be handled with care.

• Non-punitive nature – it is important that information from self-reporting will not be used to prosecute the reporter.

• Ease of reporting – Naviair uses electronic reporting, so that controllers can report whenever and wherever they have access to a computer.

• Feedback to reporters – the safety reporting system will be seen as a “paper-pushing” exercise if useful feedback is not given.

 

How would this work within an NGO?

A Just Culture starts with implementing a policy that is mandated by the (board of) directors. This is crucial, as it gives employees the guarantee that this is an organisation-wide commitment. The policy states a clear goal, for example that reporting is encouraged in order to improve the security system and to prevent incidents from happening to staff and the organisation in the future.

The policy states that you will not be punished when you report an incident. The information is used to improve the system, not to gather evidence against the reporter.

That being said, behaviour can have consequences when it is not acceptable: conscious, reckless behaviour can lead to punitive action. This distinction must be made clear to all staff, in the policy and through other means of communication.

Not reporting an incident, on the other hand, is grounds for disciplinary action by the manager. This is basically the reverse of most current systems. For this to work, staff need to know what constitutes an incident and when they have to report it.

The policy must define what an incident is and give clear examples of situations that have to be reported. This knowledge is communicated throughout the organisation in an effective manner. Simply put: everybody must be informed and know what to do.

The policy also states that the report is confidential. This allows the reporter to report without fear of embarrassment.

Reporting must be made easy for all staff. If reporting is difficult, fewer incidents will be reported. Hindrances include complicated procedures, not knowing how to report, or poor access to means of communication. In the NGO world flexibility is key: email or web-based systems will not always work in low-bandwidth environments, so reporting by SMS, telephone or other communication channels must be possible.
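To make the “easy to report from any channel” idea concrete, here is a minimal sketch, assuming a small in-house tool (the IncidentReport record and report_from_sms helper are hypothetical, not an existing NGO system). It shows how a report arriving by SMS could be normalised into the same simple record as a web or telephone report, so that every channel feeds the same analysis and follow-up.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    """Minimal, channel-neutral incident record (hypothetical schema)."""
    reported_at: datetime
    channel: str        # "sms", "phone", "web", "email", ...
    location: str
    description: str
    reporter_ref: str   # internal reference only, so the reporter's identity stays confidential

def report_from_sms(sender_ref: str, text: str) -> IncidentReport:
    """Turn a short SMS such as 'Goma | stopped at checkpoint, money demanded' into a record."""
    location, _, description = text.partition("|")
    return IncidentReport(
        reported_at=datetime.now(timezone.utc),
        channel="sms",
        location=location.strip(),
        description=description.strip() or text.strip(),
        reporter_ref=sender_ref,
    )

report = report_from_sms("staff-042", "Goma | stopped at checkpoint, money demanded")
print(report.channel, report.location, report.description)
```

A telephone call logged by a security focal point or a web form submission would populate the same fields, which keeps the reporting threshold low for staff in low-bandwidth locations.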

Reporters should receive feedback and be kept up to date on what happens with the information from their reports. Not seeing any result from your report is demotivating. It is important that regular analytical documents are produced and disseminated to all involved, so that reporters can see the result of sharing their information. Of course, adapting and improving the system is the best way to show that your organisation appreciates reporting, because that is the goal: improving the security system through reporting, to make working for an NGO safer.

 

Conclusion

NGOs operate in ever-changing and sometimes very volatile environments. Adapting your security system to these environments allows you to perform better in delivering your humanitarian services. To adapt your security system you need information, especially incident reports.

Implementing a Just Culture in your organisation can increase the number of incidents being reported. With more information at your disposal, you can learn and make more informed decisions.

A Just Culture makes incident reporting a positive and professional action. It allows open and transparent communication and supports a positive atmosphere inside your organisation. A Just Culture can lower the number of incidents that happen in the field, thus improving the safety and security of your staff and the performance of your organisation in delivering services or goods to its beneficiaries.

http://www.centreforsafety.org/


Comments


Comment by Ebe Brons on January 5, 2015 at 11:42am

Dear Christopher,

Thanks for your reaction and the comparison with NASA. I think this “get out of jail free” card feels very counter-intuitive for many organizations. It feels counter-intuitive to me! But I do think it works, and that it changes incident reporting from negative to positive behaviour. In the end the security system improves and everybody is safer. To be honest, it is not easy to get NGO staff to report incidents in the first place (as we are running the Simson system). I guess we, as NGO security managers, do not always succeed in making clear why security incident reporting is important. There is room for improvement!

Comment by Christopher Mayer on December 30, 2014 at 5:49pm

Excellent article. I would like to add that the aviation system in the United States has a similar program to that described for Denmark. NASA operates the Aviation Safety Reporting System. In this system, anyone who sees something that impacts on safety is encouraged to submit the report. The important feature here is that no adverse action can be taken against someone as the result of the report. It is designed to collect information to improve overall safety. The NASA system goes one step farther. If you self-report a safety violation in a timely manner, the Federal Aviation Administration may not take action against you (with the exceptions of gross negligence and willful reckless behavior). In this way it is a "get out of jail free" card. The idea is to improve overall safety and an honest report is taken as a demonstration that the aviation professional wants to be part of the process to improve safety. 

I think that is a model that can be used in other environments, such as NGO safety.

Comment by John Schafer on July 23, 2014 at 6:01pm
Very good article
