A Thoughtful, Analytical Approach to NGO Security

NGO Security Myths

Black Swan Lessons - You Can't Graph the Future

Something about most UN and NGO security reports has always made me uneasy. Don’t get me wrong. It’s not that they aren’t thorough. A lot of work goes into fact checking and ensuring that what they say is ‘correct’. It’s just that the typical security report is a comprehensive list of recent past incidents combined, if we are lucky, with their assessed causes. Incident statistics are then charted and ‘trends’ are identified. This has always made me a little nervous.

To be fair, I never really knew why it made me nervous until I read “The Black Swan”. Nassim Nicholas Taleb raises several points that help explain my unease.

The first is that more information is not necessarily better. It’s very easy to get bogged down in detail that has no real relevance to the issue at hand.

The second factor is what Taleb calls the Ludic Fallacy. In brief, this is the assumption that the unexpected can be predicted by extrapolating from statistics based on past observations. Taleb argues that while this holds true for theoretical models based on games of chance, it seldom holds true in the real world, for the following reasons:

• We don’t know what we don’t know (see the Unknown Unknown).
• Very small (perhaps imperceptible) changes in the variables can have a huge impact on the outcome. This is commonly referred to as the Butterfly Effect.
• Theories based on experience are fundamentally flawed, as events that have not occurred before (or are outside living memory) cannot be accounted for.

The Washington Post graphic below, which shows the frequency and lethality of suicide attacks since 1981, illustrates the problem. If we had examined the chart in 2000, would it have led us to predict 9/11 (a classic Black Swan)? If we had re-examined it in 2003, would it have led us to predict the sudden increase in the frequency of attacks in 2007? What does 2007 tell us about 2008? Looking at the trend from 1981 to 1989, how many researchers would have concluded that suicide attacks were in decline and opined that such attacks were ineffective in accomplishing the attackers’ goals?
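Taleb’s point is easy to demonstrate. Below is a minimal sketch, using entirely invented yearly incident counts (not the Washington Post data), of how a perfectly reasonable trend line fails: fit a straight line to a decade of observations, extrapolate one year ahead, and compare the forecast with a sudden spike the data contained no hint of.

```python
# Illustrative only: synthetic incident counts, not real data.
years = list(range(1996, 2007))                          # observation window
attacks = [30, 28, 34, 36, 40, 43, 45, 50, 55, 58, 60]   # invented counts

# Ordinary least-squares fit of a straight line to the "observed" years.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(attacks) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, attacks))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

forecast_2007 = slope * 2007 + intercept
actual_2007 = 150   # a hypothetical Black Swan year

print(f"Trend forecast for 2007: {forecast_2007:.0f} attacks")
print(f"'Actual' 2007:           {actual_2007} attacks")
# The fit is mathematically sound, yet nothing in 1996-2006
# contained the information needed to see 2007 coming.
```

The chart looks authoritative and the arithmetic is correct; the failure lies in the assumption that the future is a smooth continuation of the past.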


Odds and Ends

A couple of weeks ago I emailed Paul Currion and happened to mention that I wanted to plot RSS news feeds on an easily accessible map. Paul passed my question onwards and it mushroomed into an interesting conversation between some very clever people. Numerous hat tips and thanks to you all. I’m still experimenting with some of the ideas that were shared and I’ll update everyone at some time in the future.

So far I’ve run into some stumbling blocks:

  • In Google Maps, Sri Lanka is a big empty space. The only thing missing is a ‘here be dragons’ label.
  • RSS to GeoRSS utilities tend to encode the first place name encountered. This means that a story about Trincomalee will be plotted to Colombo if Colombo is in the byline (see the sketch after this list).
  • Some utilities don’t work well on some platform/browser combinations.
  • It seems the IT section’s web filters are causing some problems as well.
  • Popfly seems to work pretty well, but so far the Geonames database it uses only covers the US.
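To see why the first-place-name approach misfires, here is a minimal sketch of the problem and one possible workaround. The gazetteer, both heuristics, and the sample story are hypothetical stand-ins for illustration, not the behaviour of any particular utility.

```python
# A hypothetical two-entry gazetteer: place name -> (latitude, longitude).
GAZETTEER = {
    "Colombo":     (6.93, 79.85),
    "Trincomalee": (8.57, 81.23),
}

def first_match(text):
    """Roughly what many RSS-to-GeoRSS utilities do: geocode the
    first recognised place name, wherever it appears."""
    for word in text.split():
        name = word.strip(",.-").title()
        if name in GAZETTEER:
            return name, GAZETTEER[name]
    return None

def most_mentioned(text):
    """A slightly better heuristic: geocode the most frequently
    mentioned place, which is more likely to be the story's subject."""
    lowered = text.lower()
    counts = {name: lowered.count(name.lower()) for name in GAZETTEER}
    best = max(counts, key=counts.get)
    return (best, GAZETTEER[best]) if counts[best] else None

story = ("COLOMBO, Sri Lanka - Aid agencies report that fighting near "
         "Trincomalee has displaced families and that road access to "
         "Trincomalee remains restricted.")

print(first_match(story))     # ('Colombo', ...)  -> plotted at the byline city
print(most_mentioned(story))  # ('Trincomalee', ...) -> the story's real subject
```

Even the frequency heuristic is crude; it would still stumble on a byline city mentioned repeatedly, which is why hand-checking the plotted points still matters.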

Common sense update

No sooner had I posted my common sense rant than I came across this picture.

[Image: lightning striking an aircraft]

My common sense tells me that aircraft getting struck by lightning would be an extremely rare and very dangerous event. Apparently my common sense has let me down as this article and the reader comments explain.

The Common Sense Myth

I don’t know how many times I’ve heard some variation of this statement: “Security is just common sense.” I’ve seen NGOs use this belief to justify not dedicating resources to security. “We don’t need a security officer… we have experienced staff with good common sense.” Worse, I’ve seen people use it to justify breaking security procedures. “It’s all just stupid rules… my common sense will keep me out of trouble.”

There is only one problem. Common sense is NOT good for security! Common sense is based on a whole series of faulty assumptions, biases and quirks.


Assumption of Common Knowledge

The first assumption is sometimes referred to as the “Assumption of Common Knowledge”. In other words, you know something so well that you think everyone must know it too. It just seems so obvious to you that it must be common sense.

The faulty logic of the Assumption of Common Knowledge is revealed when common sense is cited in instructions. Relying on the term when giving instructions presupposes that the person being instructed already has a grasp of the subject and therefore needs no specific detail.

Some examples from security advice I have read:

“When working in high-risk areas use your common sense.”
“When travelling in a foreign country use common sense to avoid offending people.”
“If you are involved in a motor vehicle accident and an unruly crowd begins to form use common sense.”
“Use common sense during first aid emergencies.”

Do these statements make sense to you? Consider that in most security manuals, immediately after the “use common sense” statement, you’ll find a checklist of things that you should and shouldn’t do in such a situation. If it is truly common sense, why is the checklist needed?

Not convinced? Try these:

“When deciding whether or not to bilaterally transect the artery use common sense.”
“When connecting new wiring to the building mains use common sense.”
“If you are alone when you go into labour use your common sense.”
“When conducting sensitive hostage negotiations use common sense.”

Do you feel a sudden need for more detailed instructions? If saying “use common sense” worked, security procedures wouldn’t be nearly so wordy.


Cultural Norms

Every culture and subculture has norms that its members are immersed in and unable to distinguish from common sense. Anyone with experience working in other cultures has run up against practices that seemingly lack any semblance of common sense. Eventually, of course, you realise that members of other cultures may view your own cultural norms as similarly nonsensical. The physical reality of the world remains the same wherever you travel, but the ‘common sense’ rules people use to navigate it change.

Examples:

“Saudi flag on football = good PR” vs “Saudi flag on football = insult to Allah”
"When negotiating “be open and honest” vs “allow participants to save face”
“Using weapons to protect NGO personnel and property decreases security by sanctioning violence” vs “Weapons are necessary for protection. Its just common sense!”

By the way, the last statement was expressed to me by national staff members of a large INGO. While they were willing to accept that the ‘soft’ international program staff might not see the necessity for self-defence weapons they really could not believe that a security officer couldn’t see the common sense in it.


Plausibility

In effect, plausibility means it simply sounds like it makes sense. Clichés often fall into this category. “Opposites attract” – common sense, right? “Birds of a feather flock together” – common sense too! Never mind that the two sayings flatly contradict each other.

Clever arguments, well stated, can be persuasive even if built on a foundation of bias and lacking evidence. Take, for example, the politician who exclaims, “What we need is a common sense solution to the conflict”, leaving the majority nodding their heads sagely while overlooking the fact that the opposition never actually supported a solution devoid of common sense.

“Mosquitoes can spread AIDS.” By now most of us should know that this just isn’t true, yet surprisingly many people still believe it. To them it just sounds plausible: “AIDS is spread by body-fluid to body-fluid contact… mosquitoes transfer body fluid… therefore mosquitoes spread AIDS.” It’s just common sense, right?


Cognitive Biases

There are a large number of cognitive biases that cloud our thinking and skew our common sense. Covering them all is beyond the scope of this article, but consider these:

Optimism bias — the tendency to be overly optimistic about the outcome of intended actions
Recency effect — the tendency to consider recent events as having more import than earlier events
Zero-risk bias — the preference for reducing a small risk to zero over a larger reduction in a greater risk (a quick worked example follows below)
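Zero-risk bias is easier to see with numbers in front of you. A minimal sketch, using invented incident rates purely for illustration:

```python
# Invented figures, purely to illustrate zero-risk bias.
# Option A: eliminate a rare risk entirely (feels satisfying).
# Option B: partially reduce a common risk (actually prevents more harm).
trips_per_year = 1000

risk_a_before, risk_a_after = 0.001, 0.000   # 0.1% -> 0%
risk_b_before, risk_b_after = 0.050, 0.030   # 5%   -> 3%

avoided_a = (risk_a_before - risk_a_after) * trips_per_year
avoided_b = (risk_b_before - risk_b_after) * trips_per_year

print(f"Option A avoids {avoided_a:.0f} incident(s) per year")   # 1
print(f"Option B avoids {avoided_b:.0f} incident(s) per year")   # 20
# Many people still choose Option A, because "zero" feels safer than "fewer".
```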


Pre-Pubertal Learning

Ideas learned before puberty are difficult to unseat and are generally believed by the holder to be common sense. Thought processes change during puberty allowing most of us to more readily consider inconsistencies, question assumptions, and assess the grey areas of life. However, what we have learned prior to puberty generally remains unchallenged.

Most children accept a “morality of authority”, in which truth is whatever a credible authority figure says it is. In effect, parents, primary school teachers, and religious instructors all shape our ‘common sense’. Later in life it is difficult for us to unlearn these truths. How many of us still believe that sound travels better through liquids than through air, or that Ben Franklin’s kite was struck by lightning? How much harder is it to change our perceptions of risk?
