Over the weekend I read a paper by Geary Sikich, “Are we seeing the emergence of more White Swan events? Exploiting new challenges in Enterprise Risk Management.”
You may have seen it published on Continuity Central, under the shorter title
“Exploiting new challenges in Enterprise Risk Management”
My review here is based on the version published on Scribd; I am not sure if it is exactly the same as the article published on Continuity Central.
I assume we are all familiar with Taleb’s Black Swans. (If not, you should be.) Sikich takes Taleb’s base definition of the Black Swan event (highly improbable, unpredictable, massive in impact, and after the event we create an explanation that makes it seem less random and more predictable), then adds other hues to address a wider range of risk events. Building on the same four aspects we get:
- Grey Swan – is highly probable
- it is predictable and impact can easily cascade
- after the event we shift the focus to errors in judgement or other human failings
- “operator error” with respect to blowout gaskets on an oil rig might be an example
- White Swan – is also highly probable
- it is certain and carries an impact that can easily be estimated
- after the fact we recognise this certainty – but again shift the focus to judgement or other human error
- what Sikich summarises as “ineptness and incompetence when it comes to certainty events and their effects”
An interesting contrast – from the things we cannot see coming (but after the event pretend we can see the next one) to the things we can see coming but appear unable to deal with effectively. What then is the purpose of these predictable Swans?
Simple: we are human and do not like to think our solutions are less than perfect. We implement these grand Risk Management programs and we have ego and emotion invested in them. When they fail – because a risk has been realised – then it must have been somebody else’s fault.
Any risk or continuity program is based on a whole pile of assumptions we make, both consciously and unconsciously. I am sure we have all seen programs that are based on some really bizarre assumptions – I have actually seen some that are based on the assumption that nothing really severe will ever happen!
The greatest risk is the one that destroys our assumptions, as we have no mitigations or protections in place for it.
Sikich identifies two specific areas that can create problems for a Risk Management program. Areas where our bad assumptions may originate.
- False positives – when we claim to be prepared but the answer is misleading (or delusional)
- he uses the example of firms that after 9/11 improved their IT recovery and claimed that was increased preparedness, but did little for people and process
- false sense of security – assumptions just waiting to be destroyed
- Activity Traps – where process and procedures become a goal in themselves, and often the real goal is lost
- too often also the effectiveness of what is being done is lost, but we keep doing it (the same way)
Too often we explain away evidence of failures in our programs – those early warning signs that something may not be right – because we know that success comes from implementing the program exactly as designed. Our perception adversely affects our ability to critically assess the program.
The Sikich article is rather long, but worth the effort to confront this issue about the validity and testing of our assumptions.
Sikich also explores the role of perception in how risks are assessed. In particular he (like many others – myself included) has been influenced by Peter Sandman’s famous formula
RISK = HAZARD + OUTRAGE
Sandman is a story for another day.
Have you heard of Peter Sandman before? If not, you should go and read some of his stuff.
Had you not heard of either Taleb or Sandman? Then you should ask yourself why you are reading this blog.
G Sikich says
Ken:
Thank you for the review of the article that was on Scribd. It is essentially the longer version of the Continuity Central article. I am working on a follow up to this one and appreciate any comments, suggestions, etc. from you and your readers. One has to remember too that Taleb is essentially a financial markets guy and is steeped in the math of the markets. When applying the “Black Swan” criteria to business events that can be catastrophic one needs to take a long hard look at the criteria and rethink the financial aspect, translating it into operational terms and perspectives.
Again, thanks for the review.
Geary
Ken Simpson says
Thank you for taking the time to comment Geary, the review was my pleasure, I have followed your work for some time. I look forward to the follow-on article.
We see so many instances where the process is considered perfect and the operators are at fault, so a critical review of assumptions and perceptions is very relevant.
It was interesting that today I was at a software vendor forum with so much talk about how these products enabled Business Continuity. A classic example of the potential for false positives – no understanding that BC is more about people and process than it is about technology.
Another case of mistaking the tool for the process – http://www.blog.vrg.net.au/bc-practice/how-often-we-mistake-the-tool-for-the-process/