I took my family to the Concert for Diana today at the new Wembley Stadium, despite hearing about yesterday’s incidents in Glasgow and London, despite hearing about Britain being put on its highest state of security alert, despite even knowing that there would be considerably increased levels of security at Wembley as a result.
Does that make me foolhardy? I hope not; I hope I was able to assess the risks and take a balanced decision rather than be bullied out of living a normal life. My intent was not to gamble with the safety and security of my family.
Yesterday, I was informed about the explosives-laden vehicle found near Tiger Tiger in Haymarket, and considered for a few minutes what I should do as a result. Why? I was hosting a 50th Anniversary lunch for Past Presidents of the British Computer Society at the BT Tower, a location that has been the target of terrorist attacks before. My decision was to proceed with the lunch as long as the police had not issued any “Do not travel except in emergency” directives.
The BT Tower yesterday. Wembley this afternoon. Both potential terrorist targets. In between those I was at St George’s Chapel, also a potential terrorist target, attending my daughter’s school speech day. As part of his speech, the headmaster rued the increasing nanny-state-ness of the environment we live in, lamenting that our children were losing out as a result. Losing out because they weren’t able to learn about risk, about the expectation of gains or losses, about knowing whom to trust or distrust (and, more particularly, why).
This almost-institutional avoidance of risk can be seen everywhere. Many years ago, when I saw the unnecessary panic caused by Y2K, I tried to fight back. I remember using the examples of Lance-Corporal Jones from Dad’s Army, as well as Zaphod Beeblebrox from Hitch-hiker, to try and illustrate how we should behave in such circumstances. That all projects carried risk and that avoidance of risk was always expensive and often impossible as well. That instead, an understanding of the risks was what was required, along with clearly expressed mitigation strategies. That risks could be prioritised, as could the actions taken to mitigate them. That risk premia could therefore be calculated and compared against risk likelihoods and impacts, and that paying the premium wasn’t always the smart thing to do.
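The arithmetic behind that comparison is simple enough to sketch. Here’s a minimal illustration in Python; the risks, probabilities and figures are invented purely for the example, not taken from any real programme:

```python
# Toy comparison of mitigation premia against expected losses.
# All risks and figures below are invented for illustration.

risks = [
    # (name, probability of occurring, impact if it occurs (GBP),
    #  cost of mitigating it up front (GBP))
    ("billing system date rollover", 0.30, 2_000_000, 150_000),
    ("desktop BIOS clock failure",   0.05,   100_000,  80_000),
    ("supplier feed format change",  0.10,   500_000,  20_000),
]

# Rank risks by expected loss (likelihood x impact), highest first.
for name, p, impact, premium in sorted(
        risks, key=lambda r: r[1] * r[2], reverse=True):
    expected_loss = p * impact
    decision = "mitigate" if premium < expected_loss else "accept the risk"
    print(f"{name}: expected loss £{expected_loss:,.0f}, "
          f"premium £{premium:,.0f} -> {decision}")
```

Run it and the desktop item comes out as a risk where paying the premium isn’t the smart thing to do: £80,000 to avoid an expected loss of £5,000.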
My employers listened. Thankfully. And we ran a lean Y2K programme.
Sadly, much of the time, that is not what happens. People are more apt to be Zaphod-like, allowing their glasses to turn opaque at the first sign of danger. I guess it’s a level of risk aversion that goes with large-institution blame cultures. This may be fine for many people, but I wish they were aware of what they were doing as a result. It’s a bit like Information Security telling you that your computers would not be affected by viruses if you enveloped them in six feet of concrete. For sure those computers would be virus-free. But not much use for anything else either.
It’s the same with bringing up children. A child putting its hands near a fire learns from that experience. That learning is important, and we need to get better at helping later generations learn rather than legislating to prevent their learning. I think it was Esther Dyson whose e-mails consistently reminded me “Always make new mistakes”. Which I guess was a variant of the Edisonian “I have not failed. I have found ten thousand ways that do not work.” Iteration is a critical component of agile strategy.
As we move more and more into dealing with problems of extreme scale, we’re more and more likely to use problem-solving approaches that emulate natural selection and evolution. Such approaches depend on failure being visible: the variants that don’t work are precisely how the system learns. Yet today we are so enmeshed in blame cultures that organisations often get into Failure-Is-Not-An-Option syndrome. What happens in this syndrome is that people hide failure rather than prevent it, and over time that hiding culture gets deep into the organisation. This culminates in an even worse syndrome, The-Emperor’s-New-Clothes syndrome. Here, everyone knows that what they say is not true, yet no one does anything about it.
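Here’s a minimal sketch of what I mean, assuming a toy problem (evolving a bit-string towards a known target). Most mutations fail; the method works only because those failures are visible to selection rather than hidden:

```python
import random

# Toy genetic algorithm: evolve 20-bit genomes towards an all-ones target.
TARGET = [1] * 20

def fitness(genome):
    # Score a candidate by how many bits match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Flip each bit with small probability. Most flips make things
    # worse -- and that failure is exactly what selection feeds on.
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]

for generation in range(500):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        print(f"solved in generation {generation}")
        break
    # Selection: keep the better half, refill with mutated survivors.
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]
```

A blame culture, in these terms, is a system that deletes its own fitness signal.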
Without risk there is no learning. Without learning there is no life. We need to be careful about being too careful.
If you’re interested in reading about The Risk Management of Everything, you could do worse than read Michael Power’s tract on that subject. I loved it. [Incidentally, I notice he’s recently written a book called Organized Uncertainty: Designing a World of Risk Management; I haven’t read it yet, and will provide my comments after I get it and read it.]
JP – Your post reminds me of Feynman’s views on the Challenger investigation, where his estimate of the chance of a shuttle disaster was 1 in 25; among NASA engineers it was 1 in 100, and among NASA management it was 1 in 100,000. It is so true that we are living in a culture where we take it as a given that airplanes should not crash, soldiers should not die in wars, trains should not collide, and power grids should not collapse. Complex engineering systems are prone to failure, and making them more robust is a work in progress that needs continuous innovation and investment. I guess it goes back to the fact that as human beings we love to fool ourselves into believing that we live in a deterministic “failure-is-not-an-option” world, when in fact we live in an uncertain world where things break and people need to learn to fix them and move on.

JP, as you say, while “blame culture” is definitely responsible and plays a vital role in the way we run our public conversation, I think it is also critical to recognize that everyone in the organization has a role to play when it comes to stating the facts as facts. So often we end up blaming management for disasters, but it is important for engineers, technicians, scientists, and mathematicians – the people who are closest to the ground problems – to realize that they are responsible too. Problems may get watered down or swept under the carpet as they move up the food chain, but the lack of activism, the resignation, and the fear-of-losing-my-job, I-will-mind-my-own-business syndrome demonstrated by scientists and engineers add to the problem. I am not saying it is easy to do; like anything else, having integrity towards one’s trade is hard. But the rewards are great: an organization built with such a focus will have a greater competitive advantage in managing risk and complex systems than the rest.
Perhaps because it’s easier and makes better headlines, media reports tend to discuss only relative risk. Recent examples: http://news.bbc.co.uk/1/hi/health/6229516.stm, http://news.bbc.co.uk/1/hi/health/6709101.stm and http://news.bbc.co.uk/1/hi/health/4904082.stm. But what matters is absolute risk. The third example is a classic: it misses the shocking result that over 10% of the study participants developed Alzheimer’s, in favour of the headline-friendly figure of a 40% lower risk.
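To spell out the arithmetic behind that third example, here’s a quick sketch using the round numbers above (and assuming, for illustration, that the 10% figure is the baseline group):

```python
# Round, illustrative numbers: roughly 10% of participants developed
# Alzheimer's, and the headline was a "40% lower risk" for one group.
baseline_risk = 0.10        # absolute risk in the comparison group
relative_reduction = 0.40   # the headline relative figure

treated_risk = baseline_risk * (1 - relative_reduction)
absolute_reduction = baseline_risk - treated_risk

print(f"absolute risk: {baseline_risk:.0%} vs {treated_risk:.0%}")   # 10% vs 6%
print(f"absolute reduction: {absolute_reduction:.0%}")               # 4%
# The same "40% lower risk" headline over a 0.1% baseline would mean
# a fall from 0.10% to 0.06% -- a far less dramatic story.
```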
Frank Duckworth produced a “Richter scale” of risk to make absolute risks easier to compare: http://www.counton.org/thesum/issue-01/issue-01-page-02.htm has a summary, with more detail in http://www.dartmouth.edu/%7Echance/chance_news/recent_news/chance_news_7.11.html#risk%20index. Nice to know that continuing to smoke 40 cigarettes a day is almost as risky as one round of Russian roulette.
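If I’ve read those links right, the scale is logarithmic: certain death scores 8, and every tenfold drop in the probability of death takes one point off, i.e. index ≈ 8 + log10(p). A quick sketch of that reading (the formula is my inference from the summaries, not a quotation from Duckworth):

```python
import math

def risk_index(p_death):
    # Inferred construction: certain death (p = 1) scores 8; each
    # factor-of-ten drop in probability knocks one point off.
    return 8 + math.log10(p_death)

print(f"certain death:                 {risk_index(1.0):.1f}")   # 8.0
print(f"one round of Russian roulette: {risk_index(1/6):.1f}")   # ~7.2
```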
Nice to hear from you, Andrew.
The-Emperor’s-New-Clothes syndrome. Now what fascinates me most about this is: why does no one say anything? I’ll check your reference. Here’s another: DeMarco, Waltzing With Bears, which has a nice preface about Clifford and the ethics of belief (http://en.wikipedia.org/wiki/William_Kingdon_Clifford).
Hello,
I have taken the liberty of writing a post on my blog (in French) about this post and your blog.
I work mainly with CIOs from large organisations on Web 2.0 solutions, and the basic idea of my post is that all French CIOs should subscribe to your blog by RSS.
http://nauges.typepad.com/my_weblog/2007/07/dsi-allergie-au.html
Congratulations on the very high quality of your texts and ideas.
Louis Naugès
President
Microcost
It was nice to read your views on these issues, JP. I am a corporate risk manager and had not read “The Risk Management of Everything.” I’ve read most of it now, and will read the rest soon. Thanks for pointing me to that. It’s a good read and a good perspective.
Just a couple of days ago I was explaining to a friend my observations on The-Emperor’s-New-Clothes syndrome within the firm where I work. What seems to be driving the syndrome (at least where I have most recently seen it) is a desire to prevent someone from nixing a plan that the instigator believes to be truly best for the business. I have watched people profess that a given decision will have “no negative effects” when everyone in the room knows that there will be a few negative effects, but the good outweighs the bad. In practice, however, the people pitching the idea don’t seem to feel like they can say, “Hey, there is real risk here, but also real reward. Our analysis shows that the risk is a good one to take!”
Lots more here to talk about, so I am looking forward to more posts from you on this topic!