Sunday, May 31, 2015


When approaching strategic, tactical & operational planning one needs to know about ostriches, denial, and prayers



by Franco Oboni and Cesar Oboni


(Riskope, Vancouver B.C. Canada; email: foboni@riskope.com)



While the world was feeling the aftershocks of the Great Recession, Riskope wrote about 16 common human traits people exhibit when dealing with hazards and risks. Some of those 16 traits lead individuals, and often their organizations, to assume stances we call Ostrich, Denial or Prayer, resulting in flawed strategic, tactical and operational planning options and occasionally exposing them (and their neighbors, their environment, society at large) to significant hazards that could generate large unintended consequences, i.e. large risks leading to catastrophe. Happily, there are also more positive stances, which we will discuss later in this text. Are these stances defensible from an ethical point of view? We are not talking about blatant errors or negligence here, but about planning alternatives that seem acceptable, "safe enough" at some level of scrutiny, or a playable risk to their decision maker(s). Examples abound: tailings dams, automotive design details, IT systems, banking systems, etc., in almost any field of commerce, industry and the economy.

Let's start from the potential results: unintended consequences and their causes

The idea of unintended consequences is an old one; it was indeed discussed by Adam Smith. However, it was the sociologist Robert K. Merton, incidentally the inventor of focus groups, who popularized the concept in the first half of the twentieth century (in his 1936 paper "The Unanticipated Consequences of Purposive Social Action").
Merton cited as possible general causes of unintended consequences:
  • complexity (so, the “buzzword” is actually an “old” theory), 
  • perverse incentives, 
  • human stupidity, 
  • self-deception, 
  • failure to account for human nature or other cognitive or emotional biases, among others.
[Figure: Risk, mitigative costs, vs. public outcry.]
It is nice to see that, although we are neither sociologists nor psychologists, and although we used politically correct language in line with present-day conventions, our earlier post respected Merton's ideas.
As anticipated above, there are also more positive stances: the Short Term Thinkers and the Long Term Thinkers. These are definitely better stances for decision makers and for risk and crisis management, the second clearly being the winner, though not necessarily the most likeable or popular. Hence Long Term Thinking is generally not the first choice for politicians, decision makers and CEOs driven by re-election, immediate benefits or short-term shareholder gains. As we will see, even the Long Term Thinking stance has its own flaws, which only very skilled planners can bypass.

What are the links between the five stances, the behaviors driven by Merton's flaws, and the resulting risk management attitude/method?

Below, each of the five stances is presented together with the resulting general behavior, the corresponding Merton's flaw (general cause and details), and the resulting risk management attitude/method.

OSTRICH
  • General behavior: Generally fail to evaluate uncertainties, seek comfort in a single answer from subject matter experts or other sources, go for "feel right" solutions, often refusing to properly evaluate or see evidence, hiding their heads in the sand.
  • Merton's flaw: Human stupidity. Ignorance; it is impossible to anticipate everything, so why bother. This leads to incomplete analyses.
  • Risk management attitude/method: Follow one solution and go blind into new ventures; no risk management, no mitigation.

DENIAL
  • General behavior: Grab evidence that works in the direction they want to go while recognizing multiple perspectives. They use evidence in a rational way, but then jump to conclusions and give their own opinion the highest value while stating that all opinions are valid. They deny the rational evidence they have gathered.
  • Merton's flaw: Self-deception. Errors in the analysis of the problem, or following habits that worked in the past but may not apply to the current situation.
  • Risk management attitude/method: Mitigate what they consider "hazardous" (which could be nothing), trust their own opinion and dismiss potential situations as "incredible". They end up throwing money in doubtful directions.

PRAYER
  • General behavior: Can describe a problem and its environment, properly evaluate evidence from various perspectives and attempt to control biases, but cannot prioritize across alternatives and cannot decide: they are prone to paralysis by analysis, being left with a prayer to decide.
  • Merton's flaw: Complexity. Certain actions that might have positive results are not taken because they cannot be recognized in the magma of options. Long-term consequences may eventually cause changes in basic values, altering the perspective in the future.
  • Risk management attitude/method: Paralyzed by their analyses, these victims of the overwhelm syndrome cannot take decisions; the only thing they are left with is praying for the best.

SHORT Term Thinkers
  • General behavior: Look for objective comparisons after analyzing available data and including key persons in the process. They often bias towards the short term, neglecting long-term and strategic issues, thus missing present limitations and future needs.
  • Merton's flaw: Perverse incentives. Immediate interests overriding long-term interests may lead to backfiring results and greater risks than in the status quo.
  • Risk management attitude/method: This is the common attitude of savvy politicians and politically oriented leaders. They are very careful about anything that could go wrong on their watch.

LONG Term Thinkers
  • General behavior: Are prone to "build knowledge" and find better solutions by prioritizing limitations and updating information and its interpretation. They consider the long term in strategic terms.
  • Merton's flaw: Failure to account for human nature or other cognitive or emotional biases. Excessive trust in overly detailed planning generates significant risks if anticipated risks do not materialize as expected (nature, intensity, occurrence).
  • Risk management attitude/method: This is the "durable manager" attitude, the method that will bring and maintain value in the long run, building a robust organization.

The reason we entered into the long discussion above is simple. In a world where many organizations and professional groups ask themselves important questions about:
  • ethics,
  • Corporate Social Responsibility (CSR), and
  • how to maintain the Social License to Operate (SLO),
and where the public reacts vehemently to new projects and is in permanent opposition to promoters,
it becomes important for decision makers to have a clear path forward and defensible stances. At Riskope, we believe that without a strong sense of ethics:
  • neither Corporate Social Responsibility nor Social License to Operate can be achieved/maintained, 
  • management will not be “durable” and 
  • public opposition will only grow stronger.
Back in 2013 we wrote a paper showing how faulty risk management approaches and blatant cases of conflict of interest were key in the development of a negative attitude against new projects.
How different risks are likely to influence stance development in decision makers

In this section we examine how different risks can influence stance development in decision makers. The risk categorization is based on a metaphoric description developed by German researchers between the 1980s and the 1990s; here we merge it with the summary above, listing for each stance the chances of unintended consequences, the likely trigger (German metaphor), whether the behavior is excusable, negligent and/or ethical, and examples.


OSTRICH
  • Chances of unintended consequences: Extremely high.
  • Likely trigger (German metaphor): Sword of Damocles.
  • Excusable behavior / negligence / ethics: Not today / Negligent / Unethical.
  • Examples: Many when confronted with emerging issues (cyber attacks, for example). Also nuclear power plants (Fukushima) and large-scale chemical facilities (Bhopal, Seveso).

DENIAL
  • Chances of unintended consequences: Extremely high.
  • Likely trigger (German metaphor): Sword of Damocles or Pandora's Box.
  • Excusable behavior / negligence / ethics: Not today / Often mistaken for diligent, but basically negligent / Often considered ethical, but basically unethical.
  • Examples: Banks before Lehman Brothers, and many would argue even today. Large hydro-dams and meteorite impacts are also typical examples. Tailings dams and long-term toxic/radioactive dumps often trigger the denial stance ("we are better than the others; it will not happen to us").

PRAYER
  • Chances of unintended consequences: High.
  • Likely trigger (German metaphor): Cyclops or Pythia.
  • Excusable behavior / negligence / ethics: Not today, but quite common / Considered mainstream / Can be somewhat ethical or unethical.
  • Examples: Many when confronted with "world-changing" issues such as climate change (if they do not deny it), or with extreme natural events such as volcanic eruptions (Vesuvius), earthquakes (the various "big ones" expected in San Francisco, Tokyo, etc.) and floods. Self-reinforcing global warming or the instability of the West Antarctic ice sheet, with far more disastrous consequences than those of gradual climate change, also belong in this category.

SHORT Term Thinkers
  • Chances of unintended consequences: High.
  • Likely trigger (German metaphor): Cassandra.
  • Excusable behavior / negligence / ethics: Common, to be phased out / Can be diligent / Can be somewhat ethical or unethical.
  • Examples: Many who relocate for fiscal reasons, or who develop businesses based on fiscal incentives. Also cases where the distance in time between trigger and consequence creates the fallacious impression of safety.

LONG Term Thinkers
  • Chances of unintended consequences: Can be minimized if solutions are robust.
  • Likely trigger (German metaphor): Medusa.
  • Excusable behavior / negligence / ethics: To be fostered / Diligent / Ethical.
  • Examples: Sustainable and durable businesses placing high value on long-term issues.


So, changes are needed to develop strong ethical values: some stances are becoming hazardous and others should be phased out. How can we foster the change towards more positive stances, and hence a more ethical approach?


What can be done to foster the turn to positive stances?

In a recent Harvard Business Review (HBR) article, John Beshears & Francesca Gino recognize that it is extraordinarily difficult to rewire the human brain to undo the patterns that lead to common mistakes and, we would argue, to the faulty stances described above. This is obviously not a new subject, as Kahneman & Tversky discussed it as early as 1979. However, the HBR article suggests that altering the environment in which decisions (and stances) are made can strongly inhibit poor decisions and negative stances. HBR also claims that leaders can do this by acting as architects. Drawing on extensive research in the consulting, software, entertainment, health care, pharmaceutical, manufacturing, banking, retail, and food industries, and on the basic principles of behavioral economics (i.e. Kahneman & Tversky and their followers), the authors have developed an approach for structuring work to encourage good decision making.
Beshears' and Gino's approach reportedly consists of five basic steps:
  1. Understand the systematic errors in decision making that can occur.
  2. Determine whether behavioral issues are at the heart of the poor decisions in question.
  3. Pinpoint the specific underlying causes.
  4. Redesign the decision-making context to mitigate the negative impacts of biases and inadequate motivation.
  5. Rigorously test the solution.
A poor or clouded understanding of the risk environment of a decision is one of the major reasons for systematic errors. Furthermore, this paper has clearly shown that certain types of risks can trigger the adoption of flawed behaviors (stances), resulting in unintended, possibly catastrophic consequences and thus in critical, undesirable risk exposures. Persisting in using poor risk assessment methodologies that introduce biases and censor reality is unethical.

Conclusion

The Romans (probably, but not certainly, Seneca) already knew the conclusion of this discussion over 2,000 years ago, when they wrote: "errare humanum est, perseverare autem diabolicum, et tertia non datur" (to err is human; to persist [in committing such errors] is of the devil; and a third possibility is not given).
Many industries and human systems have erred enough to date; persisting is of the devil, and there will not be a third chance. Or rather, if we continue to act unethically, we will not give future generations a third chance.
Redesigning the decision-making context is indeed necessary and ethical. It represents our chance to stop erring, avoid the evil of catastrophic mistakes and leave a worthwhile legacy.