30 second summary: the quality of a strategic decision should be judged by the process followed to arrive at that decision, not by the resulting outcome. Why? Because the outcome is a function of both the process followed, which is in the decision maker’s control, and uncontrollable factors, namely good or bad luck. So the decision maker should take all reasonable measures to follow a process that produces a decision that maximises the expected value of the outcome sought, while accepting that the inherent uncertainties might, in some unlucky cases, deliver an undesired outcome. It is the process followed, not the outcome achieved, against which decision makers should be measured and rewarded.
Main article follows …
Robert Rubin’s Harvard commencement speech of 2001 included: “Any individual decision can be badly thought through, and yet be successful, or exceedingly well thought through, but be unsuccessful, because the recognized possibility of failure in fact occurs. But over time, more thoughtful decision-making will lead to better overall results, and more thoughtful decision-making can be encouraged by evaluating decisions on how well they were made rather than on outcome.”
The most important principle of strategic decision making is also the most controversial. Whether a good strategic decision has been made should be judged by the process followed to arrive at that decision, not by the outcome resulting from that decision.
I have covered this topic before in “Reward good effort, not outcome”. In that article, I mentioned that, in stark contradiction to the principle above, most business executives (7 out of 10) make the mistake of judging the quality of a strategic decision by the outcome achieved [1].
Ron Howard [2], Academic Director of Stanford’s Strategic Decision and Risk Management program, sums it up succinctly: “You can’t tell by the outcome whether you made a good decision”.
I continue to encounter resistance to this principle when I discuss it with clients or present it in lectures or workshops. And the flu example that I have previously used to argue the case is often challenged because of people’s differing perceptions regarding vaccinations.
So, in this article, I introduce another thought experiment, with a personal and professional angle, that I hope will more convincingly explain why a decision should not be judged on its outcome.
While I appreciate that there are tons of other factors that you might consider in assessing the case, for the purposes of a clean and simple thought experiment, consider only the information below in making your assessment.
*** case starts ***
The child-friendly internet filter
You have signed your 10-year-old kid up for winter school. The kids are asked to bring their own iPads. And parents are advised to buy and install a child-friendly filter because the kids will be browsing the internet as part of their projects.
From reliable experience, the organizers say that without a filter, 90% of the kids land on inappropriate websites while browsing; the other 10% are just lucky or less curious in their browsing. And with a filter, 95% of them land only on appropriate sites, but 5% still somehow get to inappropriate sites because the filter is not perfect. Installing the filter would take you a bit of time and cost a small amount.
You would consider it a very bad outcome if your kid was exposed to inappropriate content, but based on your assessment of the odds, you choose to buy and install the filter and send your kid to this winter school.
So you buy and install the filter and send your kid off to the winter school. Then, let’s say that your kid was one of the unfortunate 5% who, despite having the filter, landed on inappropriate content. Did you make a good decision in buying and installing the filter?
*** case ends ***
Hopefully it is apparent from this thought experiment that if you bought and installed the filter, then you made a good decision irrespective of the outcome. In other words, 95% of the parents who made this decision had a good outcome, while 5% of the parents who made that same decision were simply unlucky and had a bad outcome. But they all assessed the same information and made the same decision.
John Worrall [3], Professor of Philosophy of Science at the London School of Economics, has a technical term for following a good process or policy and getting a bad outcome. “Sometimes shit happens. If you follow what was the right policy ahead of time on the basis of the evidence, then you follow the right policy even if it didn’t turn out well”.
Similarly, for the parents who chose not to install the filter, 10% of them struck it lucky, with their kids not stumbling onto inappropriate content. But, although they had a good outcome, did they make a good decision? I hope you’ll agree that, irrespective of the outcome, they did not.
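To make this split concrete, here is a minimal simulation sketch in Python using the case’s illustrative probabilities. The function name, the 1,000-parent sample size and the fixed seed are my own assumptions for illustration, not part of the original case.

```python
import random

# Illustrative probabilities from the thought experiment.
P_BAD_WITH_FILTER = 0.05     # filter installed, kid still reaches inappropriate content
P_BAD_WITHOUT_FILTER = 0.90  # no filter installed

def share_with_bad_outcome(p_bad: float, n_parents: int = 1000, seed: int = 1) -> float:
    """Simulate n_parents who all made the same decision; return the share with a bad outcome."""
    rng = random.Random(seed)
    bad = sum(rng.random() < p_bad for _ in range(n_parents))
    return bad / n_parents

print("Installed the filter:      ", share_with_bad_outcome(P_BAD_WITH_FILTER))
print("Did not install the filter:", share_with_bad_outcome(P_BAD_WITHOUT_FILTER))
# Within each group, every parent made an identical decision on identical information;
# only luck separates those with a good outcome from those with a bad one.
```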
Consider the parallel of this thought experiment in a corporate setting for a moment. Replace the parents with executives charged with making a cyber-security decision that has a 95% chance of a good outcome if they implement the cyber-security solution and a 90% chance of a bad outcome if they don’t. Imagine two executives who made the same decision to implement the solution, but one had a good outcome and the other a bad outcome. Similarly, imagine two other executives who made the same decision not to implement the solution, and again one had a good outcome and the other a bad outcome. Should the executives in each pair be judged differently when they made the same decision on the same information?
McNamee and Celona call it out nicely when they say: “It is no more logical to punish the maker of a good decision for a bad outcome than it is to reward the maker of a bad decision for a good outcome.” [4]
The decision outcome, in this case whether the kid lands on inappropriate content or not (whether the company is penetrated by a hacker or not), is a function of (a) the decision, i.e. whether you installed the filter (cyber-security solution) or not, and (b) luck, i.e. the inherent uncertainty in the situation because the filter (cyber-security solution) is not perfect.
All aspects of how you make the decision are within your control, e.g. what information you collect and consider. Luck, by definition, is not. If you accept that luck plays a part and that you cannot influence it, then the best you can do to achieve your desired outcome is to focus on how you make the decision. In other words, be clear on what you value, consider multiple alternatives, evaluate those alternatives on the scale and likelihood of value that they will deliver, then select the alternative that delivers the highest expected value.
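As a rough sketch of that process applied to the filter case: the probabilities come from the thought experiment, but the outcome values and the filter cost below are hypothetical placeholders I have chosen purely to illustrate the calculation.

```python
# Expected-value comparison of the two alternatives in the filter case.
# Only the 95%/5% and 10%/90% probabilities come from the thought experiment;
# the values and cost are assumed for illustration.
VALUE_GOOD_OUTCOME = 0      # reference point: nothing bad happened
VALUE_BAD_OUTCOME = -1000   # how badly you value exposure to inappropriate content
FILTER_COST = -20           # time and money to buy and install the filter

alternatives = {
    "install filter": {"p_good": 0.95, "p_bad": 0.05, "cost": FILTER_COST},
    "no filter":      {"p_good": 0.10, "p_bad": 0.90, "cost": 0},
}

def expected_value(alt: dict) -> float:
    return (alt["p_good"] * VALUE_GOOD_OUTCOME
            + alt["p_bad"] * VALUE_BAD_OUTCOME
            + alt["cost"])

for name, alt in alternatives.items():
    print(f"{name}: expected value = {expected_value(alt):.1f}")

best = max(alternatives, key=lambda name: expected_value(alternatives[name]))
print("Choose:", best)  # 'install filter' under these assumed values
```

Under these assumed values, installing the filter wins by a wide margin, and that conclusion does not change after the fact just because an individual kid lands in the unlucky 5%.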
If you don’t like the fact that your outcome is susceptible to luck, then you might assess the nature of the uncertainty that you face and determine whether you can reduce it in some way. And sometimes you can … to some extent. For example, in the case above, perhaps there is a better filter (cyber-security solution) that you can buy that provides a 99% (vs 95%) chance that your kid will not land on inappropriate content (hackers will not penetrate your systems). Even then, you have reduced, but not eliminated, the uncertainty.
For most real world strategic decisions, some aspects of uncertainty remain unavoidable and material to the decision. Examples of such unavoidable and potentially material uncertainties are: the weather, the exchange rate, commodity prices, and market adoption of a new product. This means that you still need to decide in the presence of uncertainty.
If we accept that maximising value creation in strategic decision making requires following a good process, then surely it is just that, and not the outcome, for which people should be rewarded? It is rare to find this practice in corporates. But, as Richard Thaler [5], Professor of Behavioral Science and Economics, says, it is essential.
“Good managers will reward people based on process rather than outcomes. This is very hard to do, but essential to running a good company” [6].
References:
[1] Accenture Research, December 2015
[2] https://www.gsb.stanford.edu/faculty-research/faculty/ronald-howard
[3] http://www.lse.ac.uk/philosophy/people/faculty/#john-worrall
[4] “Decision Analysis for the Professional”, Peter McNamee, John Celona
[5] https://www.chicagobooth.edu/faculty/directory/t/richard-h-thaler
[6] http://www.bbc.co.uk/programmes/p02tzvc5