September 20, 2023
Someone just asked me what the premise of my book was, what readers would take away from it. So I typed this up.
My premise is that a cult of prediction has turned the 21st century into a series of avoidable shocks – 9/11, the failure of the Iraq War, the Global Financial Crisis, and COVID-19 (and you could throw in the entire political circus we are all spectators to right now).
Stuff that simply cannot be predicted is still being predicted by “the smartest guys [sic] in the room,” because they think prediction (especially, lately, using quantitative methods) is the only way to deal with the blank canvas of the future.
But ALL DATA IS ABOUT THE PAST. Every algorithm (whether a sophisticated computer program or simply the “received wisdom” of academics, generals, epidemiologists, and other experts) that relies on past data to predict future outcomes is ultimately doomed to be shown up by reality.
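Here’s a toy sketch of what I mean, in code – the numbers and the “regime change” are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# The "past": fifty periods of a stable world where y grows with x.
x_past = np.arange(50, dtype=float)
y_past = 2.0 * x_past + rng.normal(0.0, 1.0, size=50)

# Fit a model on that history. In-sample, it looks brilliant.
slope, intercept = np.polyfit(x_past, y_past, deg=1)

# The "future": the regime quietly changes. The old relationship is gone.
x_future = np.arange(50, 60, dtype=float)
y_future = 150.0 - 1.0 * x_future  # an unmodeled structural break

errors = np.abs(slope * x_future + intercept - y_future)
print(f"Mean absolute prediction error after the break: {errors.mean():.1f}")
# All the data was about the past; the moment the world stops behaving
# like its own history, the algorithm is shown up by reality.
```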
Now, is nothing predictable? Obviously, the basic “hard sciences” – physics, chemistry, astronomy, and, more and more, even biology – feature understandable causal relationships and are increasingly predictable.
In fact, it’s the very success of these “hard sciences” over the past few centuries that has lulled us into thinking that everything is predictable.
However, even as prediction has failed us over and over lately, our addiction to it has only grown. Nate Silver’s book The Signal and the Noise begins with the statement, “If prediction is the problem, it is also the solution.”
It is my contention that this is exactly wrong. Prediction itself is the problem; specifically, prediction of things that are by their nature unpredictable. A long series of bestselling books (Thinking, Fast and Slow; Noise; Superforecasting), often about the few areas in which prediction might seem to work, has perpetuated Silver’s delusion that we can predict our way to perfect prescience. We cannot.
The irony is that as stuff becomes predictable, it by definition becomes less strategically important. Engineers, used to dealing in straightforward cause and effect, end up as CEOs because they convince the rest of us they can understand, control, and predict everything. But they can’t. What Yuval Noah Harari calls “intersubjective realities” – agreed-upon fictions that we all accept – do not have to abide by the principles of engineering. Think of human rights, the Capital Asset Pricing Model, the “fact” that housing prices never decline across the entire country, the implausibility of terrorists striking the U.S. mainland, the likelihood of an Iraq War being a “cakewalk,” or the “fact” that pandemics are best controlled by keeping children home from school (along with the “fact” that Americans will abide by public health recommendations). None of these is bound by physical limits; they can turn on a dime. And any model that assumes they are as predictable as the movements of the planets will be swept aside.
This is how a 9/11 is allowed to happen; an Iraq War is allowed to spin out of control; financial derivatives are allowed to cluster like sticks of dynamite around the previously boring mortgage markets; and a pandemic ends up killing a million Americans. Some extraneous factor the model never included sweeps in from an unexpected quarter, wiping out all the pretty predictions of the algorithm.
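You can watch this happen in a deliberately cartoonish simulation (mine, not anyone’s actual risk model): pool losses across ten regional housing markets under the agreed-upon “fact” that they never all fall together, then let one nationwide shock move them in lockstep.

```python
import numpy as np

rng = np.random.default_rng(7)
n_regions, n_sims = 10, 100_000

# The intersubjective "fact" baked into the model: regional housing
# losses are independent, so pooling them diversifies away the risk.
assumed = rng.normal(0.0, 1.0, size=(n_sims, n_regions)).mean(axis=1)

# What the model never imagined: one nationwide shock hits every
# region at once, so pooling diversifies nothing.
actual = rng.normal(0.0, 1.0, size=n_sims)

print("1st-percentile outcome, independence assumed:",
      round(np.quantile(assumed, 0.01), 2))   # roughly -0.74
print("1st-percentile outcome, lockstep reality:  ",
      round(np.quantile(actual, 0.01), 2))    # roughly -2.33
# The tail is about three times worse than the model "knew" was possible.
```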
Strategic leadership is about dealing with what the economist Frank Knight called “that higher form of uncertainty” – the kind that, unlike measurable risk, is fundamentally unpredictable. And anticipating those tsunamis from outside the models takes imagination, not measurement and prediction.
Leaders in this century are going to have to imagine end-states in the future that are fundamentally different from today, and then tell stories back to the present to make sense of these plausible but alien-sounding outcomes.
(The terrifying truth is that, once you start doing this, you realize that it is quite possible to tell a very convincing and logical story for how we get from 2023 to almost any 2050 end-state imaginable – so we’d better start preparing for a wider range of possibilities.)
One of the conclusions I reach is that we are just terrible at this anticipation/imagination game, as a society – even grizzled scenario-planning consultants like me. WE NEED TO GET BETTER AT THIS. I outline the way we approach scenario planning in the book – without probabilities, challenging assumed causalities, telling stories back to the present – but I openly admit we are not very good at it yet, especially at getting it plugged into organizational planning processes.
Hence Fatal Certainty, which is a call to arms.
But I will say this – I did write a scenario about a coronavirus arising in China and spreading across the Pacific, killing hundreds of thousands of Americans, and fundamentally disrupting world trade and economic growth, and finally making video-conferencing a real business – in 2003. (The book opens with this scenario.)
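For readers who think in code, here is that 2003 scenario squeezed into the shape of the method I described above – no probability field anywhere, the challenged “facts” named explicitly, the story told backward from the end-state to the present. This is a sketch of the discipline, not the book’s actual template; the field names and backcast steps are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """One imagined end-state, with the story told back to the present.

    Note what's missing: a probability. The discipline is plausibility,
    not likelihood.
    """
    written: int                  # when the scenario was written
    end_state: str                # the future world, stated flatly
    challenged_facts: list[str]   # assumed causalities the scenario refuses
    backcast: list[str]           # the story, told backward to the present

coronavirus_2003 = Scenario(
    written=2003,
    end_state=("A coronavirus arises in China, spreads across the Pacific, "
               "kills hundreds of thousands of Americans, disrupts world "
               "trade, and finally makes video-conferencing a real business"),
    challenged_facts=[
        "Pandemics happen to other countries",
        "World trade is too robust to be seriously disrupted",
    ],
    backcast=[
        "Video-conferencing becomes a real business...",
        "...because restrictions push white-collar work online...",
        "...because outbreaks cross the Pacific faster than detection...",
        "...because a novel coronavirus emerges in China.",
    ],
)
```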
I’m no genius – others can achieve this type of anticipation as well. But we have to start imagining, rigorously.
(I also throw in the Oracle of Delphi, Nostradamus, and some other stuff to show surprising parallels between allegedly irrational ancient ways of foresight and the kind of arrogant goofiness that caused the Global Financial Crisis.)
That’s my premise, folks… whaddaya think?