February 5, 2024
The ignorance of even the best-informed investor about the more remote future is much greater than his knowledge, and he cannot but be influenced to a degree which would seem wildly disproportionate to anyone who really knew the future, and be forced to seek a clue mainly here to trends further ahead.
John Maynard Keynes, A Treatise on Money, 1930
The other week, I ran across a rather glowing review of Robert Rubin’s The Yellow Pad: Making Better Decisions in an Uncertain World, by an old classmate of mine, Dr. J. Bradford DeLong. Brad writes the following in his superb and torrential Substack:
Decision-making in the face of uncertainty. Robert Rubin is a master at it. In The Yellow Pad: Making Better Decisions in an Uncertain World, he tries to give a flavor of how he does it, and guide people in a direction in which they will be able to do it too.
So I decided I had to read this book, since Brad was generally acclaimed to be the smartest guy in our college class (though a recent Nobel in Chemistry by another classmate may make that competition interesting). Brad also says this:
Rubin’s approach throughout his career has been pragmatic, probabilistic, and introspective. What is actually likely to happen and how can we possibly affect it? What are the odds, really? And what biases are we bringing to the table that are likely to cloud and block our vision of the situation, our ability to affect it, and the real odds? This is especially important when one is reacting, because then one’s immediate judgment is likely to be bad. Good decisions are made by delaying response until more information is in, which gives an opportunity for “thinking slow” in Danny Kahneman’s terms.
And as far as Brad’s praise goes, I have to agree: Robert Rubin, for his time and even our own, is about as good as it gets with respect to decision-making, leading teams through uncertainty, and staying open to alternative viewpoints. He is among Keynes’ “best-informed” investors.
But as Keynes notes, even the “best informed” are more ignorant than knowledgeable about the future. And yet every decision is about the future. How does Rubin go about making decisions?
With a yellow notebook pad. Here is how Rubin describes his approach:
I would write out, by hand, a list of possible outcomes in one column, and my estimated odds of each outcome occurring in the other. When I worked in the markets, the potential outcomes could generally be expressed in dollars, so I could multiply a selection of possible outcomes by the probabilities of each and add up the numbers to find what economists would call the “expected value” of each decision. Then I could choose the course of action that provided the highest total.
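The yellow-pad arithmetic Rubin describes can be sketched in a few lines. The trades and numbers below are hypothetical illustrations, not anything from the book; the point to notice is the built-in assumption that the list of (outcome, probability) pairs is complete:

```python
# A minimal sketch of the yellow-pad calculation, with hypothetical numbers.
# Each candidate decision is a list of (payoff_in_dollars, estimated_probability)
# pairs -- and the probabilities are implicitly taken to sum to 1, i.e. the
# list of possible outcomes is assumed to be exhaustive.

def expected_value(outcomes):
    """Sum of payoff times probability over the listed outcomes."""
    return sum(payoff * prob for payoff, prob in outcomes)

trade_a = [(1_000_000, 0.5), (-400_000, 0.5)]    # hypothetical trade
trade_b = [(300_000, 0.75), (-100_000, 0.25)]    # hypothetical trade

# Choose the course of action with the highest expected value.
best = max([trade_a, trade_b], key=expected_value)

print(expected_value(trade_a))  # 300000.0
print(expected_value(trade_b))  # 200000.0
```

Note that an outcome that never makes it onto the list contributes exactly nothing to the sum, no matter how consequential it turns out to be.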
Sounds reasonable, right?
But it’s not, not really. Because it assumes that Robert Rubin already knows the “list of possible outcomes.” He doesn’t. How do I know? First, because I’ve been helping clients do “decision-making in the face of uncertainty” for over thirty years, and not only have I yet to meet one person who knows that list, but I have discovered that the “list” is an infinity of infinities, and that the odds of getting it right (outside, perhaps, of the spheres of trivial games, sports, and the like) asymptotically approach zero.
But also because of a couple of notable failures of his approach that he recounts, quite honestly and to his credit, in his own book. These are:
- “In 1980, the Goldman Sachs arbitrage department—which I was running at the time—achieved a rather dubious milestone. We lost more money in just one month than the entire firm had ever made in a full year. What happened, broadly speaking, was this: we had a large number of positions whose valuation was affected by the market’s widespread expectation that inflation would go up. Much to everyone’s surprise, the Fed chair at the time, Paul Volcker, raised interest rates dramatically, which sent inflation down.”
- “[A]t the meeting on September 12, 2007, …we were told that Citi had $43 billion of assets backed by subprime loans on its books, a substantially larger position than any other financial institution—and that, … these assets had not appeared on the ‘risk report’ shared with senior management and the board…. Across the financial system—from asset managers and the Federal Reserve to Washington policy-makers and journalists—only a handful of people saw the crisis coming. I regret that I wasn’t among them. In hindsight, if I had better seen what was coming, perhaps I might have been better able to call attention to it.”
Rubin makes several recommendations for avoiding such debacles in the future:
- He says that “a rigorous process of judging one’s prior actions is critical to learning lessons that will improve one’s future choices.”
- He says that leaders must be open to those who differ with leaders’ points of view.
- He says that rather than seeing risk as one number, leaders should see it as “a range”: “Decision-makers can pick a few key potential outcomes—outcomes that reflect the full range of risk—and then use their judgment, based on facts and analysis, to attach a probability and magnitude to each.”
The problem is that when you see risk as “a range,” you are assuming you already know which variable you ought to be worried about. And you are still looking at a graphical image that has no reality behind it.
It is very hard to let go of the idea that probabilities have any reality. They really do not. Our concepts of reality are not reality. They cannot capture the entirety of what may affect the outcomes we are concerned with.
Rubin’s last point above starts to approach this insight. Later he heads a chapter “Labels Are No Substitute for Thought.” But he does not proceed to the logical conclusion, which is that anything you put on a yellow legal pad is going to be just one more concept – a single way of analyzing an unknowable situation, when you should be generating multiple approaches (i.e., rigorously imagining) and taking each seriously on its own terms.
There’s such a rush to go to ANALYSIS, because the mathematically inclined know how to do that. You know what they stink at? IMAGINING. When faced with a blank sheet of paper, they panic. They must fill that paper with familiar numbers and diagrams, even if those numbers and diagrams are quite likely to end up being irrelevant at best and positively misleading at worst.
Rubin talks a lot about “fat tails.” But 2007-2008 was not about “fat tails.” Probability distributions are completely bogus, in a very important way. They LOOK real. “27%” or “83.2%” seem very concrete, because they are numbers. But they are imaginary. All of these probabilities resolve into either 1 or 0. Unless you can use the 27% or the 83.2% to sell off your risk to a willing buyer, they have no reality. And the 2007-2008 crisis came about not because bankers failed to appreciate that their investment risk curve had a certain “fatness of tail,” but because those bankers thought their valuations of mortgage-backed securities had any relationship to reality whatsoever.
The only way to ameliorate one’s radical uncertainty when dealing with decisions about an unknowable future is to imagine, rigorously, up front, a variety of frameworks for that future, and to concentrate not on probabilities, but on impacts. Because that is what you should care about. All probabilities resolve to zero or one, right? Only impacts on your world of work can possibly matter.
But to get to those impacts, good and bad, you need to step away from your immediate world of work. You need to figure out what few factors are critical to the continued thriving or destruction of your enterprise, and you need to vary them systematically, to paraphrase Keats, “without any irritable reaching after odds or probabilities.” Then you need to tell the story backward from the future as to how you got to that “good” or “bad” place (though “good” and “bad” increasingly lose their meaning as we move away), far enough into the future that you are removed from your current, tactical, probabilistic concerns.
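The exercise just described — identify the few critical factors, vary each systematically across its extremes, and take every combination seriously as its own future — can be sketched mechanically. The factor names below are hypothetical, purely for illustration; the structural point is that no probabilities are attached to anything:

```python
from itertools import product

# Hypothetical critical factors for an enterprise, each varied across
# plausible extremes. No probabilities attached -- only the settings.
factors = {
    "interest_rates": ["near zero", "sharply higher"],
    "credit_markets": ["liquid", "frozen"],
    "regulation": ["light touch", "strict"],
}

# One scenario per combination: a distinct future world, each to be
# fleshed out by telling its story backward from that future.
scenarios = [dict(zip(factors, combo)) for combo in product(*factors.values())]

for i, world in enumerate(scenarios, 1):
    print(f"Scenario {i}: {world}")

print(len(scenarios))  # 2 x 2 x 2 = 8 distinct futures
```

Each of the eight combinations is, in effect, its own yellow pad: the work lies not in the cross product but in imagining, rigorously, how each world came to be.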
Robert Rubin’s yellow pad, in the end, contains one explanatory framework for the future, with “a list of possible outcomes in one column, and my estimated odds of each outcome occurring in the other.” His generosity and openness are real, but they seem only to extend to allowing others to adjust the percentage chances and the “tails” of the distributions he has walked in with. When his guess is wrong, he is open to after-action analysis. But there are hundreds of ways to be wrong about something, and far fewer ways to be right. Closing one barn door after the horse has fled – in a barn with a thousand doors – may be rational, but it won’t solve the ultimate problem.
In the introduction to his book, Rubin says:
Rather than seeking to settle debates, my goal is to provide a framework for better debates—debates more likely to lead decision-makers to the best possible course of action.
But even “better debates” seem to presume a starting point and a decision based on a point prediction of what is likely to happen. (Someone presumably wins the debate, right?) And the starting point will almost always be the viewpoint of the senior person in the room, around which all the debate will revolve. But what if that starting point is dead wrong?
The 2007-2008 financial crisis is a perfect example of this. Rubin wishes that he had been among the few people on Wall Street who anticipated the crisis. But I would submit that even those people, glorified in Michael Lewis’ gripping book The Big Short, may have been simply lucky. They happened to stumble across the right data, with the right mental makeup and inclinations, and they made a prediction that things would not go well. They happened to be “right.” But have they gone on to be right about other things? Have any of the amazing characters in that book gone on to achieve the status even of a Robert Rubin? If they have, I’m not aware of it.
If Wall Street really wanted to avoid a repeat of 2007-2008, they would make it part of their jobs to rigorously imagine and flesh out multiple scenarios of the future of finance. They would identify what factors will be determinative of their (and our) long-term future, and they would systematically vary them, without respect to probabilities at all – only impacts. They would not have one yellow legal pad with one list of outcomes. They would have multiple “yellow pads,” with entirely different future worlds included in each. Then they would tell a story of how each of these future worlds came to be.
But at this point, there may be no percentage in doing a better job at decision-making under fundamental uncertainty. If everyone on Wall Street is thinking in this linear way, why bother imagining something else? Finance in our age seems to be a contest between a very few, ultimately ill-founded, predictions of the future values of financial instruments. My quote from Keynes at the top of this blog piece continues:
…But if this is true of the best-informed, the vast majority of those who are concerned with the buying and selling of securities know almost nothing whatever about what they are doing. They do not possess even the rudiments of what is required for a valid judgement, and are the prey of hopes and fears easily aroused by transient events and as easily dispelled.
Some Robert Rubin has written something on his legal pad, and “the vast majority” pile on to his analysis, with their and their clients’ money, until the yellow pad is ultimately (and inevitably) shown to be manifestly no longer in accordance with reality, and a lot of those people lose their shirts.
So, is Robert Rubin really a “master” at “decision-making under uncertainty”? Maybe in comparison to those who “know almost nothing whatever about what they are doing.” Which, unfortunately for all of us, may include the vast majority not only of humanity but even of those who specialize in managing our portfolios. “In the land of the blind, the one-eyed man is king,” goes the saying. But in the land of the blind, far too many one-eyed men don’t even bother to open their eyelids. Why should they, if their actual competitive advantage is in their instinctual ability to sway masses of the blind to follow their ill-founded predictions?
Numbers, far too often, kill thought. Brad DeLong says, rightly:
The call for pragmatism and probabilities is, I think, especially welcome here and now. Right now, far too much of American politics in particular and public life in general is simply stupid—devoted to rallying troops via emotional appeals without thinking whether these are the troops to be rallied and whether these are really the hills to take and to die on. This, in Rubin’s view, has massively undermined our ability to make good collective decisions, as learning and analysis are submerged under “rushing toward absolutes and simplistic answers”. These habits are poison for curiosity, receptiveness, recognizing the real trade-offs, plus the “what, really, are the odds?” question—“probabilistic thinking”, “thinking probabilistically”, and very similar phrases show up on more than one-tenth of the pages of the book.
It is true that in 2024 there are far cruder ways to do decision-making than Rubin’s probabilistic approach, just as there were less elegant ways for Croesus of Lydia to decide whether to go to war against the Persians than to go ask the priestess of Apollo at Delphi what she thought. But “thinking probabilistically” inevitably assumes that you already know the full range of realities you are facing – in other words, that the possibilities you have listed sum to a probability of 100%.
And knowing that the entire edifice of deregulated and uncontrolled derivatives trading upon which Citibank had been built might just explode was both (a) a plausible future that Citibank’s top management might have wanted to explore in the early 2000s, and (b) one that was never, ever, ever going to show up on anyone’s probabilistic “list of possible outcomes” yellow pad before the whole place went kablooey. That knowledge would have been far more important than anything appearing on Rubin’s legal pad at the time.
Why are even dumber, more populist, emotion-driven ways of deciding arising now? I would submit the following as a possibility: the mass of Americans have not been served well by experts who are numbers-obsessed but imagination-impoverished. Far too often they have been told with a false certainty and arrogance that experts know what the future will bring.
Keynes is very famous for another quote:
The long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is past the ocean is flat again.
John Maynard Keynes, A Tract on Monetary Reform, 1923
I would paraphrase: “Leaders set themselves too easy, too useless a task if, when dealing with fundamentally unpredictable phenomena, they can only tell us what the answer would be if the phenomena in question were, in fact, predictable.”
The long run can be, in fact, quite useful as a guide to current affairs, in another sense. If one abandons probabilistic thinking, and focuses purely on impacts, one can use the long run to tell convincing stories as to how seemingly quite improbable things can occur that will have huge impacts on our lives and work.
And though it may not yet pay Wall Streeters to take rigorous imagination seriously, I would insist that for society as a whole, and for great nations, this “negative capability” (Keats again) is absolutely essential. This millennium so far has been an almost unremitting series of shocks to the United States and the developed western world. Terrorism, war, financial and economic meltdown, and deadly pandemics have rolled in like the multiple waves of one huge tsunami, destroying fortunes, disrupting and even ending lives, fraying our societal cohesion, and making us doubt that there is a positive future to be had.
Not one of these shocks appeared on a Robert Rubin-style yellow pad. None were, nor could they have been, anticipated via “probabilistic thinking,” because, like every Powerball win in history, each was utterly improbable ex ante, even though some equally improbable outcome or other was inevitable.
But each could have been, and even was, anticipated through rigorous imagination. Our organization has written scenarios of terrorism, war, economic catastrophe, and pandemic for clients before any of these things occurred in reality. It can be done. Will it be done, before the next wave hits?