Blinders, Blunders and Wars is a smart, rational book. Perhaps too rational. The authors — David Gompert, Hans Binnendijk and Bonny Lin — served in every US administration from Nixon to Obama and held four National Security Council appointments between them. This is heavyweight insight into American thinking: a typically RAND-esque critique of how 'strategic-decision models' are formed, and of how flawed mental frameworks have misinformed leaders throughout history.
In this thinker's guide to avoiding dumb war, the authors examine a dozen crises from Napoleon to the Iraq invasion. They find that bad, avoidable decisions ('blunders') often stem from institutional barriers to processing information ('blinders') and from leaders with 'unwarranted faith in their ability to control events.' The authors' cognitive-science approach does not judge history with hindsight but rather examines what information was available at the time and how it was used to form decisions:
At the heart of every blunder is a flawed cognitive representation [of] how the world works, of circumstances at hand, of variables that determine the future, of choices available, and of expected results.
The greatest challenge is to avoid cognitive biases, in particular the exclusion of inconvenient facts that might challenge deeply held beliefs (the things we know for sure that just ain't so).
Don't mistake strategic blunders for 'accidental war.' Geoffrey Blainey has convincingly argued that there's no such thing. In Asia these days there is worry about rogue pilots and captains, as if a mere spark could set the 'tinderbox' alight. The RAND authors are careful to note that 'misjudgments and miscalculations are different from accidents.' They are more concerned with how the conditions for antagonism are constructed in the first place.
They attribute the blunders found in their twelve historical cases to three root causes: (1) the intuitive leader, (2) the blinding idea, and (3) the 'only apparent option'.
The intuitive leader
'Attributes often associated with strong, inspiring decision-makers — persuasiveness, resolve, boldness, certitude, command of loyalty, unity, absence of doubt, clarity — may overpower reasoning', the authors write. Charismatic leaders are consistently too optimistic, expecting adversaries to adhere to their script. They exhibit a puzzling failure to anticipate asymmetric responses. They perceive lower risks and have excessive confidence in their ability to control situations.
It is easy to recognise modern-day examples. America not long ago had George W Bush, known as 'the decider', whose confidants 'already knew the answers; it was received wisdom.' China's leader today is also a man of great moral certainty who self-identifies as a military commander. His doctrine that 'we will surely counterattack if attacked' is the language of personal resolve, not strategic reassurance.
The blinding idea
The 'blinding idea' is the rigid ideological belief about how world order is, should be, and will be constructed: 'The stronger the belief [that] the future is predestined, the weaker is the force of new information and the greater are the probability and scale of blunder.'
Manifest Destiny, the China Dream, the Washington Consensus: all are examples of such 'psycho-strategic' paradigms. In the modern-day Asia Pacific, the RAND authors say, groupthink around each superpower's central vision could hinder rationality. Washington today echoes with a cacophony of views (some claim that whatever American consensus on China once existed is now collapsing, or turning belligerent), but outsiders are struck by the taut unanimity of Chinese rhetoric.
Interestingly, the authors cite Japan's 1941 pre-emptive attack on America as the prime example of 'arrogance, egotism and hubris...based on conformity, obedience, and intuition.' This conclusion is debatable. Pearl Harbor more persuasively belongs to the third family of blunders.
The only apparent option
Under a growing US embargo and facing encirclement, a lack of perceived alternatives drove Tokyo's thinking. With Japan's generals in control, all pathways to empire looked like military ones. True, it was the 'blinding idea' of imperial expansion that led them there, which is what makes this third type of blunder the most difficult to analyse.
According to RAND, today's superpowers understand and trust each other little, and both hold subjective views of the other that can be corrected through exchange and interaction. But a century ago, Britain and Germany knew each other intimately, suggesting that their conflict was not simply some intellectual error. Instead, at least one side saw war as unavoidable and even desirable.
Is it possible that China and America could understand each other perfectly, and still clash? What if the problem is not one of information or imagination, but of pure stubborn interest? RAND would argue that calamity still arises from those three informational failures. But with the recent passing of John Nash, we are reminded that even enlightened, rational actors making 'maximising' decisions can end up in lose-lose outcomes.
Photo by Flickr user Jean Francois Chenier.