Published daily by the Lowy Institute

It’s complicated: psychology and national security decisions

Thinking about the plethora of security threats all at once is hard – which can increase the risk of bias intruding.

Humans are naturally averse to complexity: Tokyu Plaza Omotesando Harajuku, Shibuya‑ku, Japan (Ivy Barn/Unsplash)

The claim that Australia’s current national security environment has few precedents has almost become a cliché. Home Affairs Minister Clare O’Neil most recently asserted that “Australia faces the most dangerous set of strategic circumstances since the Second World War” and that “there would be few five-year periods in which Australia’s national security picture has changed so much”.

But what really distinguishes Australia’s current predicament from earlier eras? It’s not necessarily danger: the Cold War, which dominated Australian national security for decades, was more dangerous than many now remember. It turned hot in Australia’s region more than once. Australia has been at war many times since 1945.

Australia’s national security environment, however, may now be more complicated than ever. As O’Neil points out, the spectrum of threats ranges from the planetary threat posed by climate change through cyber threats to new mutations of political extremism. It also encompasses strategic competition between the United States and China, which in itself poses far more complex challenges for Australia than the Cold War did, not least because of US domestic volatility.

When it comes to China, Prime Minister Anthony Albanese has reiterated “we will cooperate where we can, disagree where we must, and act in the national interest”. This echoes US Secretary of State Antony Blinken’s neat formulation of Washington’s China policy as being “competitive when it should be, collaborative when it can be, and adversarial when it must be”.

All parts of the national security bureaucracy are susceptible to cognitive biases.

This approach sounds relatively straightforward, yet it’s proving hard in practice, partly because old distinctions between economics and security are being replaced by blurry geopolitics. Cooperation on trade, public health, and climate change has been overtaken by geoeconomics, vaccine nationalism, and guarding of green technologies. The resulting tensions between competition and cooperation are at least recognised by the latest US National Security Strategy. But they’re far from resolved.

The gaps in the US strategy show just how hard it is to think about climate change and China at the same time, let alone the plethora of other security threats. It’s much easier to focus on a single issue, even a complicated one such as China, than it is to ask the larger questions. How should Australia make sense of its environment, for instance, hedge against uncertainty, or decide where to cooperate and prioritise scarce national security resources for competition?

That may be because humans are naturally averse to complexity. We are hardwired to avoid ambiguity and uncertainty. We prefer to seek cognitive closure as quickly as possible. There are good evolutionary reasons for this. People need rules of thumb to get by in the world, especially when they need to think fast. Mental shortcuts can sometimes produce the best decisions: experienced firefighters develop an intuitive sense of when to get out of a burning building before it collapses.

But mental shortcuts more often impair decision-making. In the 1970s, the Israeli psychologists Daniel Kahneman and Amos Tversky began showing how these shortcuts (or simplifying heuristics) produce systematic errors (or cognitive biases). An early and famous study showed how framing a respondent’s circumstances as “winning” or “losing” could easily influence their risk appetite. Respondents were much readier to take risks when they thought they were losing, and vice versa. Following Kahneman and Tversky, ever more cognitive biases have been identified. Some of them, such as “groupthink”, have become common knowledge.

Deciding where to cooperate and prioritise scarce national security resources for competition? Gold Coast, Australia (Josh Withers/Unsplash)

What does this psychology have to do with Australian national security? International relations theory normally assumes that states are fundamentally interest-maximising rational actors and so, when it comes to foreign policy decisions, personal factors are of secondary importance at most. But the more that is learned about how governments actually make these decisions, the weaker that assumption appears. The role of groupthink in the 2003 decisions by the United States and United Kingdom, supported by Australia, to invade and occupy Iraq has been well documented.

All parts of the national security bureaucracy are susceptible to cognitive biases. In the aftermath of faulty assessments of Iraq’s weapons of mass destruction program, Five Eyes intelligence agencies endeavoured to minimise the distorting influence of cognitive biases, chiefly through the use of more structured analytic techniques. But the benefits of these improvements will be lost if the same biases are simply reintroduced at the decision-making stage. Can that be prevented?

Daniel Kahneman’s most recent book, Noise: A Flaw in Human Judgement, co-authored with Olivier Sibony and Cass Sunstein, tackles this issue. The authors describe how to reduce the influence of both cognitive bias and “noise” (identifiable flaws in decision-making that cannot be attributed to known cognitive biases). They advocate “decision hygiene”.

When you wash your hands, you may not know precisely which germ you are avoiding – you just know that handwashing is a good prevention for a variety of germs … Similarly, following the principles of decision hygiene means that you adopt techniques that reduce noise without ever knowing which underlying errors you are helping to avoid.

The specific process they advocate, which they call the “mediating assessment protocol”, is not especially onerous. It could be adopted by the National Security Committee of Cabinet when grappling with complex problems. In essence, it requires considering the various dimensions of a given problem as discretely as possible before the moment of decision is reached.

The main purpose of this staggered approach is to counter that natural impulse to simplify. A common cognitive bias is the tendency to “satisfice” – that is, to choose the first “good enough” option. Deciding that Australia should acquire nuclear-powered submarines simply because they are “more effective” would be an example of satisficing.

The mediating assessment protocol aims to reduce the “excessive coherence” sometimes imposed on complex problems. It would be especially useful for the sorts of complex problems Australia’s national security environment will generate. It would better ensure that security, economic and environmental dimensions are factored in when decisions are made, and it would help prevent decision-makers losing sight of long-term strategic objectives as they grapple with the pressures of a more complex and competitive world.
 

Ben Scott’s Lowy Institute Analysis “Sharper choices: How Australia can make better national security decisions” was published this week.



