The proper functioning of liberal democracies relies on a healthy information environment. For citizens to make well-informed, free choices, public discourse must be authentic and (for the most part) based on facts.
If such an idyllic vision ever existed, it now feels further out of reach than ever. Nonetheless, when the Lowy Institute’s Ben Scott argues that “liberal democracies should … resist the temptation to disseminate disinformation in situations short of war”, it’s hard to disagree. A core purpose of the national security and foreign policy of states such as Australia should be to protect and uphold the integrity of the institutions that underpin liberal democracy.
The trouble is that the temptation for democracies to play cynically in the information domain arises when actors with fewer scruples attempt to distort democratic discourse or use free speech to advance their own dangerous agendas. For Australia to behave similarly would not only mean ceding the moral high ground; it would undermine a principled commitment to the inherent value of authentic public discourse. By adopting the tactics of illiberalism and authoritarianism, Australia and other liberal democracies risk fuelling a race to the bottom in which competing falsehoods vie for short-term advantage, all the while eroding confidence in democratic institutions.
The array of informational threats posed by malign actors, however, demands that countries such as Australia take an active approach to defending their interests. But as Scott says, Australia needs a “strategic compass” for its information operations. How far, for instance, should Australia go to push back against Beijing’s disinformation campaign on Canberra’s one-China policy? Is simply correcting mistruths enough? Or should Australia sometimes purposefully bend the truth?
To start answering this call for a “strategic compass”, I propose three principles for Australian information operations. These apply to the “sharper” end of state information activity: operations aimed at cynically manipulating public discourse, seeding disinformation, and leveraging institutions to advance illiberal agendas. They would not cover, for instance, conventional public diplomacy, government information campaigns, or political speech.
First, any such operations should be defensive and proportionate. Australia should only ever act in response to a threat posed to it by another actor (or to prevent an immediate threat from materialising). Actions should aim to eliminate or counter the disinformation or nefarious manipulation of public discourse – but no further. This means developing transparency protocols that identify falsehoods and set the record straight. It also means shutting down inauthentic voices. The Foreign Influence Transparency Scheme, meanwhile, is a good first step to making institutions of credible knowledge generation, such as universities, more robust.
The second, related principle is that information operations should not be used to achieve broader foreign and security policy aims. Wartime aside, Australia should never actively disseminate disinformation, even against those who perpetrate it. Further distorting the information environment of another country, even an autocracy without a free press, is antithetical to the liberal values Australia and other democracies profess to defend. Moreover, in a deeply interconnected world, disinformation sown in one place can be reaped almost anywhere else.
This is qualitatively different to offensive cyber operations against industry or infrastructure, for example. Though costly and serious, shutting down an electrical grid doesn’t go to the essential character of a nation in the same way that manipulating public discourse does.
The more difficult question is around deterrence. What if the best way to stop or deter a disinformation campaign in Australia is to mount one against the perpetrator? Given the high premium that, say, the Chinese government places on controlling its own information environment, manipulating certain pressure points – such as discontent around Covid restrictions – could be a valuable source of leverage. In theory, this could send a message: “don’t mess with us, or else we’ll mess with you.” Doing so, however, would create a bigger problem: legitimising disinformation tactics and suggesting that Australia does not see public discourse as sacrosanct.
Finally, counternarratives – while effective and necessary – should be employed carefully. Often, the subtler but more potent form of disinformation comes as a narrative: a compelling story about why things are, or should be, a certain way. The power of a disinforming narrative is that it harnesses and gives coherence to latent beliefs or grievances, while drawing credibility from its selective use of real evidence. Countering such a narrative is hard because it requires more than simply disproving a factual claim or picking apart a distortive story. Instead, it can require establishing and spreading an alternate narrative for people to believe.
Having worked on countering violent extremism in the Australian community, I am familiar with this challenge. Simply rebutting the claims of Islamic State was not enough to stop its appeal to audiences with a pre-existing set of beliefs. What was needed instead was an alternate narrative that acknowledged the legitimacy of certain grievances while providing a more constructive vision for a reconciled Muslim Australian identity.
The lesson here is that for a counternarrative to be consistent with liberal democratic values, it must have a very high standard of factual accuracy, while its origin and motivations must be transparent. In short, a counternarrative should aim to reinform the disinformed, not displace one distorted narrative with another.
With disinformation threats growing, the Australian government would benefit from articulating principles such as these. Doing so would generate a trust dividend, reassuring the Australian people that disinformation is being addressed and that the risks to liberal democratic principles are being taken seriously.