Disinfopreneurs and infodemics

“Black ops” social influence campaigns are big business. Some difficult questions and a few bold ideas are on the table.

The attraction of social media is its capacity as a low-cost, effective, unlimited channel for campaigns of influence (Camilo jimenez/Unsplash)

The use of social media in influence campaigns, including grey-zone activities and hybrid warfare, is becoming more complicated, more diverse, more profitable and more dangerous.

This is being led, in part, by the privatisation and industrialisation of “weaponised information”.

Variously referred to as information disorder or an infodemic, deceptive social media influence operations mutate constantly, affect societies differently depending on political and economic factors, and face a growing list of potential mitigating responses.

Cheap, targeted, effective

Social media began as an irrelevance, then a distraction, then a liberation, then an addiction, then a threat to democracy and a public health menace – a boon to extremists and authoritarians and activists and dodgy snake oil sellers.

Now it’s all of these and it’s everywhere and it’s exhausting.

The popularity of social media platforms, their unfathomable networked communications structure, the ease with which content can be created and re-created, the tech companies’ irrepressible desire for unrestricted growth, and the tardy regulatory responses to the effects of all these factors together form the bedrock of the platforms’ ubiquitous and uncontainable influence.

The fundamental attraction of social media is its capacity as a low-cost, effective, unlimited channel for campaigns of influence and persuasion.

Content can be created cheaply by anyone and distributed at a speed and scale that makes moderation difficult. Audiences can be micro-targeted with precision; messages can be honed through A/B testing – a simple controlled experiment in which two message variants are shown to comparable samples and their performance compared. All of this can be done while largely evading detection or accountability.
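To make the mechanics concrete, here is a minimal sketch of the statistical comparison behind that A/B testing step – two message variants shown to equal-sized audience samples, with the difference in click-through rate tested for significance. The counts are invented for illustration; no real campaign data is involved.

```python
# A sketch of the A/B testing logic described above. The counts are
# invented for illustration; no real campaign data is involved.
from math import sqrt, erf

def ab_test(clicks_a, shown_a, clicks_b, shown_b):
    """Two-proportion z-test on the click-through rates of variants A and B."""
    p_a, p_b = clicks_a / shown_a, clicks_b / shown_b
    pooled = (clicks_a + clicks_b) / (shown_a + shown_b)
    se = sqrt(pooled * (1 - pooled) * (1 / shown_a + 1 / shown_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Variant B's message outperforms variant A's on equal-sized samples.
p_a, p_b, z, p = ab_test(clicks_a=230, shown_a=10_000, clicks_b=310, shown_b=10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.4f}")
```

A small p-value tells the operator that variant B’s edge is unlikely to be chance, so B becomes the message that gets amplified.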

These methods have boomed and bloomed. The social influence industry has undergone a kind of Cambrian explosion – a burst of rapid diversification from which ever more sophisticated forms have evolved – over the last few years, taking advantage of the opportunities afforded by Covid-19, distrust in authority, and advances in digital marketing and surveillance-based data gathering, for profit and/or power.

The breach of the US Capitol in Washington DC on 6 January, as Congress met to certify the 2020 presidential election result, was fuelled by social media (Saul Loeb/AFP via Getty Images)

We are now beginning to better understand that social influence campaigns via social media can destabilise strong democracies while supporting strong authoritarian regimes. Earlier hopes that social media might have liberalising impacts in repressive societies have been dashed – the 2011 Arab Spring being the principal case in point.

The malign hand

Social media platforms are also allowing violent extremist groups to become tactically agile – the same groups behind the 6 January attack on the US Capitol are now veering away from mass gatherings towards organising, coordinating and fundraising online.

A further complicating factor is the explosion in number and type of professional actors in the disinformation industry. These individuals pose threats in their own right and are also being contracted by state actors to conduct campaigns at arm’s length, similar to the use of private military contractors – it’s efficient, probably cheaper and certainly more deniable than using state apparatus.

The rise of these individuals – let’s call them “disinfopreneurs” – has been dramatic. In 2016, the online world marvelled at a Macedonian cottage industry of ideologically agnostic grifters making more than $100,000 by posting uber-popular pro-Trump “fake news” and raking in the advertising revenues.

In 2021, it’s much worse. A small motley crew of anti-vaccination disinfopreneurs are allegedly making millions for themselves – and, through advertising, billions for the social media platforms – by hawking dangerous promises to desperate people. Despite the US Congress calling on the tech giants to rein in these Covid disinfopreneurs, the platforms have largely failed to do so.

An even more worrying future scenario of online manipulation is based on automation and artificial intelligence: influence operations at an industrial scale. One such example was spawned in Taiwan – a “Content Farm Automatic Collection System” in which internet servers crawl the web for text from existing articles and use AI to re-package it into new content for mock websites, which is then automatically promoted via fake social media accounts into the newsfeeds, messaging inboxes and search engine results of its target audiences.
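One reason such re-packaged content is detectable, at least in principle, is that it carries near-duplicate textual fingerprints of its source articles. The following sketch uses a standard near-duplicate technique – character shingles plus Jaccard similarity – not the Taiwanese system itself, and the sample sentences are invented.

```python
# A standard near-duplicate detector: character "shingles" plus Jaccard
# similarity. The sample sentences below are invented for illustration.
def shingles(text, k=8):
    """Return the set of overlapping k-character substrings of the text."""
    text = " ".join(text.lower().split())  # normalise case and whitespace
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

original = "Health officials warned today that the new variant spreads faster."
farmed = "Officials warned today the new variant spreads much faster."
unrelated = "The central bank held interest rates steady this quarter."

print(jaccard(original, farmed))     # high score: likely re-packaged text
print(jaccard(original, unrelated))  # low score: independent text
```

The arms race, of course, is that AI paraphrasing keeps pushing the re-packaged score down towards the background noise.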

Another model is based on the exploitation of manual labour, including so-called “troll farms” that use employees to influence political opinions and harass journalists online. In some cases, the labour force is made up of university students supporting their studies on a salary equivalent to that of a McDonald’s cashier. In the Philippines, it is digital communications and marketing professionals operating in a business that offers cash, thrills, camaraderie and power.

Social influence campaigns via social media have the power to destabilise democracies while supporting authoritarian regimes (Jason Howie/Flickr)

The so-called “Black PR” or “digital black ops” companies currently using these methods sell their services to companies, political parties and governments. Their business models are built around influence campaigns that manufacture a sense of authenticity and familiarity, using skills in deception, distraction and persuasion.

Combating a growing threat

Business is booming and includes government clients paying firms to run foreign interference campaigns. Social media companies can and do detect and remove these accounts, but Facebook’s head of cybersecurity policy, Nathaniel Gleicher, has admitted the “professionalisation of deception” is a growing threat.

In the face of this chaos, what is to be done?

  • Don’t (only) re-fight the last war. While the threat of a Russian-style “firehose of falsehood” disinformation campaign remains – one that pushes high-volume, repetitive messaging across a large number of digital channels – it is now more likely that such efforts would be detected. The game has moved on.
  • Do reach out, engage, partner with like-minded entities – international problems require international solutions. This is an area where Australia can offer leadership. It is in its interest to do so. Australia is ahead of the game compared to some of its neighbours and friends.
  • Be open to learning from and collaborating with international partners – nations, non-governmental organisations, researchers and companies.
  • Focus on new forms of resilience and resistance. Some of these will be technological – detecting deceptive content, and tracking and tracing disinformation to its source/s in order to attribute responsibility (a minimal illustration follows this list). Some will be social – declining trust in institutions and in political leadership is a major contributor to people seeking out, and believing, alternative sources of information.
  • Address the trust deficit. This is both difficult and necessary. It requires an appreciation that people believe in groups. Communication, including disinformation, is a social act.
  • Conduct influence campaigns. It is inevitable, and desirable, that Australia conducts public diplomacy campaigns, as well as continuing to develop counter foreign interference capabilities. This is a live issue here in Australia, with several government agencies now involved in investigating how to adapt and respond.
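On the technological point flagged above, one widely used approach is to look for coordinated inauthentic behaviour rather than judging content itself. Here is a minimal sketch, using made-up posting records, of flagging account pairs that repeatedly share the same link within seconds of each other; real systems combine many more signals before any attribution.

```python
# Flagging possible coordination from posting records. The records are
# made up; real systems use many more signals before any attribution.
from collections import defaultdict
from itertools import combinations

posts = [  # (account, shared URL, timestamp in seconds)
    ("acct_a", "example.com/story1", 100),
    ("acct_b", "example.com/story1", 104),
    ("acct_a", "example.com/story2", 500),
    ("acct_b", "example.com/story2", 503),
    ("acct_c", "example.com/story9", 900),
]

WINDOW = 30  # near-simultaneous sharing (seconds) suggests coordination

by_url = defaultdict(list)
for account, url, ts in posts:
    by_url[url].append((account, ts))

pair_hits = defaultdict(int)
for url, shares in by_url.items():
    for (a1, t1), (a2, t2) in combinations(shares, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW:
            pair_hits[frozenset((a1, a2))] += 1

# Pairs that repeatedly co-post the same links in tight windows are
# candidates for closer human review, not automatic attribution.
for pair, hits in sorted(pair_hits.items(), key=lambda kv: -kv[1]):
    if hits >= 2:
        print(sorted(pair), hits)
```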

Countering disinformation is one thing; engaging in interference campaigns is another. How far should Australia go? Is it acceptable to deceive adversaries? What are the costs in terms of trust and reputation? What limits should exist, and how should they be observed and accounted for?

These questions, and more, need to be considered well beyond the digital landscape. They should be posed as questions of military doctrine, where the rules of warfare need to evolve and adapt to a world littered with disinfopreneurs and infodemics.



