
How social media aids repressive regimes and undermines democracy

There are few limits on means and plenty of opportunities to manipulate politics via social media.


Remember the Arab Spring? Bliss it was in that dawn to be alive. But to be a social media activist was very heaven. Back then, social media platforms such as YouTube, Facebook and Twitter were hailed as vehicles for democratic uprisings. Enthusiasts gave TED talks and published in prominent foreign policy journals about how social media could 'strengthen civil society and the public sphere'. The internet and the smart phone promised political freedom.

Maybe they weren't wrong then, but things are different now. Social media has achieved pariah status. Like Saturn, the internet revolution is devouring its children.

Freedom House's latest Freedom on the Net report details a decline in internet freedom for the seventh year in a row. This year's report differs from previous years, when issues like privacy, access and censorship dominated: the focus now is on the deliberate manipulation of online content.

The report follows similar recent work by Oxford's Computational Propaganda Research Project and Sam Woolley at the Digital Intelligence Lab in cataloguing the breadth, variety and impact of social media manipulation: the use of paid and unpaid online agents, as well as bots (automated accounts), to post, share, like, quote and re-post content with a view to influencing politics.

The three main tactics are feigning grassroots support ('astroturfing'), smearing opponents and disrupting online campaigns through distraction. Bots greatly amplify the impact: a single human operator can direct hundreds of bots to generate thousands of posts and comments automatically. That volume of content and interaction pushes material to the top of the pile, making it more likely to appear in users' feeds. Bots increasingly use machine learning to mimic human behaviour, and they are getting very good at it.
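
To put rough numbers on the amplification described above, here is a minimal, purely illustrative Python sketch. Every figure in it (operators, bots per operator, posting rates) is an assumption chosen for the example, not a number taken from the Freedom House report or the Oxford research.

```python
# Back-of-the-envelope illustration of bot amplification.
# All numbers below are assumptions for the example, not reported figures.

ORGANIC_POSTS_PER_HOUR = 2_000   # genuine users posting on a given hashtag (assumed)
OPERATORS = 10                   # human operators running the campaign (assumed)
BOTS_PER_OPERATOR = 300          # automated accounts each operator directs (assumed)
POSTS_PER_BOT_PER_HOUR = 12      # scripted posts, shares and replies per bot (assumed)

bot_posts = OPERATORS * BOTS_PER_OPERATOR * POSTS_PER_BOT_PER_HOUR
total_posts = bot_posts + ORGANIC_POSTS_PER_HOUR

print(f"Automated posts per hour: {bot_posts:,}")                             # 36,000
print(f"Automated share of hashtag activity: {bot_posts / total_posts:.0%}")  # ~95%
```

On assumptions in that range, a handful of operators accounts for the overwhelming majority of the activity a trending algorithm sees, which is why even crude bot networks can push a hashtag up the rankings or drown an organic campaign out.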

A common tactic uses bots to counter antigovernment hashtag campaigns, either by promoting alternative hashtags (bumping others down or off the list of trending hashtags) or by posting nonsense or irrelevant material under the antigovernment hashtag, rendering it useless (a technique known as 'hashtag poisoning').

The report has plenty of examples. Sudan has a unit in its National Intelligence and Security Service known as the 'Cyber Jihadists'. Reports from the Philippines refer to the government's 'keyboard army'. In Turkey, some 6000 trolls are reportedly enlisted to manipulate discussions and harass opponents of the Erdogan government. In Mexico, 75,000 automated accounts known as 'Peñabots' have been employed in hashtag poisoning campaigns.

In China, where state employees are supported by a vast army of online patriots, the preferred modus operandi, according to research from Harvard, Stanford and the University of California San Diego, is to avoid argument in favour of distraction and changing the subject when controversial topics arise. To achieve this, the government generates an estimated 448 million false posts per year.

In our region, Freedom House lists the 2018 elections in Cambodia, Malaysia and Thailand as being at high risk of interference through internet restrictions.

The role of social media in the 2016 US election, currently the subject of congressional investigation, is the most high-profile and widely reported example of apparent manipulation, mostly through fake news and timed leaks. It differs from the cases above in that the key manipulator wasn't the US government. Instead, a rogues' gallery of professional political operators and hyper-partisan activists, opportunistic entrepreneurs (many located in Macedonia – who saw that coming?), and foreign agents (specifically, Russia's Internet Research Agency, or IRA) appear to have either colluded, acted separately but with shared interests, or simply taken advantage of circumstances.

Facebook's response includes developing a tool for users to check if they have been unknowingly following IRA accounts. Well, one, that horse has bolted and, two, if the IRA can't hide their efforts via sub-contractors and shell accounts, they aren't worth two kopeks.

The 2016 Brexit referendum and recent European elections are under similar clouds. Could it happen in Australia?

The occasional problematic online advertisement during election campaigns aside, there are probably sufficient checks in place to prevent an Australian government from running operations like those profiled in the Freedom on the Net report. Freedom House lists Australia as 'free'.

But as for manipulation by foreign actors and by sub-contracted or independently operating political activists, it absolutely could happen. I think it almost certainly will, and it has probably already occurred and gone undetected, or at least unreported.

A comprehensive analysis of US and USSR/Russian interference in foreign elections by Dov Levin at Carnegie Mellon University suggests that meddling has been widespread and often effective since 1948, with the US the main historical culprit. Oceania, says Levin, is the only region that has escaped. Perhaps we're due?

Three things are required for Australia's democracy to be manipulated via social media: means, motive and opportunity. The means (the online tools, including bots) are readily available, and the opportunities (the social networks themselves) are ubiquitous and popular.

Both are practically impossible to police. The recent Electoral and Other Legislation Amendment Bill extends existing laws on electoral matter (material designed to influence elections) to online and social media, requiring the authors of social media posts to be identified in order to prevent astroturfing. The Parliamentary Library's digest on the Bill notes the difficulty of enforcing this.

Presently, there's little prospect of preventing anyone with even a limited budget or skillset from attempting to manipulate politics via social media. The US-based Council on Foreign Relations is more optimistic, but some of its recommended solutions, such as making government responses 'as interesting as the fake news they are countering', read like wishful thinking.

In short, as it stands, there are few limits on means and plenty of opportunities.

This leaves motive. Who would be interested in covertly exerting influence over Australian politics?



