Published daily by the Lowy Institute

After Christchurch: Mapping online right-wing extremists

For all the hate, sometimes extending to talk of violence, the extremist milieu is also a highly social space.

SoulRider.222/Flickr

When Australian man Brenton Tarrant carried out a mass-shooting terror attack last year at two mosques in Christchurch, New Zealand, guns were not his only weapon. Before opening fire, he posted links to a Facebook live-stream of the attack and uploaded a self-penned manifesto to 8chan’s “/pol/” board. As Tarrant killed 51 people and wounded 49, he effectively weaponised social media platforms to broadcast his propaganda to a global online audience.

In the wake of this international terrorist incident, researchers from Macquarie University in Sydney, in collaboration with Victoria University in Melbourne, mined and analysed data from six social media platforms to examine the spread of right-wing extremist sentiment across New South Wales, Australia’s most populous state, where Tarrant had lived before travelling in Europe. A mixed-method research design was used to investigate this previously unstudied online community.

From August to November 2019, anonymised data was collected from Twitter (37,422 tweets from 3,321 users), Gab (1,357,391 toots from 23,836 accounts) and a sample of archived message boards on Reddit, 4chan and 8chan (now named 8kun). In addition, ten years of historical data was sampled from 30 Australian right-wing extremist Facebook group pages.
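
As a rough illustration of the kind of tallying such a collection involves, the sketch below counts posts and unique anonymised accounts per platform in Python. The record layout, field names and sample values are assumptions for illustration only, not the study’s actual schema or pipeline.

from collections import Counter, defaultdict

# Illustrative sketch only: tallying posts and unique (anonymised) authors per platform.
# The record layout and sample values are assumptions, not the study's actual schema.
records = [
    {"platform": "twitter", "author_hash": "a1f3", "text": "#auspol ..."},
    {"platform": "gab",     "author_hash": "9bc0", "text": "#MAGA ..."},
    {"platform": "twitter", "author_hash": "a1f3", "text": "..."},
]

posts_per_platform = Counter(r["platform"] for r in records)
accounts_per_platform = defaultdict(set)
for r in records:
    accounts_per_platform[r["platform"]].add(r["author_hash"])

for platform, n_posts in posts_per_platform.items():
    n_accounts = len(accounts_per_platform[platform])
    print(f"{platform}: {n_posts} posts from {n_accounts} accounts")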


A snapshot of the project’s key findings shows the interrelated levels of risk emanating from the online right-wing extremist milieu.

One is a creeping threat to liberal democracy. The findings show the window of acceptable political discourse shifting towards an extreme end point, a shift of the so-called Overton window, which poses an insidious threat to political norms in Australia. This environment is characterised by narratives that challenge the fundamentals of pluralist liberal democracy through exclusivist appeals to race, ethnicity and nation.

Within this right-wing extremist milieu are individuals who advocate violence as a tactic for expressing political grievances. Communities on more “high-risk” platforms engaged in increasingly extreme rhetoric, including narratives supporting violence.

The findings also showed that online right-wing extremist communities largely consist of networks of socially connected individuals who engage and share content across diverse social media platforms. Online right-wing extremism in NSW can be thought of as a loosely connected “milieu”, rather than a series of clearly defined extremist or terrorist groups. Within this wider social milieu, formal extremist groups do exist and pose a threat. For instance, despite proactive attempts by Facebook to moderate right-wing extremist groups, by 2019 such groups were steadily returning to the platform.

Although hateful and extreme, the right-wing extremist milieu is a highly social space. Social connections are created and maintained around shared values and norms, engendering positive experiences for those involved in the networks. The content often conceals its revolutionary anti-government agenda behind appeals to nationalism and “traditional” Australian values. These extremist perspectives are often presented through online content that is entertaining, provocative and supposedly ironic.

The right-wing extremist milieu makes use of multiple social media platforms that afford users different forms of social interaction. Together these create a right-wing extremist “social media ecology” that individuals move through in search of rewarding social connections. The research classified this ecology from low to high risk according to the strength of echo chambers and the level of platform moderation. Twitter, Facebook and Reddit were lower risk than Gab, 4chan and 8chan. Movement from low- to high-risk platforms exposes users to increasingly smaller communities with less access to different content or opinions, reflecting the potential for the construction of an increasingly extremist social identity.

The content analysis of extremist posts also revealed that online extremist communities in NSW take a strong interest in real-world issues in Australia and the United States, in particular American populist politics. This is highlighted in the hashtag frequency analysis, with #MAGA and #auspol among the top five hashtags in posts by Twitter and Gab users.
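
A hashtag frequency count of this kind is straightforward to reproduce in outline. The minimal Python sketch below is an assumption about how such a tally might be computed from plain-text posts; the sample posts and the regular expression are illustrative, not the study’s actual pipeline.

import re
from collections import Counter

# Minimal sketch of a hashtag frequency count over plain-text posts.
# The sample posts and the pattern are illustrative assumptions, not the study's data.
HASHTAG = re.compile(r"#\w+")

posts = [
    "Another day in #auspol ... #MAGA",
    "#MAGA rally tonight",
    "#QAnon thread #auspol",
]

# Lower-case tags so that #QAnon and #qanon are counted together.
counts = Counter(tag.lower() for post in posts for tag in HASHTAG.findall(post))

print(counts.most_common(5))
# e.g. [('#auspol', 2), ('#maga', 2), ('#qanon', 1)]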

Conspiracy theories were prevalent across the dataset, with #QAnon, for example, the fourth most popular hashtag in the Twitter data. These conspiracy theories create an alternative reality that instils in the audience a sense of unprecedented crisis and of the need for an authoritarian response.

Conspiracy theories serve to delegitimise government, to present right-wing extremists as protectors of the social good, and to cast their opponents as members of an inherently evil international cabal. At the same time, consensual facts about reality are disputed, eroding the potential for meaningful communication between right-wing extremists and others.

The theme of “white identity under threat” was observed in some form across all platforms, relying on a shared narrative that posits white identity as vulnerable and requiring robust authoritarian defence from hostile others. Irony and humour provide users with opportunities to innovate and to adapt these narratives to the specificities of particular platforms. However expressed, these narratives consistently seek to delegitimise liberal democratic government while dehumanising the “other”, in particular Muslims, Asians, Jews, women, liberals and members of the LGBT+ community.

On low-risk platforms, references to “Trumpism” and conspiracy theories such as #QAnon provide users with an opportunity to engage with content that references the defence of a “white identity under threat” without violating platform moderation policies.

On high-risk platforms, this theme is framed in far more explicit terms that consistently draw on antisemitism, Islamophobia and other forms of violent “othering”. On message boards such as 8chan, these themes are explicitly extremist and incorporate a willingness to take violent action in response.
 

This article is part of a year-long series examining extremism and technology also available at the Global Network on Extremism and Technology, of which the Lowy Institute is a core partner.

