Published daily by the Lowy Institute

Bluesky dreaming? Social media shifts signal tech power and influence

Or does X mark the spot?

The Bluesky and X, formerly known as Twitter, icons superimposed over a photo of Elon Musk (Jaap Arriens/NurPhoto via Getty Images)
Published 3 Dec 2024

Shifts in social media platforms are part of a larger tech power play and critical to understanding how information flows are challenging democracies. Algorithmic curation that contributes to or leads to radicalisation, abuse or violence is not yet a crime. It should be.

Social media platforms need to contribute positively to democratic debate and ensure a safe space for users, free from abuse, hate speech, algorithmic manipulation, propaganda and state-sponsored information operations.

The US election, in all its polarising glory, seems to have been a turning point for microblogging platforms. With X (formerly Twitter) in decline, both as a credible platform and in user numbers, disillusioned users are searching for alternatives, such as Meta’s Threads, and decentralised social media, like Mastodon and Bluesky. Decentralised media are known for improved data ownership and privacy, resistance to censorship, and community governance.

Traffic on Bluesky is up 500% in recent weeks and X account deactivations in America have quintupled. Bluesky’s audience is still much smaller than that of Threads or X. However, its user base is around 24 million, shy of the all-important 100 million mark but growing at multiple users a second. US daily active users for Bluesky have now overtaken Threads.

Bluesky is different to X, despite its 2019 start as a Twitter-funded project. It’s a protocol, not a platform, which means it works more like email or the internet itself. Bluesky CEO Jay Graber said it is “billionaire-proof”, since the company aspires to be an “open and decentralised standard for social media”, making control by any one person or company difficult. It also uses stackable moderation, giving users autonomy over the content they see and control over the algorithms that shape their feeds.

Removing ourselves – or being removed – from platforms with different views and perspectives is not the solution.

X was used globally by journalists, media outlets, politicians and intelligence agencies. It had been an invaluable source of news and current affairs, as well as real-time open-source intelligence, or “OSINT”.

Few actions have done more to make a social media platform a haven for disinformation, extremism, and authoritarian propaganda than the changes made since Elon Musk bought Twitter in 2022. The user experience has continued to decline significantly, and usage fell by a fifth between Musk’s purchase and 2024. Problems include the prevalence of mis- and disinformation, conspiracy theories, harassment and abuse, a sharp rise in hate speech, sexualised violence towards women, porn, and the platforming of Nazi and hate groups. Following Musk’s purchase, X disbanded its trust and safety teams, revoked bans on extremist and dangerous accounts, removed “state affiliation” labels (including from Russian and Chinese propaganda outlets), censored journalists critical of Musk, neutered the block feature and made basic verification (meant to reduce imposters and fake accounts) a paid subscription. Contrary to common sense, existing multi-factor authentication was rolled back, remaining available only as a paid premium service. As I wrote in 2023, this began the decline of X as a trusted platform.

What Musk has done to “Twitter” flies in the face of research and practice on how to make social media resilient to hate, harassment, and authoritarian governments’ information operations. X harms citizens of democracies and benefits autocratic governments.

Some have argued that the exodus from X signals the demise of social media. That seems unlikely. Consider the growth in digital platforms, the decline in traditional media use, the increasing use of social media to access news, the polarisation of politics and the money being invested in tech companies that run social platforms – as well as their much-sought-after influence capability.

Whether Bluesky will replace X is another question entirely. Network effects are crucial for social media success. This means platforms only become engaging and interesting once a quorum of users is active. You want to be on a social network where your friends or colleagues are. It’s not much use using a ride-sharing app which has no cars. And of course, this creates tension with competition policy.

It’s impossible to talk about X or Musk without acknowledging some big power and influence shifts. Even before Donald Trump’s win, the relationship between tech companies and the US government had been transforming, as Marietje Schaake wrote in The Tech Coup: How to Save Democracy from Silicon Valley. Musk has been vocal in support of right-wing issues and Trump. Early research findings showed algorithmic bias in engagement metrics favouring Musk’s personal X account and those of some Republican-leaning accounts.

It’s tempting to leave X completely. While there are legitimate concerns about how it is being run – and no one should have to suffer illegal behaviour or abuse – leaving platforms you disagree with, especially en masse, creates a vacuum and reduces the potential for social cohesion. As we saw with the way podcasting played out in the US election, withdrawing alternative perspectives from platforms does not result in much-needed constructive democratic debate.

Some warn Trump’s election win will merge tech and state power, where the interests of select technology companies become indistinguishable from US government policy. For democratic nations outside America, this power shift threatens their ability to govern digital spaces. This merger of state and tech power isn’t just another shift in the landscape. It’s a fundamental challenge to democratic governance in the digital age. The work of protecting democratic values in this new landscape has only just begun.

Removing ourselves (or being removed) from platforms with different views and perspectives is not the solution. Ensuring they are safe for users and contribute positively to democratic debate is critical. We should also explore effective ways to limit the influence and interference capacity of algorithmic curation. Where it leads to or supports radicalisation, abuse or violence, it should be a crime, and enforced as such.



