Most people know TikTok as an entertaining app with an endless stream of snappy, addictive short videos that make them smile after a long day. Kids jiggle away in cute dance routines, while US comic Sarah Cooper rose to “TikTok fame” with her lip-synching impressions of US President Donald Trump. Another video, miming former Australian prime minister Julia Gillard’s famous misogyny speech, also caught fire.
Yet TikTok has also become a geopolitical football between China and several of the world’s largest democracies, most notably the United States and India. The two countries’ efforts to ban TikTok, which is owned by the Beijing-based tech multinational ByteDance, have led China to accuse them of breaking WTO rules.
TikTok’s recent strife reflects more than the world’s growing distrust toward Beijing’s global ambitions. It also reveals democracies’ uneasy relationship with a new system of social media algorithms that is potentially reshaping how we consume, navigate and interact with information.
As this success model is bound to proliferate through imitation, its implications for democratic values such as freedom of expression, accountability, justice and non-discrimination deserve more of our collective attention.
It is worth noting that TikTok is a cultural wonder in itself. Few among the team of engineers who built it have ever lived outside of China. But in just three years, the app has swept up 1 billion users across 154 countries, becoming the world’s most downloaded app in 2020 and absorbing on average nearly an hour of each user’s day. It also made ByteDance the world’s most valuable start-up, leapfrogging Uber with a whopping $75 billion valuation.
The app’s meteoric rise into the West’s cultural mainstream is a testament to the explosive success of its specialised recommendation system. All social media platforms have native recommendation systems — a set of interacting algorithms aimed at predicting what users like in order to deliver more personalised content.
In the internet age, where information is abundant but human attention is finite, the ability of a social platform’s recommendation system to retain and sustain users’ interest is directly correlated to its chance of survival in the competitive attention economy. And TikTok has clearly nailed it.
TikTok’s recommendation system is unlike anything we have seen before. Older platforms rely solely on our active online behaviours (e.g., following, friending, subscribing, liking or clicking) to gauge our preferences. But TikTok captures even our passive and subtle behavioural patterns to teach its algorithms about us in real time, as we consume videos. These patterns include how many times we let a video loop, how quickly we scroll past certain content, and whether we are drawn to a particular category of effects and sounds. This hyper-responsive recommendation system allows TikTok users to remain completely passive, if they so wish, while still arriving at an engaging, personalised content feed much faster than on other platforms.
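To make the contrast concrete, here is a minimal Python sketch of how such passive signals might be folded into a per-category interest score. The signal names, weights and update rule are illustrative assumptions, not TikTok’s actual system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ViewEvent:
    """Passive signals gathered while a user watches one video (hypothetical)."""
    category: str             # e.g. "dance", "politics", "comedy"
    watch_fraction: float     # share of the video actually watched (0.0-1.0)
    loops: int                # how many times the video was replayed
    scrolled_past_fast: bool  # the user flicked away almost immediately

def update_profile(profile: dict, event: ViewEvent, learning_rate: float = 0.1) -> None:
    """Nudge the user's interest score for a category after a single view.

    No likes, follows or comments are required: the score moves on purely
    passive behaviour (watch time, loops, quick scroll-aways).
    """
    engagement = event.watch_fraction + 0.5 * event.loops
    if event.scrolled_past_fast:
        engagement -= 1.0
    profile[event.category] += learning_rate * engagement

def rank_categories(profile: dict, top_k: int = 3) -> list:
    """Categories the recommender would now favour for this user."""
    return sorted(profile, key=profile.get, reverse=True)[:top_k]

profile = defaultdict(float)
update_profile(profile, ViewEvent("dance", watch_fraction=1.0, loops=2, scrolled_past_fast=False))
update_profile(profile, ViewEvent("politics", watch_fraction=0.1, loops=0, scrolled_past_fast=True))
print(rank_categories(profile))  # ['dance', 'politics'] - "dance" already leads after two views
```

Even in this toy form, the profile sharpens after every video watched, without the user ever lifting a finger.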
The fact that the platform no longer relies on people’s social networks to recommend engaging content makes TikTok extremely accessible for both content consumers and creators. For one, it relieves people of the performative social burden of accumulating friends, cultivating a follower base and building an audience. Once it enters TikTok’s recommendation engine, a video from someone with no followers can still go viral, as long as it is engaging enough. So far, the platform has spawned an entire entertainment industry of young TikTokers who rose to fame overnight.
As such, the rise of TikTok has in many ways redefined social media’s role in today’s information ecosystem. It no longer has to aim at connecting more individuals. Instead, it can sidestep social networks altogether by matching individuals directly with their interests – the more efficiently it does so, the more it is rewarded by the global attention economy.
But as TikTok’s popularity soars, the “anything goes” platform increasingly hosts heavier content, such as personal traumas and politics. It is worth recalling Facebook’s path from a tiny dating platform among college kids to a tech behemoth at the centre of global politics and (mis)information.
Before TikTok – or any of its future embodiments – gets there, we should start discussing how its peculiar platform model is reshaping the way we consume and engage with information, and the potential challenges (or opportunities) it raises for public dialogue and democratic norms.
One potential challenge is an intensified “filter bubble” effect. The filter bubble describes a scenario in which social media reinforces our existing beliefs rather than exposing us to new information that we may dislike. While the filter bubble is a contested phenomenon, and one not exclusive to online environments, TikTok’s hyper-efficient recommendation system could be more prone than traditional platforms to shrinking the diversity of the content users consume.
By aiming content directly at groups and subcultures that share the same tastes, TikTok de-prioritises the human network that underlies other platforms, and with it the organic diversity such networks provide. This platform model could also intensify political polarisation: one experiment showed that TikTok’s politically neutral, fun-loving feed can turn conservative and far-right after just a day of looping, liking and sharing certain content.
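The dynamic is easy to reproduce in miniature. The toy simulation below, with entirely made-up numbers and a deliberately crude “always serve the top-scoring category” rule, illustrates how engagement-only optimisation can narrow a feed; it is a sketch of the filter-bubble mechanism, not a model of TikTok’s algorithm.

```python
import random
from collections import Counter, defaultdict

random.seed(42)
CATEGORIES = ["comedy", "dance", "politics", "news", "pets"]

# Hypothetical user: mildly prefers politics, is lukewarm about everything else.
true_affinity = {c: 0.5 for c in CATEGORIES}
true_affinity["politics"] = 0.7

scores = defaultdict(lambda: 1.0)  # the recommender's belief about the user
served = Counter()

for _ in range(500):
    # Greedy engagement maximisation: always serve the currently top-scoring category.
    choice = max(CATEGORIES, key=lambda c: scores[c])
    served[choice] += 1
    # Passive feedback only: did the user watch the video through?
    watched = random.random() < true_affinity[choice]
    scores[choice] += 0.1 if watched else -0.1

print(served.most_common())
# With purely greedy serving, whichever category pulls ahead early keeps being
# served, while some categories may barely (or never) appear in the feed.
```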
ByteDance claims it is aware of these issues and has vowed to work on diversifying its recommendations. However, it is a highly difficult engineering problem to diversify content to a meaningful extent while maintaining the precise content targeting that underlies the platform’s strong user experience and popularity.
Another potential issue resides in the platform’s centralised approach to content control. Born in the market-driven environment of Western democracies, Twitter and Facebook are reluctant and reactive content moderators at best. In contrast, TikTok starts its recommendation engine with content moderation. Before any new upload ever goes public, a team of both human and algorithmic moderators tags the content based on various categories, including sounds, hashtags and captions. This is also when certain content is quietly taken down, without being flagged to its creator or publicly tagged.
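As a rough illustration of that ordering, moderation before distribution, the sketch below tags and screens every new upload before it becomes eligible for recommendation. The categories, blocklist and suppression flag are hypothetical stand-ins, not TikTok’s actual pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class Upload:
    video_id: str
    caption: str
    hashtags: list
    sound: str
    tags: set = field(default_factory=set)
    visible: bool = False     # nothing goes public until moderation has run
    suppressed: bool = False  # quietly excluded from recommendation

# Hypothetical keyword rules standing in for the human and algorithmic moderators.
BLOCKLIST = {"example-banned-topic"}

def moderate(upload: Upload) -> Upload:
    """Tag and screen an upload before it ever enters the recommendation pool."""
    # Step 1: tag by caption words, hashtags and sound so the recommender can match it.
    upload.tags.update(word.strip("#").lower() for word in upload.caption.split())
    upload.tags.update(tag.lower() for tag in upload.hashtags)
    upload.tags.add(f"sound:{upload.sound}")

    # Step 2: screen. A suppressed video is never surfaced, and the creator is
    # not notified - mirroring the quiet takedowns described above.
    if upload.tags & BLOCKLIST:
        upload.suppressed = True
    else:
        upload.visible = True
    return upload

def recommendation_pool(uploads: list) -> list:
    """Only visible, non-suppressed videos are ever candidates for anyone's feed."""
    return [u for u in uploads if u.visible and not u.suppressed]
```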
Such a heavy-handed and secretive approach to content control sets it apart from traditional platforms and has invited public scrutiny. It was only after a leak of internal materials that the public found out that political content, including references to Black Lives Matter protests and Tiananmen Square, was downrated or hidden by the recommendation system. Videos from creators who are disabled or deemed “ugly” were similarly suppressed. The controversy intensified when the TikTok account of a teenager, Feroza Aziz, was suspended after she tried to spread awareness of Xinjiang’s Uighur internment camps.

These content censorship scandals have led many to link TikTok with the Chinese Communist Party itself. That association did not help TikTok’s case when countries including India, the United Kingdom and Australia raised suspicions of foreign interference, or when Trump declared the platform a national security concern, leaving TikTok facing an imminent shutdown in one of its biggest user markets.
However, it is worth asking whether these challenges to the principles of free speech and non-discrimination can be easily fixed even if the platform, together with its new mode of content recommendation, is taken over by an entity native to a liberal democracy.
TikTok’s role as a centralised, assertive information dispenser goes hand in hand with its users’ role as passive information receivers, and most of them seem perfectly happy to stay that way. This means that TikTok’s challenges to democratic norms may be rooted in the very platform model that explains its success: a content community of individuals who may not care why they are seeing what they see, or what has been left out.
The Chinese start-up, which survived not only the cut-throat competition of China’s tech scene but also the state’s censorship regime, has simply found a way to please the most people.

But at what cost? Perhaps digital social networks and inclusive public dialogue will be the first things to recede from the social media landscape, as more and more Western companies, including Facebook, compete to copy TikTok’s model.
Social media has been under the spotlight for its contribution to the degradation of democratic values. But TikTok is structurally different from other social media platforms in ways that deserve our attention. Regardless of who ends up owning TikTok in the US, it is time to examine how TikTok interacts with information in democracies, and how we might steer the platform’s algorithmic power towards positive social change rather than the decay of democratic norms.