From Delhi to Washington to Canberra, the future of the digital economy may be heavily influenced by how one question is answered:

What to do about TikTok?

The popular short-video platform owned by Beijing-based parent company ByteDance has been at the centre of a storm of controversy. Concern over its ties with Chinese authorities has already led to its removal from India’s app stores, along with 58 other Chinese-backed apps. Similar concerns have driven scrutiny in the US, Australia and other nations that have seen their relations with China deteriorate in recent years.

Yet the company seems determined to maintain its global presence, hiring a former Disney executive as its CEO and exploring a plethora of ways in which TikTok could detach itself either operationally or organisationally from ByteDance. There has also been public pushback against the prospect of a ban, including accusations of anti-China xenophobia.

The debate has been a frustrating one – in part because there seems to be no consensus on what, exactly, the debate is about.

In essence, the question of what to do about TikTok is about at least three different things: data, content, and the norms and expectations of bilateral trade in the digital economy:

Data, privacy and security

Perhaps the most straightforward cause for suspicion of TikTok is the issue of data privacy. Indeed, TikTok has a track record of playing fast and loose with the data it collects from its users, including, until recently, clipboard data and data from children who use the app.

Sloppy and invasive data practices, however, seem to provide insufficient cause for an outright ban of the app. TikTok was far from alone in its clipboard-snooping practices, and data-management transgressions are commonplace among internet companies wherever they are based.

What is distinct about TikTok’s data dilemma is its connection to Beijing. ByteDance is required to cooperate with state intelligence-gathering activities, and regardless of which law says what, the nature of China’s authoritarian system does not provide many pathways for resisting its authorities.

The company maintains that its US user data is stored in the US, with a backup in Singapore. Regarding its data-sharing relationship with Beijing, company spokespeople have repeatedly denied handing over user data to the Chinese government, and claim they would not do so if asked. True or not, such assurances are insufficient on their own. After all, it is not generally considered prudent to base national security policy on a company's promises.

Algorithms and content curation

Perhaps less discussed than data management is the way in which content is disseminated on TikTok, which lacks transparency and has a high potential for misuse.

Machine learning–enabled recommendation systems are not uncommon on social media platforms and have received ample criticism. The perceived radicalising effect of YouTube's "rabbit holes", for example, has been a subject of attention for researchers and regulators alike.

Yet while many media platforms include recommendation features, TikTok and similar short-video apps are distinctive in that they place the recommendation function at the core of the user experience. Upon opening the app, the user is immediately shown a continuous stream of videos. In contrast to platforms like Facebook or Instagram, which are built around a user's social network, or YouTube, which requires the user to click and open a video, TikTok removes as much user decision-making friction as possible. The recommendation engine does the work, while the user's likes and swipes provide insights into which types of videos will keep them most engaged.

This approach has been one of the biggest reasons for TikTok's success. Yet it also means that TikTok plays a highly interventionist curatorial role in its content. On the spectrum between "social network" and "media company", TikTok sits far closer to "media company" than, say, Twitter or Facebook.

Like a media company, it promotes and recommends certain types of content. Also like a media company, it censors and suppresses other types. TikTok has reportedly instructed its moderators to “suppress posts created by users deemed too ugly, poor or disabled” for the platform. In an incident which raised suspicion that it would censor content based on the standards of the Chinese Communist Party, the company faced a media firestorm last year for banning the account of a US-based teenager who had posted videos critical of China’s treatment of Uighurs. Although the ban was reversed, the company’s explanation for it did little to allay suspicion of its motives. A 2019 internal document leaked to The Guardian also instructed TikTok moderators to censor certain content considered sensitive by Beijing.

It is the opacity and uncertainty of TikTok's content moderation and recommendation that make it so tricky to evaluate as a potential security threat. Given the benefit of the doubt, many of TikTok's moderation misdeeds might be considered genuine mistakes (as the company has claimed), understandable for a fast-growing internet company. But it is nearly impossible to know for sure.

A partial analogy may be television networks in the 1980s and 1990s, when a handful of influential outlets made programming decisions that determined what millions of viewers would watch each night. In our current digital environment, a handful of tech platforms wield that kind of influence. But instead of appearing on living room TVs, the content is displayed on smartphones, and rather than each network airing the same programming to all viewers, each platform curates content for each individual user via an algorithm.

In the case of TikTok, the analogy is further complicated because the algorithm determining what content its users see is developed in Beijing, and resides under the ultimate authority of a government currently adversarial to the countries that are home to TikTok's largest and most valuable international markets.

The internet as a trade issue

Much of the debate over a potential TikTok ban has centred on the notion of banning an app based on its country of origin. For those opposed, it would undermine the ideal of a free and open internet, and represent another step in the formation of a Balkanised global "splinternet". Yet others are quick to point out that the walling-off of the global internet began decades ago, as Beijing built its Great Firewall and advocated for a norm of "cyber sovereignty".

While both arguments have their validity, the central question that remains is this: for governments with freer and more open cyber policies, what is the appropriate approach once Chinese firms cross the Great Firewall?

China’s internet regulations – although initially driven by the perceived need to control the internet within its borders – have created a series of barriers that have actively placed most foreign internet firms at a disadvantage, making room for the growth of domestic tech titans like Alibaba, Tencent and ByteDance. As these firms have become international, they have benefitted from the scale and resources provided by a large and closed domestic market, while also entering the more open global digital environment.

In other words, China’s tech firms are playing a poker game in which everyone else’s cards lie face-up on the table, while they continue to hold theirs close to their vest. In such a situation, all other players at the table have little choice but to hide their own cards as well.

For India, China's promotion of the cyber sovereignty notion provides a clear precedent for fostering its own tech giants. For more developed nations, it is a crucial moment to examine whether the notion of "cyber sovereignty" can be applied while maintaining a digital sphere that is liberal, open and democratic, and what role, if any, Chinese firms can play in it.

Until that question is answered, TikTok’s future is likely to remain tenuous.