The Digital Services Act (DSA) is Europe’s flagship rulebook for content on social networks, setting restrictions on what can be published. Where U.S. firms are largely shielded from liability for illegal content posted on their websites, firms operating in Europe have since 2022 been responsible for numerous forms of ‘harmful’ content and face sanctions of up to 6% of global revenue.
Under the DSA, U.S.-style restrictions on TikTok are almost impossible. While a “Digital Services Coordinator” can request a "temporary restriction of access" through courts (Art 51.3), this power only kicks in after all other options are exhausted. Even then, it requires proving a criminal offense involving a "threat to life or safety of persons." A Chinese or Russian platform controlling content for 150 million Europeans raises no special flags under the DSA framework.
Regulators can raid company offices (Art 69.2), demand algorithm audits (Art 40.3), and impose fines of up to 6% of global revenue (Art 52.3). However, they cannot address platforms controlled by foreign powers without meeting the high criminal threshold. Even the "crisis protocols" focus on narrow emergencies (Art 36). Neither national security concerns nor foreign ownership risks — obvious triggers for platform bans — are covered. The DSA prioritizes content violations, not strategic threats.
Perhaps that is fine — one law does not have to do everything. But as of right now, Brussels’ drive to regulate social media places greater weight on a billionaire’s posts than on a rival state’s control of a platform.
There are many bad reasons for wanting to interfere with TikTok. Critics of the U.S. divestiture, such as Tyler Cowen, have pointed out that if there are large-scale privacy concerns, the right approach is to investigate, provide evidence, and treat TikTok as we would any other company violating user privacy.
Similarly, concerns about user harm — such as Albania's planned ban following a TikTok-related death, or the EU's investigation into features promoting underage addiction — should apply equally to YouTube Shorts, Instagram Reels, LinkedIn, and every other algorithmic short-form video platform.
Instead, the argument for restriction stems from a straightforward strategic calculation: a state the EU classifies as a ‘systemic rival’ should probably not control the continent's most influential media property.
The cost-benefit case in favor of restriction rests on the following:
i. Control of TikTok allows China to deploy large volumes of propaganda in its favor at a time of its choosing.
ii. While major escalation between Europe or its allies and China is not probable, the potential damage is high.
iii. The social media market is robust, with many established firms providing similar services and no legal restrictions on new entrants, so the infringement on liberties is limited.
iv. Given China’s extensive restrictions on Western tech firms and minimal response when India adopted a complete ban in 2020, significant backlash is unlikely.
Restricting TikTok is a question of preemption. There is evidence that TikTok may already be biasing its algorithm toward pro-PRC content. If it is, the effect is subtle — videos criticizing Xi Jinping or supporting Taiwanese nationalism remain freely available on the platform. But the question is not what Beijing is already doing; it is whether Europeans should be willing to grant it the potential to do it at all.
Liberals fear the slippery slope — the possibility that this becomes another tool for overregulation. Indeed, X is controlled by a foreign owner who seems set on using that platform to directly interfere with European politics and is almost certainly altering its algorithm to make that possible. If the EU restricts TikTok, why not X?
There is a principle that can be applied: restricting media platforms controlled by countries of concern. The EU previously prohibited Russia Today and Sputnik. While TikTok is nominally based in Singapore, owner ByteDance is headquartered in China. Multiple European states (including France, Austria, and Spain) already limit foreign control of traditional media, particularly television. The right approach here would be to specifically target ownership of media properties by certain countries.
The strongest arguments against interfering with TikTok are those that simply reject the premise that China is a threat. Many bright Europeans disagree with the prevailing hawkishness in D.C. But that is not the position of most national governments in Europe today: on paper, they acknowledge that China is a ‘systemic rival’ and ‘competitor’. There is at least a nominal belief that China is indeed a threat to Europe. If that belief is sincere, it implies action.
Across Europe, the political extremes are surging in support. Even more than on X, their presence is particularly vibrant on TikTok. To be clear: I do not think that political discontent is being driven by the TikTok algorithm. But it is remarkable that European elites, who have been extraordinarily focused on the potential influence of Russian disinformation and are obsessed with Musk, seem unconcerned by content reaching audiences many orders of magnitude larger on a platform directly controlled by a ‘systemic rival’.
Indeed, one of the two DSA investigations into TikTok currently under way concerns disinformation affecting the Romanian elections (the other concerns addictive design). In this case, the Commission sees a significant threat in 25,000 Russian bots, yet remains blind to Beijing's grip on the platform itself.
In an earlier post, we introduced the concept of luxury rules — feel-good policies that Europe could afford to pursue thanks to unprecedented prosperity and peace. Being entirely unconcerned by TikTok is a continuing example of the complacency that Brussels elites forswore in the aftermath of the Russian invasion of Ukraine.
Scope sensitivity — a core principle in good decision-making — holds that our response to threats should scale with their magnitude. Yet in platform priorities, Europe shows a peculiar insensitivity. Politicians are reacting with greater vigor to a single individual’s tweets reaching millions than a state-controlled algorithm reaching hundreds of millions. They investigate 25,000 Russian bots with more intensity than the platform that commands 50 billion hours of European attention each year. This numerical blindness matters.
That blindness was briefly lifted in an unlikely corner of Europe — the French overseas territory of New Caledonia. After violent clashes left five dead, French authorities simply switched the platform off across the territory. Their stated reason was direct: "interference from foreign countries." Priorities become clear when faced with upheaval. For now, that clarity remains confined to a distant Pacific island. In Brussels, regulators are still counting Russian bots and Musk posts.