Answer given by Executive Vice-President Virkkunen on behalf of the European Commission
7.1.2025
The Commission is aware of the developments and is in close contact with TikTok to discuss the implications of these changes in light of its obligations under the Digital Services Act (DSA)[1].
The DSA does not prescribe any specific rules on the resources to be dedicated to content moderation. However, the DSA does require online platforms, including TikTok, to enforce their content moderation rules in a diligent, objective and proportionate manner.
In addition, the DSA requires platforms to have effective internal complaint-handling systems and grants users the right to challenge content moderation decisions.
While automated moderation is allowed, online platforms must be transparent about its use and accuracy. In addition, online platforms should ensure that qualified staff oversee the decision-making of their internal complaint-handling systems, so that decisions are fair and unbiased.
The DSA also requires very large online platforms and very large online search engines to diligently mitigate any systemic risks stemming from the design, functioning or use, including misuse, of their services.
To that end, it is important that designated companies put in place adequate content moderation processes and dedicate sufficient resources to diligent content moderation.
The Commission is closely monitoring TikTok’s compliance with the DSA and will follow up with formal enforcement steps if appropriate.
- [1] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).