Lay-off of TikTok’s entire content moderation team in the Netherlands and the DSA
6.11.2024
Question for written answer E-002454/2024
to the Commission
Rule 144
Kim Van Sparrentak (Verts/ALE)
In September 2024, TikTok fired its entire moderation team of 300 moderators in the Netherlands. This is extremely worrying for online safety.
1. Considering there are 5.7 million active users a month in the Netherlands alone, does the Commission believe that very large online platforms, such as TikTok, can comply with the obligations surrounding notice and action in Article 16 of the Digital Services Act (DSA) and with Article 20 of the DSA on handling internal complaints, which obliges platforms to take ‘reasoned decisions’ ‘under the supervision of appropriately qualified staff, and not solely on the basis of automated means’, without dedicated staff in Member States who speak the language, understand the context and can assess what is illegal content or disinformation?
2. Does the Commission believe that it is possible to effectively comply with the obligations under Articles 34 and 35 of the DSA to identify and analyse systemic risks, ‘taking into account specific regional or linguistic aspects, including when specific to a Member State’, and to effectively mitigate them, for platforms like TikTok, without any content moderation staff in the Netherlands?
3. Does the Commission agree that adequate local staffing and human oversight are needed in practice to comply with Articles 16, 20 and 34, especially for very large online platforms?
Submitted: 6.11.2024
Last updated: 14 November 2024