The practice of shadow-banning content on social media platforms
20.10.2023
Question for written answer E-003111/2023
to the Commission
Rule 138
Samira Rafaela (Renew)
We have been made aware of a very worrying practice whereby major social media platforms, such as Meta, X and TikTok, secretly hide certain posts and content that they appear to deem incompatible with their own views. This practice of shadow-banning has been used in particular to hide posts and content addressing the humanitarian crisis in Gaza[1], thereby suppressing the voices of a particular community.
One of the main goals of the Digital Services Act is to create a safer digital space in which the fundamental rights of all users of digital services are protected. I would therefore like to ask the Commission the following questions:
1. Is the Commission aware of this worrying shift from content moderation to visibility reduction, which poses a real threat to freedom of speech, and how will it work on criteria to draw a clear distinction between the two?
2. What range of techniques can the Commission develop in the future to prevent shadow-banning and other methods that reduce the visibility of certain platform users, even where those users have not breached the rules of conduct?
3. Considering the fast-paced development of the conflict and its impact on the online dissemination of content, can the Commission reply to these questions in a timely manner, i.e. within five days?
Submitted: 20.10.2023
[1] https://www.theguardian.com/technology/2023/oct/18/instagram-palestine-posts-censorship-accusations