Find out how the EU wants to address harmful or illegal content online while protecting freedom of expression.
On 5 July 2022, Parliament approved the Digital Services Act to shape the rapidly developing digital economy at EU level and set standards for the rest of the world. One of the fundamental issues that MEPs wanted it to address is protecting users against harmful or illegal content.
Removing illegal content while safeguarding rights and freedoms
According to MEPs, voluntary action by platforms was not enough. They wanted clear, EU-wide rules for content moderation, applying the so-called notice and action mechanism. The rules should ensure that the mechanism:
- is effective - users will be able to easily notify online intermediaries about potentially illegal online content so they can swiftly remove it
- is not abused - if content is flagged or taken down, the affected users will be notified and will be able to appeal the decision to a national dispute settlement body
- respects users’ rights and freedoms, such as freedom of expression and information, so that online intermediaries remove illegal content in a diligent, proportionate and non-discriminatory manner and do not remove content that is not illegal
If users contest a platform's decision on the illegality of user-generated content, they will also have the possibility to seek judicial redress.
Illegal online content should not only be removed; where it amounts to a crime, it should also be followed up by law enforcement and the judiciary. Online platforms will be obliged to report serious crimes to the competent authority.
Ways to tackle harmful content
To address the problem of harmful content such as hate speech or disinformation, the rules increase transparency obligations for platforms, including their monetisation policies.
More choice for users over what they see online
The rules give users more control over the content they see and the possibility to opt out of content curation altogether.
They also include stricter regulation of targeted advertising, banning ads aimed at minors as well as ads based on sensitive data, such as sexual orientation, religion or ethnicity.
The Commission presented its proposals for the Digital Services Act and the Digital Markets Act on 15 December 2020. After completing negotiations with the Council, Parliament approved both acts on 5 July 2022.
The Council is expected to approve the Digital Markets Act in July and the Digital Services Act in September. For details on when the regulations will start to apply, check the press release in the links section.