- Measures to counter illegal products, services and content online, including clearly defined procedures for removals
- Stringent obligations for the biggest online platforms
- Recipients of services would have the right to seek compensation for damages
- Mandatory risk assessments and more transparency over “recommender systems” to fight harmful content and disinformation
MEPs voted for new rules to tackle illegal content, hold platforms accountable for their algorithms, and improve content moderation practices.
The Internal Market and Consumer Protection Committee adopted its position on the Digital Services Act (DSA) proposal on Tuesday, by 36 votes to 7 with 2 abstentions. The DSA will define clear responsibility and accountability rules for providers of intermediary services, in particular online platforms such as social media and marketplaces. Very large online platforms (VLOPs) will be subject to specific obligations due to the particular risks they pose in the dissemination of both illegal and harmful content.
This draft law aims to create a safer digital space in which users’ rights are protected, including through rules to tackle illegal goods, services or content online, enhance the accountability and transparency of algorithms, and deal with content moderation. Including provisions on risk assessments, risk mitigation measures, independent audits and so-called “recommender systems” (algorithms that determine what users see) in the DSA would also help to tackle harmful content (which might not be illegal) and the spread of disinformation.
Rapporteur Christel Schaldemose (S&D, DK) said: “We are now democratically reclaiming our online environment. The DSA is bringing EU tech regulation into the 21st century and it is about time. Intermediary services shape our lives - from the way we meet our significant other, where we buy our Christmas presents to how we read the news. However, the online environment’s growing influence in our lives is not only for the better: algorithms challenge our democracies by disseminating hatred and division, tech giants challenge our level playing field, and online marketplaces challenge our consumer protection standards and product safety. This has to stop. For this reason, we are building a new framework, so that what is illegal offline is also illegal online”.
Committee Chair Anna Cavazzini (Greens/EFA, DE) added: “Instead of platforms dictating the rules, the DSA will lay out how to deal with illegal content and content moderation. Additional rules for very large platforms, such as risk assessment and audits, will benefit consumers, our societies and our democracies. Today's committee vote clears the way for a vote by MEPs in January’s plenary and then the start of negotiations with Council. As one of this parliamentary term’s widest-ranging pieces of legislation on digital policy, I am happy that we found compromises that a broad majority can support."
Removing illegal content...
The DSA establishes a “notice and action” mechanism, as well as safeguards, for the removal of illegal content. Providers of hosting services should act on receipt of such a notice “without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action”, according to the text adopted by MEPs. In their position, MEPs included stronger safeguards to ensure the non-arbitrary and non-discriminatory processing of notices and respect for fundamental rights, including the freedom of expression.
MEPs say online marketplaces must be required to take specific actions to ensure that consumers can purchase safe products online, strengthening the obligation to trace traders (the “Know Your Business Customer” principle).
... and preventing the spread of harmful content by algorithms
Platforms use “recommender systems” to choose what information to promote: the next video for users to watch, the next product to buy, the next opinion or bit of news to appear at the top of a person’s social media feed. MEPs have beefed up provisions to make sure that online platforms are transparent about the way these algorithms work and to make them more accountable for the decisions they make.
VLOPs will have to carry out mandatory risk assessments and take risk mitigation measures, which should help to better deal with harmful content and disinformation. VLOPs will also have to share data with authorities and researchers, to allow scrutiny over how they work and to help better understand the evolution of online risks. The DSA complements the European Democracy Action Plan, which aims to build more resilient democracies across the EU by countering disinformation.
MEPs demanded that recipients of digital services, and organisations representing them, be able to seek redress for any damages resulting from platforms not respecting their due diligence obligations.
Further changes introduced by MEPs concern, among others:
- certain exemptions from DSA obligations for micro and small enterprises;
- online platforms should be prohibited from using deceptive or nudging techniques (“dark patterns”) to influence users’ behaviour;
- targeted advertising: the text provides for a more transparent and informed choice for all recipients of services, including information on how their data will be monetised, and better protects minors from direct marketing, profiling and behaviourally targeted advertising for commercial purposes;
- more choice on algorithm-based ranking: VLOPs should provide at least one recommender system which is not based on profiling;
- additional obligations for platforms primarily used for the dissemination of user-generated pornographic content;
- enforcement of the DSA: clarification of the role of “Digital Services Coordinators” in member states and their cooperation with the Commission.
Plenary will vote on the amended DSA proposal in the January session. The approved text will then become Parliament’s mandate for negotiations with EU governments, planned to start under the French presidency of the Council in the first half of 2022.