Digital Services Act: regulating platforms for a safer online space for users 

  • MEPs give green light to open negotiations with member states  
  • Measures to counter illegal products, services and content online, including clearly defined procedures for removing them 
  • More options for tracking-free advertising and a ban on using a minor’s data for targeted ads 
  • Recipients of services would have the right to seek compensation for damages 
  • Mandatory risk assessments and more transparency over algorithms to fight harmful content and disinformation 
Image: The DSA addresses issues that influence our daily lives, including when we interact on social media networks.

MEPs agreed a draft set of measures to tackle illegal content, to ensure platforms are held accountable for their algorithms, and to improve content moderation.

The text, approved today by Parliament with 530 votes to 78 and 80 abstentions, will be used as the mandate to negotiate with the French presidency of the Council, representing member states.

After the vote, Christel Schaldemose (S&D, DK), who is leading the Parliament’s negotiating team, said: “Today’s vote shows MEPs and EU citizens want an ambitious digital regulation fit for the future. Much has changed in the 20 years since we adopted the e-commerce directive. Online platforms have become increasingly important in our daily life, bringing new opportunities, but also new risks. It is our duty to make sure that what is illegal offline is illegal online. We need to ensure that we put in place digital rules to the benefit of consumers and citizens. Now we can enter into negotiations with the Council, and I believe we will be able to deliver on these issues”.

Removing illegal content and preventing the spread of disinformation

The Digital Services Act (DSA) proposal defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces.

The DSA establishes a “notice and action” mechanism, as well as safeguards, for the removal of illegal products, services or content online. Providers of hosting services should act on receipt of such a notice “without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action”. MEPs also included stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression.

Online marketplaces must ensure that consumers can purchase safe products online, MEPs say, and the text strengthens the obligation to trace traders (the “Know Your Business Customer” principle).

Additional obligations for very large platforms

Very large online platforms (VLOPs) will be subject to specific obligations due to the particular risks they pose regarding the dissemination of both illegal and harmful content. The DSA would help to tackle harmful content (which might not be illegal) and the spread of disinformation by including provisions on mandatory risk assessments, risk mitigation measures, independent audits and the transparency of so-called “recommender systems” (algorithms that determine what users see).

Other key points

Parliament introduced several changes to the Commission proposal, including on:

  • exempting micro and small enterprises from certain DSA obligations;

  • targeted advertising: the text provides for more transparent and informed choice for the recipients of digital services, including information on how their data will be monetised. Refusing consent shall be no more difficult or time-consuming for the recipient than giving consent. If their consent is refused or withdrawn, recipients shall be given other options to access the online platform, including “options based on tracking-free advertising”;

  • targeting or amplification techniques involving the data of minors for the purpose of displaying ads will be prohibited, as will targeting individuals on the basis of special categories of data that allow vulnerable groups to be singled out;

  • compensation: recipients of digital services and organisations representing them must be able to seek redress for any damages resulting from platforms not respecting their due diligence obligations;

  • online platforms should be prohibited from using deceptive or nudging techniques (“dark patterns”) to influence users’ behaviour;

  • more choice on algorithm-based ranking: VLOPs should provide at least one recommender system that is not based on profiling.

Further amendments approved in plenary relate to the need for providers to respect in their terms and conditions the freedom of expression and the freedom and pluralism of the media, as well as a new provision on the right to use and pay for digital services anonymously.