REPORT on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
20.12.2021 - (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD)) - ***I
Committee on the Internal Market and Consumer Protection
Rapporteur: Christel Schaldemose
Rapporteurs for the opinion (*):
Henna Virkkunen, Committee on Industry, Research and Energy
Geoffroy Didier, Committee on Legal Affairs
Patrick Breyer, Committee on Civil Liberties, Justice and Home Affairs
(*) Associated committees – Rule 57 of the Rules of Procedure
CONTENTS
- DRAFT EUROPEAN PARLIAMENT LEGISLATIVE RESOLUTION
- EXPLANATORY STATEMENT
- OPINION OF THE COMMITTEE ON INDUSTRY, RESEARCH AND ENERGY
- OPINION OF THE COMMITTEE ON LEGAL AFFAIRS
- OPINION OF THE COMMITTEE ON CIVIL LIBERTIES, JUSTICE AND HOME AFFAIRS
- OPINION OF THE COMMITTEE ON ECONOMIC AND MONETARY AFFAIRS
- OPINION OF THE COMMITTEE ON TRANSPORT AND TOURISM
- OPINION OF THE COMMITTEE ON CULTURE AND EDUCATION
- OPINION OF THE COMMITTEE ON WOMEN'S RIGHTS AND GENDER EQUALITY
- PROCEDURE – COMMITTEE RESPONSIBLE
- FINAL VOTE BY ROLL CALL IN COMMITTEE RESPONSIBLE
DRAFT EUROPEAN PARLIAMENT LEGISLATIVE RESOLUTION
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))
(Ordinary legislative procedure: first reading)
The European Parliament,
– having regard to the Commission proposal to Parliament and the Council (COM(2020)0825),
– having regard to Article 294(2) and Article 114 of the Treaty on the Functioning of the European Union, pursuant to which the Commission submitted the proposal to Parliament (C9-0418/2020),
– having regard to Article 294(3) of the Treaty on the Functioning of the European Union,
– having regard to the opinion of the European Economic and Social Committee of 27 April 2021[1],
– having regard to the opinion of the Committee of the Regions of 1 July 2021[2],
– having regard to Rule 59 of its Rules of Procedure,
– having regard to the opinions of the Committee on Industry, Research and Energy, the Committee on Legal Affairs, the Committee on Civil Liberties, Justice and Home Affairs, the Committee on Economic and Monetary Affairs, the Committee on Transport and Tourism, the Committee on Culture and Education and the Committee on Women’s Rights and Gender Equality,
– having regard to the report of the Committee on the Internal Market and Consumer Protection (A9-0356/2021),
1. Adopts its position at first reading hereinafter set out;
2. Calls on the Commission to refer the matter to Parliament again if it replaces, substantially amends or intends to substantially amend its proposal;
3. Instructs its President to forward its position to the Council, the Commission and the national parliaments.
Amendment 1
Proposal for a regulation
Recital 1
Text proposed by the Commission

(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.

Amendment

(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel and innovative ways, transforming their communication, consumption and business habits. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, for individual users, companies and for society as a whole.

__________________

25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
Amendment 2
Proposal for a regulation
Recital 2
Text proposed by the Commission

(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.

Amendment

(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services, and resulting in a fragmentation of the internal market. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, without lock-in effects, and reducing administrative burden for intermediary services, especially for micro, small and medium-sized enterprises.
Amendment 3
Proposal for a regulation
Recital 3
Text proposed by the Commission

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.

Amendment

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the rights to privacy, to protection of personal data, respect for human dignity, private and family life, the freedom of expression and information, the freedom and the pluralism of the media, and the freedom to conduct a business, a high level of consumer protection, the equality between women and men and the right to non-discrimination. Children have particular rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child (UNCRC). As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
Amendment 4
Proposal for a regulation
Recital 4
Text proposed by the Commission

(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.

Amendment

(4) In order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, protecting consumers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated, while respecting fundamental rights.
Amendment 5
Proposal for a regulation
Recital 4 a (new)
Amendment

(4a) Given the importance of digital services, it is essential that this Regulation ensures a regulatory framework which ensures full, equal and unrestricted access to intermediary services for all recipients of services, including persons with disabilities. Therefore, it is important that accessibility requirements for intermediary services, including their user interfaces, are consistent with existing Union law, such as the European Accessibility Act and the Web Accessibility Directive and that Union law is further developed, so that no one is left behind as a result of digital innovation.
Amendment 6
Proposal for a regulation
Recital 6
Text proposed by the Commission

(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.

Amendment

(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport of persons and goods, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.
Amendment 7
Proposal for a regulation
Recital 8
Text proposed by the Commission

(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.

Amendment

(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the directing of activities towards one or more Member States. The directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.

__________________

27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 8
Proposal for a regulation
Recital 9
Text proposed by the Commission

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.

Amendment

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) 2021/784 of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures. To assist Member States and service providers, the Commission should provide guidelines as to how to interpret the interaction and complementary nature between different Union legal acts and this Regulation and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. In particular, the guidelines should clarify any potential conflicts between the conditions and obligations laid down in legal acts, referred to in this Regulation, explaining which legal act should prevail.

__________________

28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.

29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 9
Proposal for a regulation
Recital 9 a (new)
Amendment

(9a) In line with Article 167(4) of the Treaty on the Functioning of the European Union, cultural aspects should be taken into account, in particular in order to respect and to promote cultural and linguistic diversity. It is essential that this Regulation contributes to protect the freedom of expression and information, media freedom and to foster media pluralism as well as cultural and linguistic diversity.
Amendment 10
Proposal for a regulation
Recital 10
Text proposed by the Commission

(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.

Amendment

(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33, Directive (EU) 2018/1972 of the European Parliament and of the Council33a, as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive (EU) 2019/882 of the European Parliament and of the Council, Regulation (EU) 2019/1020, Directive 2001/95/EC, Directive 2013/11/EC of the European Parliament and of the Council, Regulation 2017/239437a, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union or national law on working conditions.

__________________

30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).

31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).

32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.

33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.

33a Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast)

34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)

35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.

36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.

37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules

37a Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004

38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 11
Proposal for a regulation
Recital 11
Text proposed by the Commission

(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.

Amendment

(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 of the European Parliament and of the Council, which establish specific rules and procedures that should remain unaffected.
Amendment 12
Proposal for a regulation
Recital 12
Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, accessible, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept of “illegal content” should be defined appropriately and should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech, or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law, the provision of illegal services, in particular in the area of accommodation services on short-term rental platforms non-compliant with Union or national law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in conformity with Union law, including the Charter, and what the precise nature or subject matter is of the law in question.
Amendment 13
Proposal for a regulation
Recital 13
Text proposed by the Commission

(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.

Amendment

(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor or a purely ancillary feature of another service or functionality of the principal service and that feature or functionality cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature or functionality is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. For the purposes of this Regulation, cloud computing services should not be considered to be an online platform in cases where allowing the dissemination of specific content constitutes a minor or ancillary feature. Moreover, cloud computing services, when serving as infrastructure, for example, as the underlying infrastructural storage and computing services of an internet-based application or online platform, should not in itself be seen as disseminating to the public information stored or processed at the request of a recipient of an application or online platform which it hosts.
Amendment 14
Proposal for a regulation
Recital 14
Text proposed by the Commission

(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.

Amendment

(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on whom to grant access. Information exchanged using interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, is not considered to have been disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.

__________________

39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 15
Proposal for a regulation
Recital 16
Text proposed by the Commission

(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union.

Amendment

(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity, consistency, predictability, accessibility and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.
Amendment 16
Proposal for a regulation
Recital 18
Text proposed by the Commission

(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.

Amendment

(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. The mere ranking or displaying in an order, or the use of a recommender system should not, however, be deemed as having control over the information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 17
Proposal for a regulation
Recital 20
Text proposed by the Commission

(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.

Amendment

(20) Where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities, the service should be deemed not to have been provided neutrally and the provider should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 18
Proposal for a regulation
Recital 21
Text proposed by the Commission

(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.

Amendment

(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved in the content of the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 19
Proposal for a regulation
Recital 22
Text proposed by the Commission

(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.

Amendment

(22) In order to benefit from the exemption from liability for hosting services, the provider should, after having become aware of the illegal nature of the content and thus obtaining actual knowledge or awareness, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of a high level of consumer protection and of the Charter of Fundamental Rights, including the principle of freedom of expression and the right to receive and impart information and ideas without interference by public authority. The provider can obtain actual knowledge or awareness of the illegal nature of the content through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent hosting service provider to reasonably identify, assess and where appropriate act against the allegedly illegal content. As long as providers act upon obtaining actual knowledge, they should benefit from the exemptions from liability referred to in this Regulation.
Amendment 20
Proposal for a regulation
Recital 23
Text proposed by the Commission

(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.

Amendment

(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of a consumer. Such a belief may arise, for example, where the online platform allowing distance contracts with traders fails to display clearly the identity of the trader pursuant to this Regulation, or is marketing the product or service in its own name rather than using the name of the trader who will supply it, or where the provider determines the final price of the goods or services offered by the trader.
Amendment 21
Proposal for a regulation
Recital 25
Text proposed by the Commission

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.

Amendment

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, solely because they are carrying out voluntary own-initiative investigations, provided those activities are carried out in good faith and in a diligent manner and are accompanied by additional safeguards against over-removal of legal content. Providers of intermediary services should make best efforts to ensure that where automated tools are used for content moderation, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 22
Proposal for a regulation
Recital 26
Text proposed by the Commission

(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.

Amendment

(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed and open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the specific provider that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers should act where they are in the best place to do so.
Amendment 23
Proposal for a regulation
Recital 27
Text proposed by the Commission

(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.

Amendment

(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be and among others, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, Virtual Private Networks, cloud infrastructure services, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 24
Proposal for a regulation
Recital 27 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(27a) A single webpage or website may include elements that qualify differently between ‘mere conduit’, ‘caching’ or hosting services, and the rules for exemptions from liability should apply to each of them accordingly. For example, a search engine could act solely as a ‘caching’ service as to information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service. |
Amendment 25
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Providers of intermediary services should not be subject to a monitoring obligation, neither de jure nor de facto, with respect to obligations of a general nature. This does not concern specific and properly identified monitoring obligations in a specific case, where set out in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implement Union legal acts, in accordance with the conditions established in this Regulation and other Union law considered as lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, Member States should not prevent providers of intermediary services from providing end-to-end encrypted services. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third-party access. Furthermore, to ensure effective digital privacy, Member States should not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. |
Amendment 26
Proposal for a regulation
Recital 29
|
|
Text proposed by the Commission |
Amendment |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the Charter, on the basis of which such orders are issued differ considerably, and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders. |
Amendment 27
Proposal for a regulation
Recital 30
|
|
Text proposed by the Commission |
Amendment |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the Charter and in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
Amendment 28
Proposal for a regulation
Recital 31
|
|
Text proposed by the Commission |
Amendment |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law in conformity with Union law, including Directive 2000/31/EC and the Charter, enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. Exceptionally, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
Amendment 29
Proposal for a regulation
Recital 32
|
|
Text proposed by the Commission |
Amendment |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. Member States should ensure full implementation of the Union legal framework on confidentiality of communications and online privacy, as well as on protection of natural persons with regard to the processing of personal data enshrined in Directive (EU) 2016/680. In particular, Member States should respect the rights of individuals and journalists and refrain from seeking information which could harm media freedom or freedom of expression. |
Amendment 30
Proposal for a regulation
Recital 33
|
|
Text proposed by the Commission |
Amendment |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, as defined in Union or national law in compliance with Union law, respectively, where they are addressed to providers of intermediary services established in another Member State, they should not in principle restrict those providers’ freedom to provide their services across borders. The competent authority should transmit the orders to act against illegal content and to provide information directly to the relevant addressee by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement should notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) No 910/2014 of the European Parliament and of the Council. This Regulation should be without prejudice to the rules on the mutual recognition and enforcement of judgments, namely as regards the right to refuse recognition and enforcement of an order to act against illegal content, in particular where such an order is contrary to the public policy in the Member State where recognition or enforcement is sought. |
Amendment 31
Proposal for a regulation
Recital 33 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(33a) This Regulation should not prevent the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law in conformity with Union law, from issuing an order to restore content, where such content was in compliance with the terms and conditions of the intermediary service provider but was erroneously considered illegal by the service provider and removed. |
Amendment 32
Proposal for a regulation
Recital 33 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(33b) To ensure the effective implementation of this Regulation, orders to act against illegal content and to provide information should comply with Union law, including with the Charter. The Commission should provide an effective response to breaches of Union law through infringement proceedings. |
Amendment 33
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective, predictable and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as a high level of consumer protection, the safety and trust of the recipients of the service, including minors and vulnerable users, the protection of relevant fundamental rights enshrined in the Charter, the meaningful accountability of those providers and the empowerment of recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
Amendment 34
Proposal for a regulation
Recital 35
|
|
Text proposed by the Commission |
Amendment |
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online. |
(35) In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation in relation to those services. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online. |
Amendment 35
Proposal for a regulation
Recital 36
|
|
Text proposed by the Commission |
Amendment |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to designate a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. Such information should be notified to the Digital Services Coordinator in the Member State of establishment. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. It should be possible for this point of contact to be the same as the point of contact required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
Amendment 36
Proposal for a regulation
Recital 36 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(36a) Providers of intermediary services should also be required to designate a single point of contact for recipients of services, which allows rapid, direct and efficient communication, in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a user is communicating with a chatbot. To facilitate rapid, direct and efficient communication, recipients of services should not be faced with lengthy phone menus or hidden contact information. In particular, phone menus should always include the option to speak to a human. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. This requirement should not affect the internal organisation of providers of intermediary services, including the ability to use third-party services to provide this communication system, such as external service providers and call centres. |
Amendment 37
Proposal for a regulation
Recital 37
|
|
Text proposed by the Commission |
Amendment |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. It should be possible for a legal representative to be mandated by more than one provider of intermediary services, in accordance with national law, provided that such providers qualify as micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC. |
Amendment 38
Proposal for a regulation
Recital 38
|
|
Text proposed by the Commission |
Amendment |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of protecting fundamental rights, in particular freedom of expression and of information, transparency, the protection of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are drafted in clear and unambiguous language in line with applicable Union and national law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available, using, where appropriate, graphical elements such as icons. |
Amendment 39
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should draw up an annual report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40 which do not also qualify as very large online platforms. |
__________________ |
__________________ |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 40
Proposal for a regulation
Recital 39 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(39a) Recipients of a service should be able to make free, autonomous and informed decisions or choices when using a service, and providers of intermediary services should not use any means, including via their interface, to distort or impair that decision-making. In particular, recipients of the service should be empowered to make such decisions, inter alia, regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, certain practices typically exploit cognitive biases and prompt recipients of the service to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark patterns’). This should include, but should not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more visual prominence to a consent option, repetitively requesting or urging the recipient to make a decision, or making the procedure of cancelling a service significantly more cumbersome than signing up to it. However, rules preventing dark patterns should not be understood as preventing providers from interacting directly with users and from offering new or additional services to them. In particular, it should be possible to approach a user again within a reasonable time, even if the user has denied consent for specific data processing purposes, in accordance with Regulation (EU) 2016/679. The Commission should be empowered to adopt a delegated act to define practices that could be considered dark patterns. |
Amendment 41
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can establish that the content in question is clearly illegal without additional legal or factual examination of the information indicated in the notice and remove or disable access to that content ('action'). Such mechanisms should include a clearly identifiable reporting mechanism, located close to the content in question, allowing the quick and easy notification of items of information considered to be illegal content under Union or national law. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. While individuals should always be able to submit notices anonymously, such notices should not give rise to actual knowledge, except in the case of information considered to involve one of the offences referred to in Directive 2011/93/EU. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
Amendment 42
Proposal for a regulation
Recital 40 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(40a) Nevertheless, notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content. Hosting service providers that receive such notices without being that actor should redirect them to the particular online platform concerned and inform the Digital Services Coordinator. |
Amendment 43
Proposal for a regulation
Recital 40 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(40b) Moreover, hosting providers should seek to act only against the items of information notified. Where the removal or disabling of access to individual items of information is technically or operationally impossible for legal or technological reasons, such as with encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. |
Amendment 44
Proposal for a regulation
Recital 41
|
|
Text proposed by the Commission |
Amendment |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent, objective, non-arbitrary and non-discriminatory processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
Amendment 45
Proposal for a regulation
Recital 41 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(41a) Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should, without undue delay after deciding whether or not to act upon the notice, inform the individual or entity that notified the specific content of its decision. |
Amendment 46
Proposal for a regulation
Recital 42
|
|
Text proposed by the Commission |
Amendment |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. |
(42) Where a hosting service provider decides to remove, disable access to, demote or impose other measures with regard to information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and accurate, that provider should inform the recipient, in a clear and user-friendly manner, of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The obligation should however not apply in a number of situations, namely where the content is deceptive or part of a high volume of commercial content, or where a judicial or law enforcement authority, due to an ongoing criminal investigation, has requested that the recipient not be informed until that investigation is closed. Where a provider of hosting services does not have the information necessary to inform the recipient by a durable medium, it should not be required to do so. |
Amendment 47
Proposal for a regulation
Recital 42 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(42a) A provider of hosting services may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the provider of hosting services is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of a person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council1. In such instances, the provider of hosting services should inform without delay the competent law enforcement authorities of such suspicion, providing, upon their request, all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, and, unless instructed otherwise, should remove or disable the content. The information notified by the hosting service provider should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by providers of hosting services. Providers of hosting services should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. In order to facilitate the notification of suspicions of criminal offences, Member States should notify the list of the competent law enforcement or judicial authorities to the Commission. |
|
__________________ |
|
1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
Amendment 48
Proposal for a regulation
Recital 43 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(43a) Similarly, in order to ensure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, Section 3, in whole or in part, to those providers of intermediary services that are not-for-profit, or are medium-sized enterprises, but do not present any systemic risk related to illegal content and have limited exposure to illegal content. The providers should present justified reasons for why they should be issued a waiver and send their application first to their Digital Services Coordinator of establishment for a preliminary assessment. The Commission should examine such an application taking into account the preliminary assessment carried out by the Digital Services Coordinator of establishment. The preliminary assessment should be sent together with the application to the Commission. The Commission should monitor the application of the waiver and have the right to revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions. |
Amendment 49
Proposal for a regulation
Recital 44
|
|
Text proposed by the Commission |
Amendment |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. This should include decisions of online platforms allowing consumers to conclude distance contracts with traders to suspend the provision of their services to traders. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory, non-arbitrary and fair outcomes within ten working days starting on the date on which the online platform received the complaint. In addition, provision should be made for the possibility of engaging, in good faith, in the out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
Amendment 50
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority and expeditiously, taking into account due process and without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in an objective manner. Such trusted flagger status should only be awarded, for a period of two years, to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a transparent funding structure. The Digital Services Coordinator should be allowed to renew the status where the trusted flagger concerned continues to meet the requirements of this Regulation. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations, consumer organisations, and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with Article 14. Those reports should indicate information such as notices categorised by the identity of the provider of hosting services, the type of content notified, the legal provisions allegedly breached by the content in question, and the action taken by the provider. The reports should also include information about any potential conflict of interest and sources of funding, as well as the procedure put in place by the trusted flagger to retain its independence. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and respect exceptions and limitations to intellectual property rights. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 In order to avoid abuses of the status of trusted flagger, it should be possible to suspend such status when a Digital Services Coordinator of establishment has opened an investigation based on legitimate reasons. The suspension should not be longer than the time needed to conduct the investigation, and the trusted flagger status should be maintained if the Digital Services Coordinator of establishment concludes that the entity in question can still be considered a trusted flagger. |
__________________ |
__________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 51
Proposal for a regulation
Recital 46 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(46a) The strict application of universal design to all new technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity. It is essential to ensure that providers of online platforms which offer services in the Union design and provide those services in accordance with the accessibility requirements set out in Directive (EU) 2019/882. In particular, providers of online platforms should ensure that the information provided, the forms provided and the procedures that are in place are made available in such a manner that they are easy to find, easy to understand and accessible to persons with disabilities. |
Amendment 52
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms, and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by frequently providing illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. The misuse of services of online platforms could be established with regard to frequently provided illegal content where it is evident that that content is illegal without conducting a detailed legal or factual analysis. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should be entitled to temporarily or, in a limited number of situations, permanently suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms, and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 53
Proposal for a regulation
Recital 48
|
|
Text proposed by the Commission |
Amendment |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
deleted |
___________ |
|
1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
|
Amendment 54
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms that allow consumers to conclude distance contracts with traders should obtain additional information on the trader and the products and services it intends to offer on the platform. The online platform should therefore be required to obtain information on the name, telephone number and electronic mail address of the economic operator and the type of product or service the trader intends to offer on the online platform. Prior to offering its services to the trader, the online platform operator should make best efforts to assess whether the information provided by the trader is reliable. In addition, the platform should take adequate measures, such as, where applicable, random checks, to identify and prevent illegal content from appearing on its interface. The fulfilment of the obligations on traceability of the traders, products and services should facilitate the compliance by platforms allowing consumers to conclude distance contracts with the obligation to inform consumers of the identity of their contracting party established under Directive 2011/83/EU of the European Parliament and of the Council, as well as with the obligations established under Regulation (EU) No 1215/2012 as regards the Member State in which consumers can pursue their consumer rights. The requirement to provide essential information should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary and no longer than six months after the end of a relationship with the trader, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a direct legitimate interest, including through the orders to provide information referred to in this Regulation. |
Amendment 55
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should, before allowing the display of the products or services on their online interfaces, make reasonable efforts to assess the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the best efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
__________________ |
__________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
Amendment 56
Proposal for a regulation
Recital 50 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(50a) Online platforms that allow consumers to conclude distance contracts with traders should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services, in compliance with the principle that no general monitoring obligation may be imposed. Online platforms covered should inform recipients when the service or product they have acquired through their services is illegal. |
Amendment 57
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. New advertising models have generated changes in the way information is presented and have created new personal data collection patterns and business models that might affect privacy, personal autonomy, democracy and quality news reporting, and facilitate manipulation and discrimination. Therefore, more transparency is needed in online advertising markets, and independent research needs to be carried out to assess the effectiveness of behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed, as well as the natural or legal person who finances the advertisement. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. In addition to these information obligations, online platforms should ensure that recipients of the service can refuse or withdraw their consent for targeted advertising purposes, in accordance with Regulation (EU) 2016/679, in a way that is no more difficult or time-consuming than giving their consent. Online platforms should also not use personal data for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising of minors. The online platform should not be obliged to maintain, acquire or process additional information in order to assess the age of the recipient of the service. |
Amendment 58
Proposal for a regulation
Recital 52 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(52a) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients can understand how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. |
Amendment 59
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no proportionate alternative and less restrictive measures that would effectively achieve the same result. |
Amendment 60
Proposal for a regulation
Recital 54
|
|
Text proposed by the Commission |
Amendment |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. Accordingly, the number of average monthly recipients of the service should reflect the recipients actually reached by the service either by being exposed to content or by providing content disseminated on the platforms’ interface in that period of time. |
Amendment 61
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures where mitigation is possible without adversely impacting fundamental rights. |
Amendment 62
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Four categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination and amplification of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous and counterfeit products and illegally traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the actual and foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, freedom of the press, human dignity, the right to private life, the right to gender equality, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. A fourth category of risks concerns any actual and foreseeable negative effects on the protection of public health, including behavioural addictions due to excessive use of a service or other serious negative effects to the person's physical, mental, social and financial well-being. |
Amendment 63
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment where mitigation is possible without adversely impacting fundamental rights. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and of content that is incompatible with their terms and conditions. They should also consider mitigation measures in case of malfunctioning or intentional manipulation and exploitation of the service, or in case of risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach of their terms and conditions or any other content having negative effects, by adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the recipients of the service. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. The decision as to the choice of measures should remain with the very large online platform. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. The Commission should evaluate the implementation and effectiveness of the mitigating measures and issue recommendations when the measures implemented are deemed inappropriate or ineffective to address the systemic risk at stake. |
Amendment 64
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations. |
Amendment 65
Proposal for a regulation
Recital 60
|
|
Text proposed by the Commission |
Amendment |
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through external independent auditing, for their compliance with the obligations laid down by this Regulation. In particular, audits should assess the clarity, coherence and predictable enforcement of terms of service, the completeness, methodology and consistency of the transparency reporting obligations, the accuracy, predictability and clarity of the provider's follow-up for recipients of the service and notice providers regarding notices of illegal content and terms of service violations, the accuracy of classification of removed information, the internal complaint handling mechanism, the interaction with trusted flaggers and assessment of their accuracy, the diligence with regard to the verification of the traceability of traders, the adequateness and correctness of the risk assessment, the adequateness and effectiveness of the risk mitigation measures taken and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the vetted auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Vetted auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. This guarantee should not be a means to circumvent the audit obligations of this Regulation applicable to very large online platforms. Auditors should be legally and financially independent and should not have a conflict of interest involving the very large online platform concerned and other very large online platforms, so as to be able to perform their tasks in an adequate and trustworthy manner. Additionally, vetted auditors and their employees should not have provided any service to the very large online platform audited for 12 months before the audit. They should also commit not to work for the very large online platform audited or a professional organisation or business association of which the platform is a member for 12 months after their position in the auditing organisation has ended. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
Amendment 66
Proposal for a regulation
Recital 61
|
|
Text proposed by the Commission |
Amendment |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. Where the auditor could not reach a conclusion on specific elements falling within the scope of the audit, the audit opinion should include a statement of reasons for the failure to reach such a conclusion. |
Amendment 67
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should let the recipients decide whether they want to be subject to recommender systems based on profiling and ensure that there is an option which is not based on profiling. In addition, online platforms should ensure that recipients are appropriately informed of the use of recommender systems, and that they can influence the information presented to them by making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible and user-friendly manner to ensure that the recipients understand how information is prioritised for them, why that is so, and how to modify the parameters used to curate the content presented to them. Very large online platforms should implement appropriate technical and organisational measures for ensuring that recommender systems are designed in a consumer-friendly manner and do not influence end users’ behaviour through dark patterns. |
Amendment 68
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements, including the name of the product, service or brand and the object of the advertisement, and related data on the advertiser, and, if different, the natural or legal person who paid for the advertisement, and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files. |
Amendment 69
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data and algorithms. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by vetted researchers and vetted not-for-profit bodies, organisations or associations on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers and vetted not-for-profit bodies, organisations or associations. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including personal data, trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. Vetted researchers, not-for-profit bodies, organisations or associations should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks. |
Amendment 70
Proposal for a regulation
Recital 66
|
|
Text proposed by the Commission |
Amendment |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, about the interoperability of advertisement repositories, or about terms and conditions. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. In the absence of relevant standards agreed within [24 months after the entry into force of this Regulation], the Commission should be able to establish technical specifications by implementing acts until a voluntary standard is agreed. |
Amendment 71
Proposal for a regulation
Recital 67
|
|
Text proposed by the Commission |
Amendment |
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
(67) The Commission and the Board should encourage the drawing-up of codes of conduct, as well as compliance with the provisions of those codes, in order to contribute to the application of this Regulation. The Commission and the Board should aim to ensure that the codes of conduct clearly define the nature of the public interest objectives being addressed, that they contain mechanisms for independent evaluation of the achievement of these objectives and that the role of competent authorities is clearly defined. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
Amendment 72
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. |
Amendment 73
Proposal for a regulation
Recital 69
|
|
Text proposed by the Commission |
Amendment |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan. |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. The Commission should also encourage the development of codes of conduct to facilitate compliance with obligations in areas such as the protection of minors or short-term rental. Other areas for consideration could be to promote diversity of information through support of high-quality journalism and to foster credibility of information, whilst respecting confidentiality of journalistic sources. Moreover, it is important to ensure consistency with already existing enforcement mechanisms, such as those in the area of electronic communications or media, and with independent regulatory structures in these fields as defined by Union and national law. |
Amendment 74
Proposal for a regulation
Recital 70
|
|
Text proposed by the Commission |
Amendment |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. The effectiveness of the codes of conduct should be regularly assessed. Unlike legislation, codes of conduct are not subject to democratic scrutiny and their compliance with fundamental rights is not subject to judicial review. In order to enhance accountability, participation and transparency, procedural safeguards for drawing up codes of conduct are needed. Before initiating or facilitating the drawing-up or the revision of codes of conduct, the Commission may invite, where appropriate, the Fundamental Rights Agency or the European Data Protection Supervisor to express their opinion. |
Amendment 75
Proposal for a regulation
Recital 71
|
|
Text proposed by the Commission |
Amendment |
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
Amendment 76
Proposal for a regulation
Recital 72
|
|
Text proposed by the Commission |
Amendment |
(72) The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should appoint at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure. |
(72) The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should designate at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure. |
Amendment 77
Proposal for a regulation
Recital 73
|
|
Text proposed by the Commission |
Amendment |
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level. |
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be designated as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level. |
Amendment 78
Proposal for a regulation
Recital 74
|
|
Text proposed by the Commission |
Amendment |
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. |
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities have the necessary financial and human resources to carry out their tasks under this Regulation. It is also necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. |
Amendment 79
Proposal for a regulation
Recital 75
|
|
Text proposed by the Commission |
Amendment |
(75) Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to apply and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members. |
(75) Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to supervise the application and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members. |
Amendment 80
Proposal for a regulation
Recital 76
|
|
Text proposed by the Commission |
Amendment |
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. |
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in this Regulation by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. |
Amendment 81
Proposal for a regulation
Recital 77
|
|
Text proposed by the Commission |
Amendment |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to adopt proportionate interim measures in case of risk of serious harm, as well as to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. |
Amendment 82
Proposal for a regulation
Recital 78
|
|
Text proposed by the Commission |
Amendment |
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. |
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure consistent and uniform application of this Regulation, the Commission should adopt guidance on the rules and procedures related to the powers of Digital Services Coordinators. |
Amendment 83
Proposal for a regulation
Recital 79
|
|
Text proposed by the Commission |
Amendment |
(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States. |
(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States. |
Amendment 84
Proposal for a regulation
Recital 80
|
|
Text proposed by the Commission |
Amendment |
(80) Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, whether the provider is active in several Member States. |
(80) Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, the number of recipients affected, the intentional or negligent character of the infringement and whether the provider is active in several Member States. The Commission should issue guidance to Member States concerning the criteria and conditions to impose proportionate penalties. |
Amendment 85
Proposal for a regulation
Recital 81
|
|
Text proposed by the Commission |
Amendment |
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. |
(81) In order to ensure effective enforcement of the obligations laid down in this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. The Digital Services Coordinator of establishment should assess the complaint in a timely manner and inform the Digital Services Coordinator of the Member State where the recipient resides or is established of how the complaint has been handled. |
Amendment 86
Proposal for a regulation
Recital 82
|
|
Text proposed by the Commission |
Amendment |
(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services that are being used by a third party to infringe an intellectual property right, are not reasonably available. |
(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements of this Regulation. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services that are being used by a third party to infringe an intellectual property right, are not reasonably available. |
Amendment 87
Proposal for a regulation
Recital 83 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(83a) Without prejudice to the provisions on the exemption from liability provided for in this Regulation as regards the information transmitted or stored at the request of a recipient of the service, providers of intermediary services should be liable for the infringement of their obligations laid down in this Regulation. Recipients of the service and organisations representing them should be entitled to have access to proportionate and effective remedies. They should in particular have the right to seek, in accordance with national or Union law, compensation from those providers of intermediary services for any direct damage or loss suffered due to an infringement of the obligations established under this Regulation. |
Amendment 88
Proposal for a regulation
Recital 84
|
|
Text proposed by the Commission |
Amendment |
(84) The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State. |
(84) The Digital Services Coordinator should regularly publish a report in a standardised and machine-readable format on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, based on the Internal Market Information system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State. |
Amendment 89
Proposal for a regulation
Recital 86
|
|
Text proposed by the Commission |
Amendment |
(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved. |
(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation on the basis of an agreement between the Member States concerned and, in the absence of agreement, under the authority of the Digital Services Coordinator of the Member State of establishment. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved. |
Amendment 90
Proposal for a regulation
Recital 88
|
|
Text proposed by the Commission |
Amendment |
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. |
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. The rules of procedure of the Board should ensure that the confidentiality of the information is respected. |
Amendment 91
Proposal for a regulation
Recital 90
|
|
Text proposed by the Commission |
Amendment |
(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While those opinions, requests and recommendations are not legally binding, any decision to deviate from them should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation. |
(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While those opinions, requests and recommendations are not legally binding, any decision to deviate from them should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation. The Board should draw up an annual report regarding its activities. |
Amendment 92
Proposal for a regulation
Recital 91
|
|
Text proposed by the Commission |
Amendment |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, eradication of all forms of violence against women and girls and other forms of gender-based violence, data protection, respect for intellectual property, competition, electronic communications, audiovisual services, market surveillance, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
Amendment 93
Proposal for a regulation
Recital 96
|
|
Text proposed by the Commission |
Amendment |
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform. |
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission should, on its own initiative or upon advice of the Board, initiate further investigation into the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform. The Commission should initiate proceedings in view of the possible adoption of decisions in respect of the relevant conduct by the very large online platform, for example where that platform is suspected of having infringed this Regulation, including where the platform has been found not to implement the operational recommendations from the independent audit that have been endorsed by the Digital Services Coordinator of establishment and where the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures. |
Amendment 94
Proposal for a regulation
Recital 97
|
|
Text proposed by the Commission |
Amendment |
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission has initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
(97) Once the Commission has initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
Amendment 95
Proposal for a regulation
Recital 97 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(97a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation. |
Amendment 96
Proposal for a regulation
Recital 99
|
|
Text proposed by the Commission |
Amendment |
(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or than individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers |
(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. |
Amendment 97
Proposal for a regulation
Recital 100
|
|
Text proposed by the Commission |
Amendment |
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. |
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. The Commission should in particular ensure that the penalties are effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and nature of activities carried out, the number of recipients affected, the intentional or negligent character of the infringement as well as the economic capacity of the infringer. |
Amendment 98
Proposal for a regulation
Recital 102
|
|
Text proposed by the Commission |
Amendment |
(102) In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure. |
(102) The Commission should carry out a general evaluation of this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. This report should address in particular the definition of very large online platforms and the number of average monthly active recipients of the service. This report should also address the implementation of codes of conduct, as well as the obligation to designate a representative established in the Union, and assess the effect of similar obligations imposed by third countries on European service providers operating abroad. In particular, the Commission should assess any impact of the costs to European service providers of any similar requirements, including to designate a legal representative, introduced by third countries and any new barriers to non-Union market access after the adoption of this Regulation. The Commission should also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union. In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within three years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure. |
Amendment 99
Proposal for a regulation
Article 1 – title
|
|
Text proposed by the Commission |
Amendment |
Subject matter and scope |
Subject matter |
Amendment 100
Proposal for a regulation
Article 1 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities. |
(c) rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities. |
Amendment 101
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
Amendment 102
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) promote a high level of consumer protection and contribute to increased consumer choice while facilitating innovation, support digital transition and encourage economic growth within the internal market. |
Amendment 103
Proposal for a regulation
Article 1 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. |
deleted |
Amendment 104
Proposal for a regulation
Article 1 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service. |
deleted |
Amendment 105
Proposal for a regulation
Article 1 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. This Regulation is without prejudice to the rules laid down by the following: |
deleted |
(a) Directive 2000/31/EC; |
|
(b) Directive 2010/13/EC; |
|
(c) Union law on copyright and related rights; |
|
(d) Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted]; |
|
(e) Regulation (EU) …./…. on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./…. laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted]; |
|
(f) Regulation (EU) 2019/1148; |
|
(g) Regulation (EU) 2019/1150; |
|
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394; |
|
(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. |
|
Amendment 106
Proposal for a regulation
Article 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 1a |
|
Scope |
|
1. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. |
|
2. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service. |
|
3. This Regulation is without prejudice to the rules laid down by the following: |
|
(a) Directive 2000/31/EC; |
|
(b) Directive 2010/13/EC; |
|
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market; |
|
(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online; |
|
(e) Regulation (EU) …./…. on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./…. laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted]; |
|
(f) Regulation (EU) 2019/1148; |
|
(g) Regulation (EU) 2019/1150; |
|
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Directive 2001/95/EC on general product safety; |
|
(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC; |
|
(j) Directive (EU) 2019/882; |
|
(k) Directive (EU) 2018/1972; |
|
(l) Directive 2013/11/EU. |
|
4. By [12 months after the entry into force of this Regulation], the Commission shall publish guidelines with regard to the relationship between this Regulation and the legal acts referred to in Article 1a(3). |
Amendment 107
Proposal for a regulation
Article 2 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) ‘information society services’ means services within the meaning of Article 1(1)(b) of Directive (EU) 2015/1535; |
(a) ‘information society services’ means services as defined in Article 1(1)(b) of Directive (EU) 2015/1535; |
Amendment 108
Proposal for a regulation
Article 2 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service; |
(b) ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service in order to seek information or to make it accessible; |
Amendment 109
Proposal for a regulation
Article 2 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession; |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession; |
Amendment 110
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
|
|
Text proposed by the Commission |
Amendment |
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as: |
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of a provider of information society services which has a substantial connection to the Union; |
Amendment 111
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a significant number of users in one or more Member States; or |
deleted |
Amendment 112
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
|
|
Text proposed by the Commission |
Amendment |
— the targeting of activities towards one or more Member States. |
deleted |
Amendment 113
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) ‘substantial connection to the Union’ means the connection of a provider with one or more Member States resulting either from its establishment in the Union, or in the absence of such an establishment, from the fact that the provider directs its activities towards one or more Member States; |
Amendment 114
Proposal for a regulation
Article 2 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession; |
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession; |
Amendment 115
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network; |
— a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services; |
Amendment 116
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 2
|
|
Text proposed by the Commission |
Amendment |
— a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request; |
— a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request; |
Amendment 117
Proposal for a regulation
Article 2 – paragraph 1 – point g
|
|
Text proposed by the Commission |
Amendment |
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
(g) ‘illegal content’ means any information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
Amendment 118
Proposal for a regulation
Article 2 – paragraph 1 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation. |
Amendment 119
Proposal for a regulation
Article 2 – paragraph 1 – point k
|
|
Text proposed by the Commission |
Amendment |
(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications; |
(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enables the recipients of the service to access and interact with the relevant intermediary service; |
Amendment 120
Proposal for a regulation
Article 2 – paragraph 1 – point k a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ka) ‘trusted flagger’ means an entity that has been awarded such status by a Digital Services Coordinator; |
Amendment 121
Proposal for a regulation
Article 2 – paragraph 1 – point n
|
|
Text proposed by the Commission |
Amendment |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; |
(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that message; |
Amendment 122
Proposal for a regulation
Article 2 – paragraph 1 – point n a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(na) ‘remuneration’ means economic compensation consisting of direct or indirect payment for the service provided, including where the intermediary service provider is not directly compensated by the recipient of the service or where the recipient of the service provides data to the service provider, except where such data is collected for the sole purpose of meeting legal requirements; |
Amendment 123
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or curate in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
Amendment 124
Proposal for a regulation
Article 2 – paragraph 1 – point p
|
|
Text proposed by the Commission |
Amendment |
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
(p) ‘content moderation’ means the activities, either automated or not, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
Amendment 125
Proposal for a regulation
Article 2 – paragraph 1 – point q
|
|
Text proposed by the Commission |
Amendment |
(q) ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services. |
(q) ‘terms and conditions’ means all terms and conditions or specifications set by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services. |
Amendment 126
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882. |
Amendment 127
Proposal for a regulation
Article 3 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
3. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 128
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that: |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or secure the information's onward transmission to other recipients of the service upon their request, on condition that the provider: |
Amendment 129
Proposal for a regulation
Article 4 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the provider does not modify the information; |
(a) does not modify the information; |
Amendment 130
Proposal for a regulation
Article 4 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the provider complies with conditions on access to the information; |
(b) complies with conditions on access to the information; |
Amendment 131
Proposal for a regulation
Article 4 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; |
(c) complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; |
Amendment 132
Proposal for a regulation
Article 4 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and |
(d) does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and |
Amendment 133
Proposal for a regulation
Article 4 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
(e) acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
Amendment 134
Proposal for a regulation
Article 4 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
2. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 135
Proposal for a regulation
Article 5 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
Amendment 136
Proposal for a regulation
Article 5 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
4. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 137
Proposal for a regulation
Article 6 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
1. Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or take measures aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of national and Union law, including the Charter and the requirements set out in this Regulation. |
Amendment 138
Proposal for a regulation
Article 6 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Providers of intermediary services shall ensure that voluntary own-initiative investigations carried out and measures taken pursuant to paragraph 1 shall be effective and specific. Such own-initiative investigations and measures shall be accompanied by appropriate safeguards, such as human oversight, documentation, or any additional measure to ensure and demonstrate that those investigations and measures are accurate, non-discriminatory, proportionate, transparent and do not lead to over-removal of content. Providers of intermediary services shall make best efforts to ensure that where automated means are used, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content. |
Amendment 139
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
1. No general obligation to monitor, neither de jure, nor de facto, through automated or non-automated means, the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity, nor to monitor the behaviour of natural persons, shall be imposed on those providers. |
Amendment 140
Proposal for a regulation
Article 7 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Providers of intermediary services shall not be obliged to use automated tools for content moderation or for monitoring the behaviour of natural persons. |
Amendment 141
Proposal for a regulation
Article 7 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Member States shall not prevent providers of intermediary services from offering end-to-end encrypted services. |
Amendment 142
Proposal for a regulation
Article 7 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. Member States shall not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. Member States shall not oblige providers of intermediary services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or national law. |
Amendment 143
Proposal for a regulation
Article 8 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
1. Providers of intermediary services shall, upon the receipt via a secure communications channel of an order to act against one or more specific items of illegal content, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the actions taken and the moment when the actions were taken. |
Amendment 144
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a reference to the legal basis for the order; |
Amendment 145
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed; |
— a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law in conformity with Union law; |
Amendment 146
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— identification of the issuing authority, including the date, timestamp and electronic signature of the authority, allowing the recipient to authenticate the order, and the contact details of a contact person within that authority; |
Amendment 147
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
— one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned; |
— a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate, or, where the exact electronic location is not precisely identifiable, additional information enabling the identification of the illegal content concerned; |
Amendment 148
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider of the service and to the recipient of the service who provided the content; |
— easily understandable information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including the deadlines for appeal; |
Amendment 149
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— where necessary and proportionate, the decision not to disclose information about the removal of or disabling of access to the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, not exceeding six weeks from that decision; |
Amendment 150
Proposal for a regulation
Article 8 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; |
(b) the territorial scope of the order on the basis of the applicable rules of Union and national law in conformity with Union law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; the territorial scope of the order shall be limited to the territory of the Member State issuing the order unless the illegality of the content derives directly from Union law or the rights at stake require a wider territorial scope, in accordance with Union and international law; |
Amendment 151
Proposal for a regulation
Article 8 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10 or in one of the official languages of the Member State that issues the order against the specific item of illegal content; in such case, the point of contact of the service provider may request the competent authority to provide translation into the language declared by the provider; |
Amendment 152
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the order is in compliance with Article 3 of Directive 2000/31/EC; |
Amendment 153
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cb) where more than one provider of intermediary services is responsible for hosting the specific items of illegal content, the order is issued to the most appropriate provider that has the technical and operational ability to act against those specific items. |
Amendment 154
Proposal for a regulation
Article 8 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1. |
Amendment 155
Proposal for a regulation
Article 8 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Providers of intermediary services who received an order shall have a right to an effective remedy. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal processes in relation to the order. |
|
The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order or adjust the territorial scope of the order to what is strictly necessary. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulment, cessation or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay. |
Amendment 156
Proposal for a regulation
Article 8 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the judicial or administrative authority that issued the order asking for the necessary clarification. |
Amendment 157
Proposal for a regulation
Article 8 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. The authority issuing the order shall transmit that order and the information received from the provider of intermediary services as to the effect given to the order to the Digital Services Coordinator from the Member State of the issuing authority. |
Amendment 158
Proposal for a regulation
Article 8 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative procedural law in conformity with Union law, including the Charter. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued. |
Amendment 159
Proposal for a regulation
Article 8 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Member States shall ensure that the relevant authorities may, at the request of an applicant whose rights are infringed by illegal content, issue against the relevant provider of intermediary services an injunction order in accordance with this Article to remove or disable access to that content. |
Amendment 160
Proposal for a regulation
Article 9 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. |
1. Providers of intermediary services shall, upon receipt via a secure communications channel of an order to provide a specific item of information about one or more specific individual recipients of the service, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. |
Amendment 161
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
— the identification details of the judicial or administrative authority issuing the order and authentication of the order by that authority, including the date, time stamp and electronic signature of the authority issuing the order to provide information; |
Amendment 162
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a reference to the legal basis for the order; |
Amendment 163
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a clear indication of the exact electronic location, an account name, or a unique identifier of the recipient on whom information is sought; |
Amendment 164
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences; |
— a sufficiently detailed statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences; |
Amendment 165
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— where the information sought constitutes personal data within the meaning of Article 4, point (1), of Regulation (EU) 2016/679 or Article 3, point (1), of Directive (EU) 2016/680, a justification that the order is in accordance with applicable data protection law; |
Amendment 166
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider and to the recipients of the service concerned; |
— information about redress available to the provider and to the recipients of the service concerned, including deadlines for appeal; |
Amendment 167
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— an indication on whether the provider should inform without undue delay the recipient of the service concerned, including information about the data being sought; where information is requested in the context of criminal proceedings, the request for that information shall be in compliance with Directive (EU) 2016/680, and the information to the recipient of the service concerned about that request may be delayed as long as necessary and proportionate to avoid obstructing the relevant criminal proceedings, taking into account the rights of the suspected and accused persons and without prejudice to defence rights and effective legal remedies. Such a request shall be duly justified, specify the duration of the obligation of confidentiality and shall be subject to periodic review. |
Amendment 168
Proposal for a regulation
Article 9 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10; |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10 or in one of the official languages of the Member State that issues the order against the item of illegal content; in such case, the point of contact may request the competent authority to provide translation into the language declared by the provider; |
Amendment 169
Proposal for a regulation
Article 9 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1. |
Amendment 170
Proposal for a regulation
Article 9 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. The provider of intermediary services who received an order shall have a right to an effective remedy. That right shall include the right to challenge the order before the judicial authorities of the Member State of the issuing competent authority, in particular where such an order is not in compliance with Article 3 of Directive 2000/31/EC. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal proceedings in relation to the order. |
|
The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulment, cessation or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay. |
Amendment 171
Proposal for a regulation
Article 9 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. If the provider cannot comply with the order because it contains manifest errors or does not contain sufficient information to enable it to be executed, it shall, without undue delay, inform the judicial or administrative authority that issued that information order and request the necessary clarifications. |
Amendment 172
Proposal for a regulation
Article 9 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. The authority issuing the order to provide a specific item of information shall transmit that order and the information received from the provider of intermediary services as to the effect given to the order to the Digital Services Coordinator from the Member State of the issuing authority. |
Amendment 173
Proposal for a regulation
Article 9 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law or administrative procedural law in conformity with Union law. |
Amendment 174
Proposal for a regulation
Article 9 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 9a |
|
Effective remedies for recipients of the service |
|
1. Recipients of the service whose content was removed according to Article 8 or whose information was sought according to Article 9 shall have the right to effective remedies against such orders, including, where applicable, restoration of content where such content was in compliance with the terms and conditions but was erroneously considered illegal by the service provider, without prejudice to remedies available under Directive (EU) 2016/680 and Regulation (EU) 2016/679. |
|
2. Such right to an effective remedy shall be exercised before a judicial authority in the issuing Member State in accordance with national law and shall include the possibility to challenge the legality of the measure, including its necessity and proportionality. |
|
3. Digital Services Coordinators shall develop national tools and guidance for recipients of the service as regards complaint and redress mechanisms applicable in their respective territory. |
Amendment 175
Proposal for a regulation
Chapter III – title
|
|
Text proposed by the Commission |
Amendment |
Due diligence obligations for a transparent and safe online environment |
Due diligence obligations for a transparent, accessible and safe online environment |
Amendment 176
Proposal for a regulation
Article 10 – title
|
|
Text proposed by the Commission |
Amendment |
Points of contact |
Points of contact for Member States’ authorities, the Commission and the Board |
Amendment 177
Proposal for a regulation
Article 10 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation. |
1. Providers of intermediary services shall designate a single point of contact enabling them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation. |
Amendment 178
Proposal for a regulation
Article 10 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. |
2. Providers of intermediary services shall communicate to the Member States' authorities, the Commission and the Board, the information necessary to easily identify and communicate with their single points of contact, including the name, the email address, the physical address and the telephone number, and shall ensure that the information is kept up to date. |
Amendment 179
Proposal for a regulation
Article 10 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision. |
Amendment 180
Proposal for a regulation
Article 10 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 10a |
|
Points of contact for recipients of services |
|
1. Providers of intermediary services shall designate a single point of contact that enables recipients of services to communicate directly with them. |
|
2. In particular, providers of intermediary services shall enable recipients of services to communicate with them by providing rapid, direct and efficient means of communication such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging, as well as the physical address of the establishment of the provider of intermediary services, in a user-friendly and easily accessible manner. Providers of intermediary services shall also enable recipients of services to choose the means of direct communication, which shall not solely rely on automated tools. |
|
3. Providers of intermediary services shall make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that the communication referred to in paragraph 1 is performed in a timely and efficient manner. |
Amendment 181
Proposal for a regulation
Article 11 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services. |
Amendment 182
Proposal for a regulation
Article 11 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resource to cooperate with the Member States’ authorities, the Commission and the Board and comply with those decisions. |
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and sufficient resources in order to guarantee their efficient and timely cooperation with the Member States’ authorities, the Commission and the Board and comply with any of those decisions. |
Amendment 183
Proposal for a regulation
Article 11 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of intermediary services shall notify the name, address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. |
4. Providers of intermediary services shall notify the name, postal address, the electronic mail address and telephone number of their legal representative to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is kept up to date. The Digital Services Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity. |
Amendment 184
Proposal for a regulation
Article 11 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. This Article shall be without prejudice to the fact that, in accordance with national law, a legal representative may be mandated by more than one provider of intermediary services, provided that such providers qualify as micro, small or medium sized enterprises as defined in Recommendation 2003/361/EC. |
Amendment 185
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall use fair, non-discriminatory and transparent terms and conditions. Providers of intermediary services shall draft those terms and conditions in clear, user-friendly and unambiguous language and shall make them publicly available in an easily accessible and machine-readable format in the official languages of the Member States towards which the service is directed. |
Amendment 186
Proposal for a regulation
Article 12 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. In their terms and conditions, providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of content provided by the recipients of the service. Providers of intermediary services shall also include easily accessible information on the right of the recipients to terminate the use of their service. Providers of intermediary services shall also include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. |
Amendment 187
Proposal for a regulation
Article 12 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Providers of intermediary services shall notify expeditiously the recipients of the service of any significant change to the terms and conditions and provide an explanation thereof. |
Amendment 188
Proposal for a regulation
Article 12 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider shall explain the conditions for and restrictions on the use of the service in a way that minors can understand. |
Amendment 189
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a fair, transparent, coherent, diligent, timely, non-arbitrary, non-discriminatory and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 190
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services shall provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, in clear, user-friendly and unambiguous language. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out of optional clauses and the remedies and redress mechanisms available. |
Amendment 191
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Providers of intermediary services may use graphical elements such as icons or images to illustrate the main elements of the information requirements. |
Amendment 192
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Very large online platforms as defined in Article 25 shall publish their terms and conditions in the official languages of all Member States in which they offer their services. |
Amendment 193
Proposal for a regulation
Article 12 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service. |
Amendment 194
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish in a standardised and machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
Amendment 195
Proposal for a regulation
Article 13 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the order; |
Amendment 196
Proposal for a regulation
Article 13 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) where applicable, the complete number of content moderators allocated for each official language per Member State, and a qualitative description of whether and how automated tools for content moderation are used in each official language; |
Amendment 197
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action; providers of intermediary services may add additional information as to the reasons for the average time for taking the action; |
Amendment 198
Proposal for a regulation
Article 13 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; |
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as, where applicable, measures taken to provide training and assistance to members of staff who are engaged in content moderation, and to ensure that non-infringing content is not affected; |
Amendment 199
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed. |
Amendment 200
Proposal for a regulation
Article 13 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The information provided shall be presented per Member State in which services are offered and in the Union as a whole. |
Amendment 201
Proposal for a regulation
Article 13 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, which do not also qualify as very large online platforms. |
Amendment 202
Proposal for a regulation
Article 13 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 13a |
|
Online interface design and organisation |
|
1. Providers of intermediary services shall not use the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice. In particular, providers of intermediary services shall refrain from: |
|
(a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision; |
|
(b) repeatedly requesting that a recipient of the service consents to data processing, where such consent has been refused, pursuant to Article 7(3) of Regulation (EU) 2016/679, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience; |
|
(c) urging a recipient of the service to change a setting or configuration of the service after the recipient has already made a choice; |
|
(d) making the procedure of terminating a service significantly more cumbersome than signing up to it; or |
|
(e) requesting consent where the recipient of the service exercises his or her right to object by automated means using technical specifications, in line with Article 21(5) of Regulation (EU) 2016/679. |
|
This paragraph shall be without prejudice to Regulation (EU) 2016/679. |
|
2. The Commission is empowered to adopt a delegated act to update the list of practices referred to in paragraph 1. |
|
3. Where applicable, providers of intermediary services shall adapt their design features to ensure a high level of privacy, safety, and security by design for minors. |
Amendment 203
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements: |
Amendment 204
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) where possible, evidence that substantiates the claim; |
Amendment 205
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) where relevant, a clear indication of the exact electronic location of that information, for example, the exact URL or URLs, or, where necessary, additional information enabling the identification of the illegal content as applicable to the type of content and to the specific type of hosting service; |
Amendment 206
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that include the elements referred to in paragraph 2, on the basis of which a diligent hosting service provider is able to establish the illegality of the content in question without conducting a legal or factual examination, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
Amendment 207
Proposal for a regulation
Article 14 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Information that has been the subject of a notice shall remain accessible while the assessment of its legality is still pending, without prejudice to the right of providers of hosting services to apply their terms and conditions. Providers of hosting services shall not be held liable for failure to remove notified information, while the assessment of legality is still pending. |
Amendment 208
Proposal for a regulation
Article 14 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity. |
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity. |
Amendment 209
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities. |
Amendment 210
Proposal for a regulation
Article 14 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. The anonymity of individuals who submitted a notice shall be ensured towards the recipient of the service who provided the content, except in cases of alleged violations of personality rights or of intellectual property rights. |
Amendment 211
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator. |
Amendment 212
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides to remove, disable access to, demote or impose other measures with regard to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
|
This obligation shall not apply where the content is deceptive high-volume commercial content, or where a judicial or law enforcement authority has requested the provider not to inform the recipient due to an ongoing criminal investigation, until that investigation is closed. |
Amendment 213
Proposal for a regulation
Article 15 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; |
(a) whether the action entails the removal of, the disabling of access to, the demotion of, or the imposition of other measures with regard to the information and, where relevant, the territorial scope of the action and its duration, including, where an action was taken pursuant to Article 14, an explanation of why the action did not exceed what was strictly necessary to achieve its purpose; |
Amendment 214
Proposal for a regulation
Article 15 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the facts and circumstances relied on in taking the decision, including where relevant whether the decision was taken pursuant to a notice submitted in accordance with Article 14; |
(b) the facts and circumstances relied on in taking the action, including, where relevant, whether the action was taken pursuant to a notice submitted in accordance with Article 14, on the basis of voluntary own-initiative investigations or pursuant to an order issued in accordance with Article 8, and, where appropriate, the identity of the notifier; |
Amendment 215
Proposal for a regulation
Article 15 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means; |
(c) where applicable, information on the use made of automated means in taking the action, including where the action was taken in respect of content detected or identified using automated means; |
Amendment 216
Proposal for a regulation
Article 15 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground; |
(d) where the action concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground; |
Amendment 217
Proposal for a regulation
Article 15 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; |
(e) where the action is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; |
Amendment 218
Proposal for a regulation
Article 15 – paragraph 2 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. |
(f) clear, user-friendly information on the redress possibilities available to the recipient of the service in respect of the action, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. |
Amendment 219
Proposal for a regulation
Article 15 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data. |
4. Providers of hosting services shall publish, at least once a year, the actions and the statements of reasons referred to in paragraph 1 in a publicly accessible, machine-readable database managed and published by the Commission. That information shall not contain personal data. |
Amendment 220
Proposal for a regulation
Article 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15 a |
|
Notification of suspicions of criminal offences |
|
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving an imminent threat to the life or safety of persons has taken place, is taking place or is planned to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide, upon their request, all the relevant information available. |
|
2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and may inform Europol. |
|
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be planned to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. For the purpose of this Article, Member States shall notify to the Commission the list of their competent law enforcement or judicial authorities. |
|
3. Unless instructed otherwise by the informed authority, the provider of hosting services shall remove or disable access to the content. |
|
4. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified. |
|
5. The Commission shall adopt an implementing act laying down a template for notifications under paragraph 1. |
Amendment 221
Proposal for a regulation
Article 16 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
1. This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which do not qualify as very large online platforms as defined in Article 25 of this Regulation. |
|
2. Providers of intermediary services may submit an application, accompanied by a justification, for a waiver from the requirements of this Section, provided that they:
|
|
(a) do not present significant systemic risks and have limited exposure to illegal content; and |
|
(b) qualify as not-for-profit or as a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC. |
|
3. The application shall be submitted to the Digital Services Coordinator of establishment, who shall conduct a preliminary assessment. The Digital Services Coordinator of establishment shall transmit to the Commission the application accompanied by its assessment and, where applicable, a recommendation on the Commission’s decision. The Commission shall examine such an application and, after consulting the Board, may issue a total or a partial waiver from the requirements of this Section. |
|
4. Where the Commission grants such a waiver, it shall monitor the use of the waiver by the provider of intermediary services to ensure that the conditions for use of the waiver are respected. |
|
5. Upon the request of the Board, the Digital Services Coordinator of establishment or the provider, or on its own initiative, the Commission may review or revoke the waiver in whole or in part. |
|
6. The Commission shall maintain a list of all waivers issued and their conditions and shall make the list publicly available. |
|
7. The Commission shall be empowered to adopt a delegated act in accordance with Article 69 as to the process and procedure for the implementation of the waiver system in relation to this Article. |
Amendment 222
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove, demote, disable access to or impose other measures that restrict visibility, availability or accessibility of the information; |
Amendment 223
Proposal for a regulation
Article 17 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients; |
(b) decisions to suspend, terminate or limit the provision of the service, in whole or in part, to the recipients; |
Amendment 224
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) decisions to restrict the ability to monetise content provided by the recipients. |
Amendment 225
Proposal for a regulation
Article 17 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The period of at least six months as set out in paragraph 1 shall be considered to start on the day on which the recipient of the service is informed about the decision in accordance with Article 15. |
Amendment 226
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly, including for persons with disabilities and minors, non-discriminatory and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner. |
Amendment 227
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner and within ten working days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
Amendment 228
Proposal for a regulation
Article 17 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. |
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Online platforms shall ensure that decisions are taken by qualified staff. |
Amendment 229
Proposal for a regulation
Article 17 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. Recipients of the service shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. |
Amendment 230
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
1. Recipients of the service addressed by the decisions referred to in Article 17(1), taken by the online platform on the grounds that the information provided by the recipients is illegal content or incompatible with its terms and conditions, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. |
Amendment 231
Proposal for a regulation
Article 18 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Both parties shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. The possibility to select any out-of-court dispute settlement body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner. |
Amendment 232
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions: |
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum period of three years, which may be renewed, where the body and the persons in charge of the out-of-court dispute settlement body have demonstrated that it meets all of the following conditions: |
Amendment 233
Proposal for a regulation
Article 18 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms; |
(a) it is independent, including financially independent, and impartial towards online platforms, recipients of the service provided by the online platforms and towards individuals or entities that have submitted notices; |
Amendment 234
Proposal for a regulation
Article 18 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) its members are remunerated in a way that is not linked to the outcome of the procedure; |
Amendment 235
Proposal for a regulation
Article 18 – paragraph 2 – point b b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(bb) the natural persons in charge of dispute resolution commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended, and have not worked for such an organisation for two years prior to taking up this role; |
Amendment 236
Proposal for a regulation
Article 18 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the dispute settlement is easily accessible through electronic communication technology; |
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online; |
Amendment 237
Proposal for a regulation
Article 18 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are clearly visible and easily and publicly accessible. |
Amendment 238
Proposal for a regulation
Article 18 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Digital Services Coordinator shall reassess on a yearly basis whether the certified out-of-court dispute settlement body continues to fulfil the conditions referred to in paragraph 2. If this is not the case, the Digital Services Coordinator shall revoke the status of the out-of-court dispute settlement body. |
Amendment 239
Proposal for a regulation
Article 18 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. The Digital Services Coordinator shall draw up a report every two years listing the number of complaints the out-of-court dispute settlement body has received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes. The report shall in particular: |
|
(a) identify best practices of the out-of-court dispute settlement bodies; |
|
(b) report, where appropriate, on any shortcomings, supported by statistics, that hinder the functioning of the out-of-court dispute settlement bodies for both domestic and cross-border disputes; |
|
(c) make recommendations on how to improve the effective and efficient functioning of the out-of-court dispute settlement bodies, where appropriate. |
Amendment 240
Proposal for a regulation
Article 18 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time and no later than 90 calendar days after the date on which the certified body has received the complaint. The procedure shall be considered terminated on the date on which the certified body has made the decision of the out-of-court dispute settlement procedure available. |
Amendment 241
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. |
3. If the body decides the dispute in favour of the recipient of the service or of individuals or entities mandated under Article 68 that have submitted notices, the online platform shall reimburse the recipient or those individuals or entities for any fees and other reasonable expenses that they have paid or are to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, and the body does not find that the recipient acted in bad faith in the dispute, the recipient or the individuals or entities that have submitted notices shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. |
Amendment 242
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. |
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof for online platforms. Out-of-court dispute settlement procedures shall be free of charge or available at a nominal fee for the recipient of the service. |
Amendment 243
Proposal for a regulation
Article 18 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated. |
5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated. |
Amendment 244
Proposal for a regulation
Article 19 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and expeditiously, taking into account due process. |
Amendment 245
Proposal for a regulation
Article 19 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Online platforms shall take the necessary technical and organisational measures to ensure that trusted flaggers can issue correction notices of incorrect removal, restriction or disabling of access to content, or of suspensions or terminations of accounts, and that those notices to restore information are processed and decided upon with priority and without delay. |
Amendment 246
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions: |
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions: |
Amendment 247
Proposal for a regulation
Article 19 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner. |
(c) it carries out its activities for the purposes of submitting notices in an accurate and objective manner. |
Amendment 248
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually; |
Amendment 249
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cb) it publishes, at least once a year, clear, easily comprehensible, detailed and standardised reports on all notices submitted in accordance with Article 14 during the relevant period. Those reports shall list: |
|
- notices categorised by the identity of the provider of hosting services; |
|
- the type of content notified; |
|
- the specific legal provisions allegedly breached by the content notified; |
|
- the action taken by the provider; |
|
- any potential conflicts of interest and sources of funding, and an explanation of the procedures in place to ensure that the trusted flagger retains its independence. |
|
The reports referred to in point (cb) shall be sent to the Commission, which shall make them publicly available. |
Amendment 250
Proposal for a regulation
Article 19 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators shall award the trusted flagger status for a period of two years, upon which the status may be renewed where the trusted flagger concerned continues to meet the requirements of this Regulation. The Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose status they have revoked in accordance with paragraph 6. The Digital Services Coordinator of the Member State of establishment of the platform shall engage in dialogue with platforms and stakeholders for maintaining the accuracy and efficacy of a trusted flagger system. |
Amendment 251
Proposal for a regulation
Article 19 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database in an easily accessible and machine-readable format and keep the database updated. |
Amendment 252
Proposal for a regulation
Article 19 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving that information from the online platform, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. |
Amendment 253
Proposal for a regulation
Article 19 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis information received by third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation carried out without undue delay, either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
Amendment 254
Proposal for a regulation
Article 19 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6. |
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6. |
Amendment 255
Proposal for a regulation
Article 19 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 19 a |
|
Accessibility requirements for online platforms |
|
1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Sections III, IV, VI and VII of Annex I to Directive (EU) 2019/882. |
|
2. Providers of online platforms shall prepare the necessary information in accordance with Annex V to Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in an accessible manner for persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation. |
|
3. Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that they are easy to find, easy to understand, and accessible to persons with disabilities. |
|
4. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services. |
|
5. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. |
|
6. Providers of online platforms shall cooperate with the competent authority, at the request of that authority, on any action taken to bring the service into compliance with the applicable accessibility requirements. |
|
7. Online platforms which are in conformity with harmonised standards or parts thereof derived from Directive (EU) 2019/882, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements. |
|
8. Online platforms which are in conformity with the technical specifications or parts thereof adopted for the Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements. |
Amendment 256
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content, for which the illegality can be established without conducting a legal or factual examination or for which they have received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned. |
Amendment 257
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints that are manifestly unfounded. |
Amendment 258
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following: |
3. When deciding on the suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of the online platform. Those circumstances shall include at least the following: |
Amendment 259
Proposal for a regulation
Article 20 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
(a) the absolute numbers of items of illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
Amendment 260
Proposal for a regulation
Article 20 – paragraph 3 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the intention of the recipient, individual, entity or complainant. |
(d) where identifiable, the intention of the recipient, individual, entity or complainant; |
Amendment 261
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) whether a notice was submitted by an individual user or by an entity or persons with specific expertise related to the content in question or following the use of an automated content recognition system. |
Amendment 262
Proposal for a regulation
Article 20 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where: |
|
(a) there are compelling reasons of law or public policy, including ongoing criminal investigations; |
|
(b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; |
|
(c) a trader has repeatedly offered goods and services that do not comply with Union or national law; |
|
(d) the items removed were related to serious crimes. |
Amendment 263
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Providers of online platforms shall set out, in a clear, user-friendly and detailed manner, with due regard to their obligations under Article 12(2), their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
Amendment 264
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: |
1. Online platforms allowing consumers to conclude distance contracts with traders shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, they have been provided with the following information: |
Amendment 265
Proposal for a regulation
Article 22 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law, including in the area of product safety; |
__________________ |
__________________ |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
Amendment 266
Proposal for a regulation
Article 22 – paragraph 1 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law. |
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against available databases, such as the Union Rapid Alert System for dangerous non-food products (RAPEX); |
Amendment 267
Proposal for a regulation
Article 22 – paragraph 1 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) the type of products or services the trader intends to offer on the online platform. |
Amendment 268
Proposal for a regulation
Article 22 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member States or the Union or through requests to the trader to provide supporting documents from reliable sources. |
2. The online platform allowing consumers to conclude distance contracts with traders shall, upon receiving that information and before allowing the display of the product or service on its online interface, and until the end of the contractual relationship, make best efforts to assess whether the information referred to in points (a) to (fa) of paragraph 1 is reliable and complete. The online platform shall make best efforts to check the information provided by the trader through the use of any freely accessible official online database or online interface made available by an authorised administrator, a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources. |
|
No later than one year after the entry into force of this Regulation, the Commission shall publish the list of online databases and online interfaces mentioned in the paragraph above and keep it up to date. The obligations for online platforms referred to in paragraphs 1 and 2 shall apply with regard to new and existing traders. |
Amendment 269
Proposal for a regulation
Article 22 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The online platform shall make best efforts to identify and prevent the dissemination, by traders using its service, of offers for products or services which do not comply with Union or national law, through measures such as random checks on the products and services offered to consumers, in addition to the obligations referred to in paragraphs 1 and 2 of this Article. |
Amendment 270
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
3. Where the online platform obtains sufficient indications or has reasons to believe that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Amendment 271
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the trader fails to correct or complete that information, the online platform shall suspend the provision of its service to the trader until the request is complied with. |
Where the trader fails to correct or complete that information, the online platform shall swiftly suspend the provision of its service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with. |
Amendment 272
Proposal for a regulation
Article 22 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. If an online platform rejects an application for services or suspends services to a trader, the trader shall have recourse to the mechanisms under Article 17 and Article 43 of this Regulation. |
Amendment 273
Proposal for a regulation
Article 22 – paragraph 3 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
3b. Online platforms allowing consumers to conclude contracts with traders shall ensure that the identity, such as the trademark or logo, of the business user providing content, goods or services is clearly visible alongside the content, goods or services offered. For this purpose, the online platform shall establish a standardised interface for business users. |
Amendment 274
Proposal for a regulation
Article 22 – paragraph 3 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform without delay the online platform of any changes to the information provided. |
Amendment 275
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information no later than six months after the final conclusion of a distance contract. |
Amendment 276
Proposal for a regulation
Article 22 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
6. The online platform shall make the information referred to in points (a), (d), (e), (f) and (fa) of paragraph 1 available to the recipients of the service in a clear, easily accessible and comprehensible manner in accordance with the accessibility requirements of Annex I to Directive (EU) 2019/882. |
Amendment 277
Proposal for a regulation
Article 22 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22a |
|
Obligation to inform consumers and authorities about illegal products and services |
|
1. Where an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that a product or a service offered by a trader on the interface of that platform is illegal with regard to the applicable requirements of Union or national law, it shall: |
|
(a) remove the illegal product or service from its interface expeditiously and, where appropriate, inform the relevant authorities, such as the market surveillance authority or the customs authority, of the decision taken; |
|
(b) where the online platform has the contact details of the recipients of the service, inform those recipients that acquired the product or service concerned about the illegality, the identity of the trader and options for seeking redress; |
|
(c) compile and make publicly available through application programming interfaces a repository containing information about illegal products and services removed from its platform in the past twelve months along with information about the concerned trader and options for seeking redress. |
|
2. Online platforms allowing consumers to conclude distance contracts with traders shall maintain an internal database of illegal products and services removed and/or recipients suspended pursuant to Article 20. |
Amendment 278
Proposal for a regulation
Article 23 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed; |
Amendment 279
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
Amendment 280
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions. |
Amendment 281
Proposal for a regulation
Article 23 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2). |
2. Online platforms shall publish, at least once every twelve months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2). |
Amendment 282
Proposal for a regulation
Article 23 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in connection with the exercise of their supervisory powers. |
Amendment 283
Proposal for a regulation
Article 23 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. |
4. The Commission shall adopt implementing acts to establish a set of key performance indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. |
Amendment 284
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
1. Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise, and unambiguous manner and in real time: |
Amendment 285
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking; |
Amendment 286
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the natural or legal person who finances the advertisement where this person is different from the natural or legal person referred to in point (b); |
Amendment 287
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) clear, meaningful, and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed, and where applicable about how to change those parameters. |
Amendment 288
Proposal for a regulation
Article 24 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Online platforms shall ensure that recipients of the service can easily make an informed choice when giving their consent to the processing of their personal data in accordance with Regulation (EU) 2016/679 for the purposes of targeted advertising, by providing them with meaningful information, including information about how their data will be monetised. Online platforms shall ensure that refusing consent is no more difficult or time-consuming for the recipient than giving consent. |
Amendment 289
Proposal for a regulation
Article 24 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. The personal data referred to in paragraph 2 shall not be used for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising of minors. |
Amendment 290
Proposal for a regulation
Article 24 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24a |
|
Recommender system transparency |
|
1. Online platforms shall set out in their terms and conditions, and via a designated online resource that can be directly reached and easily found from the online platform’s online interface when content is recommended, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipient of the service to modify or influence those main parameters that they have made available. |
|
2. The main parameters referred to in paragraph 1 shall include, at a minimum: |
|
(a) the main criteria used by the relevant system which individually or collectively are most significant in determining recommendations; |
|
(b) the relative importance of those parameters; |
|
(c) what objectives the relevant system has been optimised for; and |
|
(d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs. |
|
The requirements set out in paragraph 2 shall be without prejudice to rules on protection of trade secrets and intellectual property rights. |
|
3. Where several options are available pursuant to paragraph 1, online platforms shall provide a clear and easily accessible function on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
Amendment 291
Proposal for a regulation
Article 24 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24b |
|
Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content |
|
Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: |
|
(a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration; |
|
(b) professional human content moderation by moderators trained to identify image-based sexual abuse, including content having a high probability of being illegal; |
|
(c) the accessibility of a qualified notification procedure whereby, in addition to the mechanism referred to in Article 14, individuals may notify the platform with the claim that image material depicting them or purporting to depict them is being disseminated without their consent, and may supply the platform with prima facie evidence of their physical identity; content notified through this procedure is to be suspended without undue delay. |
Amendment 292
Proposal for a regulation
Article 25 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. |
1. This Section shall apply to online platforms which: |
|
(a) provide their services, for at least four consecutive months, to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. Such a methodology shall take into account, in particular: |
|
(i) that the number of active recipients is calculated for each service individually; |
|
(ii) that active recipients connected on multiple devices are counted only once; |
|
(iii) that indirect use of the service, via a third party or linking, is not counted; |
|
(iv) that, where an online platform is hosted by another provider of intermediary services, the active recipients are assigned solely to the online platform closest to the recipient; |
|
(v) that automated interactions, accounts or data scans by non-humans (“bots”) are not included. |
Amendment 293
Proposal for a regulation
Article 25 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features. |
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1(a). The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features. |
Amendment 294
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platforms shall effectively and diligently identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, and in any event before launching new services, the probability and severity of any significant systemic risks stemming from the design, algorithmic systems, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall take into account risks per Member State in which services are offered and in the Union as a whole, in particular risks specific to a given language or region. This risk assessment shall be specific to their services and activities, including technology design and business-model choices, and shall include the following systemic risks: |
Amendment 295
Proposal for a regulation
Article 26 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the dissemination of illegal content through their services; |
(a) the dissemination of illegal content through their services or content that is in breach of their terms and conditions; |
Amendment 296
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any actual and foreseeable negative effects for the exercise of the fundamental rights, including for consumer protection, to respect for human dignity, private and family life, the protection of personal data and the freedom of expression and information, as well as to the freedom and the pluralism of the media, the prohibition of discrimination, the right to gender equality, and the rights of the child, as enshrined in Articles 1, 7, 8, 11, 21, 23, 24 and 38 of the Charter respectively; |
Amendment 297
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) any malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service or risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach of their terms and conditions or any other content with an actual or foreseeable negative effect on the protection of minors and of other vulnerable groups of recipients of the service, on democratic values, media freedom, freedom of expression and civic discourse, or actual or foreseeable effects related to electoral processes and public security; |
Amendment 298
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) any actual and foreseeable negative effects on the protection of public health, as well as behavioural addictions or other serious negative consequences for the person's physical, mental, social and financial well-being. |
Amendment 299
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, whether and how their content moderation systems, terms and conditions, community standards, algorithmic systems, recommender systems and systems for selecting and displaying advertisement, as well as the underlying data collection, processing and profiling, influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
Amendment 300
Proposal for a regulation
Article 26 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. When conducting risk assessments, very large online platforms shall consult, where appropriate, representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess. |
Amendment 301
Proposal for a regulation
Article 26 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. The supporting documents of the risk assessment shall be communicated to the Digital Services Coordinator of establishment and to the Commission. |
Amendment 302
Proposal for a regulation
Article 26 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. The obligations referred to in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation. |
Amendment 303
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platforms shall put in place reasonable, transparent, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
Amendment 304
Proposal for a regulation
Article 27 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions; |
(a) adapting content moderation, algorithmic systems, or recommender systems and online interfaces, their decision-making processes, the design, the features or functioning of their services, their advertising model or their terms and conditions; |
Amendment 305
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) ensuring appropriate resources to deal with notices and internal complaints, including appropriate technical and operational measures or capacities; |
Amendment 306
Proposal for a regulation
Article 27 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide; |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide, or the alternative placement and display of public service advertisements or other related factual information; |
Amendment 307
Proposal for a regulation
Article 27 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) where relevant, targeted measures aimed at adapting online interfaces and features to protect minors; |
Amendment 308
Proposal for a regulation
Article 27 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk; |
(c) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities, in particular as regards detection of systemic risk; |
Amendment 309
Proposal for a regulation
Article 27 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Very large online platforms shall, where appropriate, design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations. Where no such involvement is foreseen, this shall be made clear in the transparency report referred to in Article 33. |
Amendment 310
Proposal for a regulation
Article 27 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Very large online platforms shall provide a detailed list of the risk mitigation measures taken and their justification to the independent auditors in order to prepare the audit report referred to in Article 28. |
Amendment 311
Proposal for a regulation
Article 27 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. The Commission shall evaluate the implementation and effectiveness of the mitigation measures undertaken by very large online platforms referred to in Article 27(1) and, where necessary, may issue recommendations. |
Amendment 312
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following: |
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following: |
Amendment 313
Proposal for a regulation
Article 27 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 30, 31 and 33; |
Amendment 314
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The reports shall be presented per Member State in which the systemic risks occurred and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. |
Amendment 315
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, and following public consultation, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. |
Amendment 316
Proposal for a regulation
Article 27 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The requirement to put in place mitigation measures shall not lead to a general monitoring obligation or active fact-finding obligations. |
Amendment 317
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: |
Amendment 318
Proposal for a regulation
Article 28 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Very large online platforms shall ensure auditors have access to all relevant data necessary to perform the audit properly. |
Amendment 319
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which: |
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been recognised and vetted by the Commission and which: |
Amendment 320
Proposal for a regulation
Article 28 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) are independent from the very large online platform concerned; |
(a) are legally and financially independent from, and do not have conflicts of interest with, the very large online platform concerned and other very large online platforms; |
Amendment 321
Proposal for a regulation
Article 28 – paragraph 2 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) whose auditors and employees have not provided any other service to the very large online platform audited in the 12 months preceding the audit, and who commit not to work for the very large online platform audited, or for a professional organisation or business association of which the platform is a member, for 12 months after their position in the auditing organisation has ended; |
Amendment 322
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. The organisations that perform the audits shall establish an audit report for each audit. The report shall be in writing and include at least the following: |
3. The organisations that perform the audits shall establish an audit report for each audit subject as referred to in paragraph 1. The report shall be in writing and include at least the following: |
Amendment 323
Proposal for a regulation
Article 28 – paragraph 3 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) a declaration of interests; |
Amendment 324
Proposal for a regulation
Article 28 – paragraph 3 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) a description of the main findings drawn from the audit; |
(d) a description of the main findings drawn from the audit and a summary of the main findings; |
Amendment 325
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) a description of the third parties consulted as part of the audit; |
Amendment 326
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited; |
Amendment 327
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion. |
Amendment 328
Proposal for a regulation
Article 28 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The Commission shall publish and regularly update a list of vetted organisations. |
Amendment 329
Proposal for a regulation
Article 28 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. Where a very large online platform receives a positive audit report, it shall be entitled to request from the Commission a seal of excellence. |
Amendment 330
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. In addition to the requirements set out in Article 24a, very large online platforms that use recommender systems shall provide at least one recommender system which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679, as well as an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
Amendment 331
Proposal for a regulation
Article 29 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
deleted |
Amendment 332
Proposal for a regulation
Article 30 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. |
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, efficient and reliable tools and through application programming interfaces, a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and for all data points present in the advertisement, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed, and shall make reasonable efforts to ensure that the information is accurate and complete. |
Amendment 333
Proposal for a regulation
Article 30 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the content of the advertisement; |
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement; |
Amendment 334
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the natural or legal person who paid for the advertisement, where that person is different from the one referred to in point (b); |
Amendment 335
Proposal for a regulation
Article 30 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; |
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and, if so, the main parameters used for that purpose, including any parameters used to exclude particular groups; |
Amendment 336
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) where it is disclosed, a copy of the content of commercial communications published on the very large online platform that are not marketed, sold or arranged by the very large online platform and which have, through appropriate channels, been declared as such to the very large online platform; |
Amendment 337
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) cases where the advertisement was removed on the basis of a notice submitted in accordance with Article 14 or an order issued pursuant to Article 8. |
Amendment 338
Proposal for a regulation
Article 30 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Board shall, after consulting vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1. |
Amendment 339
Proposal for a regulation
Article 30 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 30a |
|
Deep fakes |
|
Where a very large online platform becomes aware that a piece of content is a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful (deep fakes), the provider shall label the content in a way that informs the recipient of the service that the content is inauthentic and that is clearly visible to the recipient of the service. |
Amendment 340
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request, within a reasonable period specified in the request and without undue delay, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access and use that data for those purposes. |
Amendment 341
Proposal for a regulation
Article 31 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The very large online platform shall be obliged to explain the design, logic and functioning of its algorithms if so requested by the Digital Services Coordinator of establishment. |
Amendment 342
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers and vetted not-for-profit bodies, organisations or associations who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, mitigation and understanding of systemic risks as set out in Article 26(1) and Article 27(1). |
Amendment 343
Proposal for a regulation
Article 31 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Vetted researchers, vetted not-for-profit bodies, organisations and associations shall have access to aggregate numbers for the total views and view rate of content prior to its removal on the basis of orders issued in accordance with Article 8 or of content moderation engaged in at the provider’s own initiative and under its terms and conditions. |
Amendment 344
Proposal for a regulation
Article 31 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria. |
Amendment 345
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted by the Digital Services Coordinator of establishment or the Commission, researchers, not-for-profit bodies, organisations or associations shall: |
|
(a) be affiliated with academic institutions or civil society organisations representing the public interest and meeting the requirements under Article 68; |
|
(b) be independent from commercial interests, including from any very large online platform; |
|
(c) disclose the sources of funding of the research; |
|
(d) be independent from any government, administrative or other state body, other than the academic institution of affiliation where that institution is public; |
|
(e) have proven records of expertise in the fields related to the risks investigated or related research methodologies; and |
|
(f) preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 346
Proposal for a regulation
Article 31 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Where a very large online platform has grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the relevant authority, either the Digital Services Coordinator of establishment or the Commission, which shall decide without undue delay whether access is to be withdrawn and when and under what conditions access is to be restored. |
Amendment 347
Proposal for a regulation
Article 31 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. Where the Digital Services Coordinator of establishment or the Commission has grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the very large online platform. The very large online platform shall be entitled to withdraw access to data upon receiving that information. The Digital Services Coordinator of establishment or the Commission shall decide whether and when access is to be restored and under what conditions. |
Amendment 348
Proposal for a regulation
Article 31 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
5. The Commission shall, after consulting the Board, and no later than one year after the entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers or not-for-profit bodies, organisations or associations can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, and maintaining the security of their service. |
Amendment 349
Proposal for a regulation
Article 31 – paragraph 6 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. |
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information. |
Amendment 350
Proposal for a regulation
Article 31 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Digital Services Coordinators and the Commission shall, once a year, report the following information: |
|
(a) the number of requests made to them as referred to in paragraphs 1, 2 and 6; |
|
(b) the number of such requests that have been declined or withdrawn by the Digital Services Coordinator or the Commission and the reasons for which they have been declined or withdrawn, including following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1, 2 and 6. |
Amendment 351
Proposal for a regulation
Article 31 – paragraph 7 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
7b. Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing confidential data and in compliance with Regulation (EU) 2016/679. |
Amendment 352
Proposal for a regulation
Article 32 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. |
2. Very large online platforms shall only designate persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3 as compliance officers. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. |
Amendment 353
Proposal for a regulation
Article 32 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation; |
(a) cooperating with the Digital Services Coordinator of establishment, the Board and the Commission for the purpose of this Regulation; |
Amendment 354
Proposal for a regulation
Article 33 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months in a standardised, machine-readable and easily accessible format. |
Amendment 355
Proposal for a regulation
Article 33 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Such reports shall include content moderation information separated and presented for each Member State in which the services are offered and for the Union as a whole. The reports shall be published in at least one of the official languages of the Member States of the Union in which services are offered. |
Amendment 356
Proposal for a regulation
Article 33 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the related risk mitigation measures identified and implemented pursuant to Article 27; |
(b) the specific mitigation measures identified and implemented pursuant to Article 27; |
Amendment 357
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) where appropriate, information about the representatives of the recipients of the service, independent experts and civil society organisations, consulted for the risk assessment in accordance with Article 26. |
Amendment 358
Proposal for a regulation
Article 33 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports. |
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports, in compliance with Regulation (EU) 2016/679. |
Amendment 359
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following: |
1. The Commission shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, in accordance with Regulation (EU) No 1025/2012, at least for the following: |
Amendment 360
Proposal for a regulation
Article 34 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) terms and conditions under Article 12, including as regards acceptance of and changes to those terms and conditions; |
Amendment 361
Proposal for a regulation
Article 34 – paragraph 1 – point a b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ab) information on traceability of traders under Article 22; |
Amendment 362
Proposal for a regulation
Article 34 – paragraph 1 – point a c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ac) advertising practices under Article 24 and recommender systems under Article 24a; |
Amendment 363
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) transparency reporting obligations pursuant to Article 13; |
Amendment 364
Proposal for a regulation
Article 34 – paragraph 1 – point f b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fb) technical specifications to ensure that intermediary services shall be made accessible for persons with disabilities in accordance with the accessibility requirements of Directive 2019/882. |
Amendment 365
Proposal for a regulation
Article 34 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The Commission shall support and promote the development and implementation of voluntary standards set by the relevant European and international standardisation bodies aimed at the protection of minors. |
Amendment 366
Proposal for a regulation
Article 34 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall be empowered to adopt implementing acts laying down common specifications for the items listed in points (a) to (fb) of paragraph 1 where the Commission has requested one or more European standardisation organisations to draft a harmonised standard and there has not been a publication of the reference to that standard in the Official Journal of the European Union within [24 months after the entry into force of this Regulation] or the request has not been accepted by any of the European standardisation organisations. |
Amendment 367
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law. Particular attention shall be given to avoiding negative effects on fair competition, data access and security, the general monitoring prohibition and the protection of privacy and personal data. The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct to ensure that they are fit for purpose. |
Amendment 368
Proposal for a regulation
Article 35 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
Amendment 369
Proposal for a regulation
Article 35 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their specific objectives, define the nature of the public policy objective pursued and, where appropriate, the role of competent authorities, contain key performance indicators to measure the achievement of those objectives and take full account of the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants. |
Amendment 370
Proposal for a regulation
Article 35 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions and request that the organisations involved amend their codes of conduct accordingly. |
Amendment 371
Proposal for a regulation
Article 35 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. |
5. The Commission and the Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In the case of systematic failure to comply with the codes of conduct, the Commission and the Board may take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the codes of conduct. |
Amendment 372
Proposal for a regulation
Article 36 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between, online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30. |
1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency for all actors in the online advertising eco-system, beyond the requirements of Articles 24 and 30. |
Amendment 373
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
Amendment 374
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the different types of data that can be used. |
Amendment 375
Proposal for a regulation
Article 36 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. |
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes three years after the application of this Regulation. |
Amendment 376
Proposal for a regulation
Article 36 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The Commission shall encourage all the actors in the online advertising eco-system referred to in paragraph 1 to endorse and comply with the commitments stated in the codes of conduct. |
Amendment 377
Proposal for a regulation
Article 37 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
Amendment 378
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing accessible descriptions of those protocols. |
Amendment 379
Proposal for a regulation
Article 37 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it may request the participants to revise the crisis protocol, including by taking additional measures. |
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to revise the crisis protocol, including by taking additional measures. |
Amendment 380
Proposal for a regulation
Article 38 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Member States shall ensure that the competent authorities referred to in paragraph 1, and in particular their Digital Services Coordinators, have adequate technical, financial and human resources to carry out their tasks under this Regulation. |
Amendment 381
Proposal for a regulation
Article 39 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. |
Amendment 382
Proposal for a regulation
Article 40 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation. |
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of the supervision and enforcement by the national competent authorities, in accordance with this Chapter, of the obligations imposed on intermediaries under this Regulation. |
Amendment 383
Proposal for a regulation
Article 40 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of Chapters III and IV, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established. |
2. A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of this Article, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established. |
Amendment 384
Proposal for a regulation
Article 40 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected. |
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of this Article. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected. |
Amendment 385
Proposal for a regulation
Article 41 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period; |
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information without undue delay, or at the latest within three months; |
Amendment 386
Proposal for a regulation
Article 41 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the power to adopt interim measures to avoid the risk of serious harm. |
(e) the power to adopt proportionate interim measures, or to request the relevant judicial authority to do so, to avoid the risk of serious harm. |
Amendment 387
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those others persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities. |
As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those other persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities. |
Amendment 388
Proposal for a regulation
Article 41 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: |
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists or is continuously repeated and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: |
Amendment 389
Proposal for a regulation
Article 41 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) require the management body of the providers, within a reasonable time period, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken; |
(a) require the management body of the providers, within a reasonable time period, which shall in any case not exceed three months, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken; |
Amendment 390
Proposal for a regulation
Article 41 – paragraph 3 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place. |
(b) where the Digital Services Coordinator considers that the provider has not complied with the requirements of the first indent, that the infringement persists or is continuously repeated and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place. |
Amendment 391
Proposal for a regulation
Article 41 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. The Commission shall publish guidelines by [six months after the entry into force of this Regulation] on the powers of and procedures applicable to the Digital Services Coordinators. |
Amendment 392
Proposal for a regulation
Article 42 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them. |
2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission and the Board of those rules and of those measures and shall notify them, without delay, of any subsequent amendments affecting them. |
Amendment 393
Proposal for a regulation
Article 42 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover of the provider concerned. |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual worldwide turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual worldwide turnover of the provider concerned. |
Amendment 394
Proposal for a regulation
Article 42 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily worldwide turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
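By way of illustration only, and using a hypothetical turnover figure not drawn from the text, the two caps set out in Amendments 393 and 394 would operate as follows for a provider with an annual worldwide turnover of €10 billion:
maximum penalty under Article 42(3): 6 % × €10 000 000 000 = €600 000 000;
maximum periodic penalty payment under Article 42(4): 5 % × (€10 000 000 000 ÷ 365) ≈ €1 370 000 per day.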
Amendment 395
Proposal for a regulation
Article 42 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Member States shall ensure that administrative or judicial authorities issuing orders pursuant to Articles 8 and 9 shall only issue penalties or fines in line with this Article. |
Amendment 396
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority. |
1. Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. During these proceedings, both parties shall have the right to be heard and to receive appropriate information about the status of the proceedings. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment without undue delay. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority without undue delay. |
Amendment 397
Proposal for a regulation
Article 43 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Upon receipt of the complaint transmitted pursuant to paragraph 1, the Digital Services Coordinator of establishment shall assess the matter in a timely manner and shall, within six months, inform the Digital Services Coordinator of the Member State where the recipient resides or is established whether it intends to proceed with an investigation. If it opens an investigation, it shall provide an update at least every three months. The Digital Services Coordinator of the Member State where the recipient resides or is established shall consequently inform the recipient. |
Amendment 398
Proposal for a regulation
Article 43 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 43a |
|
Compensation |
|
Without prejudice to Article 5, recipients of the service shall have the right to seek, in accordance with relevant Union and national law, compensation from providers of intermediary services for any direct damage or loss suffered due to an infringement by those providers of the obligations established under this Regulation. |
Amendment 399
Proposal for a regulation
Article 44 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board. |
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public in a standardised and machine-readable format, and shall communicate them to the Commission and to the Board. |
Amendment 400
Proposal for a regulation
Article 44 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned; |
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned, including information on the name of the issuing authority, the name of the provider and the type of action specified in the order, as well as a justification that the order complies with Article 3 of Directive 2000/31/EC; |
Amendment 401
Proposal for a regulation
Article 44 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9. |
(b) the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9, the number of appeals made against those orders, as well as the outcome of the appeals. |
Amendment 402
Proposal for a regulation
Article 44 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall make publicly available a biennial report analysing the annual reports communicated pursuant to paragraph 1, and shall submit it to the European Parliament and to the Council. |
Amendment 403
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. |
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. |
Amendment 404
Proposal for a regulation
Article 45 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. A request or recommendation pursuant to paragraph 1 shall at least indicate: |
2. A request pursuant to paragraph 1 shall at least indicate: |
Amendment 405
Proposal for a regulation
Article 45 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. A request pursuant to paragraph 1 shall at the same time be communicated to the Commission. Where the Commission considers that the request is not justified, or where it is currently taking action on the same matter, the Commission may ask for the request to be withdrawn. |
Amendment 406
Proposal for a regulation
Article 45 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. |
3. The Digital Services Coordinator of establishment shall take into utmost account the request pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. |
Amendment 407
Proposal for a regulation
Article 45 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. |
Amendment 408
Proposal for a regulation
Article 45 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4. |
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4. |
Amendment 409
Proposal for a regulation
Article 45 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall be also transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1. |
Amendment 410
Proposal for a regulation
Article 46 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation. |
deleted |
Amendment 411
Proposal for a regulation
Article 46 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where a Digital Services Coordinator of establishment has reasons to suspect that a provider of intermediary services has infringed this Regulation in a manner involving at least one other Member State, it may propose to the Digital Services Coordinator of destination concerned to launch a joint investigation. The joint investigation shall be based on an agreement between the Member States concerned. |
Amendment 412
Proposal for a regulation
Article 46 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Upon request of the Digital Services Coordinator of destination who has reasons to suspect that a provider of intermediary services has infringed this Regulation in its Member State, the Board may recommend to the Digital Services Coordinator of establishment to launch a joint investigation with the Digital Services Coordinator of destination concerned. The joint investigation shall be based on an agreement between the Member States concerned. |
|
Where there is no agreement within one month, the joint investigation shall be under the supervision of the Digital Services Coordinator of establishment. |
|
Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation. |
Amendment 413
Proposal for a regulation
Article 47 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) coordinating and contributing to guidance and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation; |
(b) coordinating and providing guidance and analysis to the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation; |
Amendment 414
Proposal for a regulation
Article 47 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market; |
Amendment 415
Proposal for a regulation
Article 47 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) contributing to the effective cooperation with the competent authorities of third countries and with international organisations. |
Amendment 416
Proposal for a regulation
Article 48 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. |
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. The meeting shall be deemed valid where at least two thirds of its members are present. |
Amendment 417
Proposal for a regulation
Article 48 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
Amendment 418
Proposal for a regulation
Article 48 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Each Member State shall have one vote. The Commission shall not have voting rights. |
2. Each Member State shall have one vote, to be cast by the Digital Services Coordinator. The Commission shall not have voting rights. |
Amendment 419
Proposal for a regulation
Article 48 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
deleted |
Amendment 420
Proposal for a regulation
Article 48 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. |
5. The Board may invite experts and observers to attend its meetings, and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. |
Amendment 421
Proposal for a regulation
Article 48 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. The Board shall, where appropriate, consult interested parties and shall make the results of that consultation publicly available. |
Amendment 422
Proposal for a regulation
Article 48 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Board shall adopt its rules of procedure, following the consent of the Commission. |
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members, following the consent of the Commission. |
Amendment 423
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) issue specific recommendations for the implementation of Article 13a; |
Amendment 424
Proposal for a regulation
Article 49 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation; |
(d) advise the Commission to take the measures referred to in Article 51 and adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation; |
Amendment 425
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) monitor the compliance with Article 3 of Directive 2000/31/EC of measures taken by a Member State restricting the freedom to provide services of intermediary service providers from another Member State and ensure that those measures are strictly necessary and do not restrict the application of this Regulation; |
Amendment 426
Proposal for a regulation
Article 49 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct as provided for in this Regulation, as well as the identification of emerging issues, with regard to matters covered by this Regulation. |
(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in close collaboration with relevant stakeholders as provided for in this Regulation, including by issuing opinions, recommendations or advice on matters related to Article 34, as well as the identification of emerging issues, with regard to matters covered by this Regulation. |
Amendment 427
Proposal for a regulation
Article 49 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate. |
2. Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice, together with an explanation of the investigations, actions and measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate. |
Amendment 428
Proposal for a regulation
Article 49 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 49a |
|
Reports |
|
1. The Board shall draw up an annual report regarding its activities. The report shall be made public and be transmitted to the European Parliament, to the Council and to the Commission in all official languages of the Union. |
|
2. The annual report shall include, among other information, a review of the practical application of the opinions, guidelines, recommendations, advice and any other measures taken under Article 49(1). |
Amendment 429
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of the provisions of Section 4 of Chapter III, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period and no later than three months. |
Amendment 430
Proposal for a regulation
Article 50 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35. |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The action plan may recommend, where appropriate, participation in a code of conduct as provided for in Article 35. |
Amendment 431
Proposal for a regulation
Article 51 – title
|
|
Text proposed by the Commission |
Amendment |
Intervention by the Commission and opening of proceedings |
Opening of proceedings by the Commission |
Amendment 432
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
Amendment 433
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. |
2. Where the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. |
Amendment 434
Proposal for a regulation
Article 52 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by reasoned request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
Amendment 435
Proposal for a regulation
Article 52 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The request shall state its purpose, including reasoning as to why and how the information is necessary and proportionate to the objective pursued and why it cannot be obtained by other means. |
Amendment 436
Proposal for a regulation
Article 52 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading. |
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). |
Amendment 437
Proposal for a regulation
Article 55 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement. |
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures, in compliance with fundamental rights, against the very large online platform concerned on the basis of a prima facie finding of an infringement. |
Amendment 438
Proposal for a regulation
Article 56 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission may, upon request or on its own initiative, reopen the proceedings: |
2. The Commission shall reopen the proceedings: |
Amendment 439
Proposal for a regulation
Article 58 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) interim measures ordered pursuant to Article 55; |
(b) interim measures ordered pursuant to Article 55; or |
Amendment 440
Proposal for a regulation
Article 58 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision. |
3. In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within one month and to provide information on the measures that that platform intends to take to comply with the decision. |
Amendment 441
Proposal for a regulation
Article 58 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. |
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. The decision shall apply with immediate effect. |
Amendment 442
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently: |
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total worldwide turnover in the preceding financial year where it finds that the platform, intentionally or negligently: |
Amendment 443
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: |
2. The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total worldwide turnover in the preceding financial year, where they intentionally or negligently: |
Amendment 444
Proposal for a regulation
Article 59 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings. |
4. In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement, any fines issued under Article 42 for the same infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings. |
Amendment 445
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily worldwide turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
Amendment 446
Proposal for a regulation
Article 64 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed. |
1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed, along with, where possible and justified, non-confidential documents or other forms of information on which the decision is based. |
Amendment 447
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures it intends to request and identifying the intended addressee or addressees thereof. |
Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than 14 working days, describing the measures it intends to request and identifying the intended addressee or addressees thereof. |
Amendment 448
Proposal for a regulation
Article 66 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the development and implementation of standards provided for in Article 34. |
Amendment 449
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52 , recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 17, 18 and 19 on their behalf, provided the body, organisation or association meets all of the following conditions: |
Without prejudice to Directive (EU) 2020/1828 of the European Parliament and of the Council52, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 8, 12, 13, 14, 15, 17, 18, 19, 43 and 43a on their behalf, provided the body, organisation or association meets all of the following conditions: |
__________________ |
__________________ |
52 [Reference] |
52 [Reference] |
Amendment 450
Proposal for a regulation
Article 69 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation]. |
2. The delegation of power referred to in Articles 13a, 16, 23, 25, and 31 shall be conferred on the Commission for five years starting from [date of expected adoption of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period. |
Amendment 451
Proposal for a regulation
Article 69 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force. |
3. The delegation of power referred to in Articles 13a, 16, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force. |
Amendment 452
Proposal for a regulation
Article 69 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council. |
5. A delegated act adopted pursuant to Articles 13a, 16, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council. |
Amendment 453
Proposal for a regulation
Article 70 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall be assisted by the Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011. |
1. The Commission shall be assisted by a Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011. |
Amendment 454
Proposal for a regulation
Article 73 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. |
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. This report shall address in particular: |
|
(a) the application of Article 25, including with respect to the number of average monthly active recipients of the service; |
|
(b) the application of Article 11; |
|
(c) the application of Article 14; |
|
(d) the application of Articles 35 and 36. |
Amendment 455
Proposal for a regulation
Article 73 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where appropriate, the report referred to in paragraph 1 shall be accompanied by a proposal for amendment of this Regulation. |
Amendment 456
Proposal for a regulation
Article 73 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources. |
3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources, and pay specific attention to small and medium-sized enterprises and the position of new competitors. |
Amendment 457
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. It shall apply from [date - three months after its entry into force]. |
2. It shall apply from [date - six months after its entry into force]. |
EXPLANATORY STATEMENT
Introduction
The Rapporteur welcomes the Commission’s proposal on a Digital Services Act. Digital services are an important backbone of our economy, bringing new opportunities for consumers and businesses alike, who use the various digital services on a daily basis.
At the same time, digital services have created serious challenges and risks. The nature, scale and importance of digital services for the economy and society have changed dramatically since the current legislation was put in place. An updated regulatory framework on digital services, establishing clear responsibilities, is necessary to address these challenges and to ensure a level playing field in the digital Single Market and a safer digital space for users.
The Rapporteur acknowledges the horizontal nature of this Regulation, but at the same time considers that a one-size-fits-all approach fails to tackle the problems with illegal products and services sold through online marketplaces. The Rapporteur is of the opinion that stricter rules on online marketplaces must be introduced in order to create a level playing field and ensure the principle that “what is illegal offline should also be illegal online”.
The Rapporteur welcomes the Commission’s aim to increase the transparency of online advertising and recommender systems, but is of the view that the Commission’s proposal lacks concrete obligations to ensure accountability and to prevent the amplification of illegal content. The Rapporteur thus sees a need to propose further transparency measures and requirements in order to ensure user protection by design and by default.
Lastly, the Rapporteur welcomes the focus on the implementation and enforcement provisions and believes that, given the cross-border nature of digital services, the hybrid enforcement model suggested by the Commission could ensure effective and efficient enforcement of this Regulation. However, the Rapporteur finds it necessary to strengthen some provisions to ensure that no Member State becomes a safe haven for online platforms.
Consumer protection and online marketplaces
Although the Rapporteur acknowledges the horizontal approach of the DSA, more specific actions must be required from online marketplaces to ensure that consumers can purchase safe products and services online. The Rapporteur welcomes certain aspects of the Commission proposal, namely the traceability of traders, the specific condition to the liability exemption targeting online marketplaces and the fact that notices will, under certain conditions, be considered to give actual knowledge and thereby make online platforms liable if they do not remove the content.
However, in order to tackle the issue of illegal products and thus ensure that the principle of “what is illegal offline should also be illegal online” is more than just words, the Rapporteur is of the opinion that further conditions to the exemption of liability and further obligations must be introduced to ensure consumer protection and a level playing field for European businesses within the digital Single Market.
The Rapporteur proposes a new Article laying down stricter conditions for the exemptions of liability specifically targeting online marketplaces. These conditions include, amongst other things, requirements to comply with certain due diligence obligations and conditions that ensure that, where a trader from a third country does not have an economic operator liable for the product safety, the marketplace will not benefit from the exemption of liability. This is done to ensure liability for any product sold to European consumers, including through e-commerce. In addition, consumers will be able to seek redress from the online platform for any damages the products or services have caused.
Lastly, the Rapporteur proposes to strengthen the obligation on the traceability of traders, both by introducing a new article extending the scope of certain provisions presented in Article 22 to all intermediary services and by introducing new provisions targeting online marketplaces. These provisions include obligations to prevent dangerous and/or non-compliant products from being offered online and obligations to cooperate with national authorities, where necessary, regarding dangerous products already sold.
Removal of illegal content
The Rapporteur is of the opinion that illegal content should be removed from intermediary services as fast as possible, while taking into account fundamental rights. The Rapporteur believes that the DSA should establish a framework for notice and takedown with clearly defined procedures, safeguards and timelines for acting on notifications of illegal content, and ensure uniform procedures in all Member States. While it is necessary to grant digital platforms time to assess the legality of content, some content has a very high impact and may pose a greater threat to society or cause significant damage to the individual. It is thus reasonable to have two sets of timelines, with shorter timeframes for such high-impact content. To ensure consistency with existing legislation, the Rapporteur specifies that these deadlines are without prejudice to deadlines set in sectorial legislation or legal orders.
In addition, the Rapporteur welcomes the obligation introduced in Article 20 on measures and protection against misuse. However, when a user frequently provides illegal content through an interface, e.g. offers products that do not comply with EU law, the platform should suspend the user for a reasonable period of time. This should not be limited to manifestly illegal content only.
Users’ rights
The Rapporteur also welcomes the Commission’s proposal for an internal complaint-handling system and the out-of-court dispute settlement body. However, in order to ensure an efficient procedure, the Rapporteur has proposed to include timeframes. In addition, the internal complaint-handling system should be available not only to those whose content has been removed, but also to those whose notification has been rejected.
The Rapporteur believes that not only national authorities and the Commission should have access to direct and efficient means of communications with intermediary services, but also the recipients of services. The Rapporteur proposes a new Article that allows recipients of services to choose between means of communication with the intermediary services.
Lastly, the Rapporteur is of the opinion that the additional obligations imposed on online platforms under Chapter III, Section 3, of this Regulation should also be applicable to micro and small enterprises, with the exception of Article 23. Consumer protection law does not differentiate between small and big enterprises, and the obligations should therefore not be limited to larger platforms.
Online advertising
The Rapporteur firmly believes that the pervasive collection and use of users’ data to provide targeted, micro-targeted and behavioural advertising has spiralled out of control. The Rapporteur welcomes the new transparency obligations on this issue, but believes that transparency alone cannot solve the problems related to targeted online advertising.
The Rapporteur proposes a new article aiming to allow consumers to navigate online platforms without being subject to targeted advertising. The Rapporteur therefore proposes that targeted advertising be turned off by default and that consumers can easily opt out. The Rapporteur also suggests that, when online intermediaries process data for targeted advertising, they should not carry out activities that can lead to pervasive tracking.
Furthermore, the Rapporteur proposes to extend the scope of the Article on online advertising transparency to all intermediary services and suggests new transparency provisions. The Rapporteur suggests that intermediary services should specify, among other things, the person who finances the advertisement and where the advertisement has been displayed. Moreover, intermediary services should give NGOs, researchers and public authorities, upon request, access to information on direct and indirect payments or any other remuneration received.
Lastly, in order to improve consumers’ awareness of commercial content, the Rapporteur suggests introducing prominent and harmonised marking of advertisements. Today, it is up to the individual trader to decide how to disclose an advertisement, as long as the disclosure is judged to be sufficiently clear to an average consumer of the expected target group. This freedom results in a variety of different markings, which makes it difficult for consumers to recognise an advertisement. A prominent and harmonised marking of advertisements is therefore needed.
Recommender systems and algorithmic accountability
The Rapporteur welcomes the Commission’s recognition that recommender systems can have a significant impact on users’ ability to choose information, and the fact that the Commission has dedicated an article to addressing related issues. However, the Rapporteur sees the need to further strengthen the empowerment of consumers with regard to recommender systems.
The Rapporteur suggests extending the scope of the Article to all online platforms, as recommender systems used on platforms with fewer than 45 million active users also have a significant impact on users. Furthermore, the Rapporteur proposes that recommender systems should, by default, not be based on profiling, and that consumers subject to recommender systems using profiling should be able to view and delete any profiles used to curate the content they see. In addition, the Rapporteur believes that the algorithms used in recommender systems should be designed in a way that prevents dark patterns and rabbit-hole effects. Moreover, the Rapporteur suggests a “must-carry” obligation to ensure that information of public interest is ranked highly in the platforms’ algorithms.
Lastly, the Rapporteur finds that greater accountability for algorithms should be introduced in the proposal. The Rapporteur suggests that the Commission should be able to assess the algorithms used by very large online platforms and determine whether they comply with a number of requirements, and that the Commission should be able to impose sanctions in the event of infringement of certain requirements.
Implementation and enforcement
The Rapporteur welcomes the enforcement model proposed by the Commission. However, some changes have been made in order to strengthen the model. Taking inspiration from Regulation (EU) 2017/2394, the Rapporteur proposes that the Digital Services Coordinator and the Commission should have the possibility to restrict access to the interface of an intermediary service if the provider repeatedly infringes the obligations set out in this Regulation. Furthermore, the Commission should not merely have the possibility to act, but should be obliged to act where it has reason to believe that a very large online platform is infringing this Regulation.
OPINION OF THE COMMITTEE ON INDUSTRY, RESEARCH AND ENERGY (29.9.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Henna Virkkunen
(*) Associated committee – Rule 57 of the Rules of Procedure
SHORT JUSTIFICATION
In its proposal for the Digital Services Act, the European Commission has set out a number of ways to improve the protection of fundamental human rights online and to create a stronger obligation of transparency and accountability for online platforms.
The aim of this new Regulation should be to strengthen democracy, increase fair competition and accelerate innovation. The digital world must adhere to the same European values as the rest of our societies: democracy, freedom of speech and human rights. What is illegal offline should also be illegal online. It is also vitally important for European businesses, especially SMEs, that companies based in third countries but operating in the internal market follow the same rules as European companies.
In this draft opinion, I have limited myself to the parts of the Regulation that fall under ITRE competence. This has been a conscious choice, which I encourage all Rapporteurs and other colleagues to follow when they analyse and propose changes to the Commission proposal. As the industry, research, energy, ICT and SME committee of the Parliament, we have a keen interest in, and a clear competence for, many parts of the Regulation, but we should also recognise the important role played by other associated committees and the lead committee.
The proposal includes obligations to remove illegal content from platforms, traceability of business users, ways to challenge moderation decisions, and researchers' access to data. In many parts of the Regulation, we need to strike the right balance between different legitimate interests and arguments. After carefully analysing the Commission proposal, I have found that in many of these cases, the choice made by the Commission in their proposal has been a justified and well-reasoned one. For these Articles, I am not proposing changes, even if I consider many of them to fall under ITRE competence.
The focus of my draft opinion is on the administrative burden and the requirements we are setting, not only for big companies, but especially for small ones. In the draft opinion, I have identified several requirements that, due to their nature, level of detail or expected compliance cost, should not apply to micro and small enterprises. This, I believe, is also well in line with the input I have received from the different political groups before the publication of this draft opinion.
In addition to the focus on micro and small enterprises, I have introduced certain clarifications and changes of a more technical nature. We should ensure that the standards we set in this Regulation are clear and provide businesses and consumers with the required legal certainty. We should also ensure that the mechanisms we are introducing in this legislation are efficient and fulfil the role they have been assigned.
AMENDMENTS
The Committee on Industry, Research and Energy calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, without lock-in effects. |
Amendment 2
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the right to privacy, the right to protection of personal data, freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
Amendment 3
Proposal for a regulation
Recital 4
|
|
Text proposed by the Commission |
Amendment |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, clear, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
Amendment 4
Proposal for a regulation
Recital 5
|
|
Text proposed by the Commission |
Amendment |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their responsibility for upholding fundamental rights. |
__________________ |
__________________ |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
Amendment 5
Proposal for a regulation
Recital 8
|
|
Text proposed by the Commission |
Amendment |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of its targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
__________________ |
__________________ |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
Amendment 6
Proposal for a regulation
Recital 9
|
|
Text proposed by the Commission |
Amendment |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
(9) This Regulation should not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended 28 , and Regulation (EU) …/.. of the European Parliament and of the Council 29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. In the event of a conflict between lex specialis legislation, including national implementing measures, and this Regulation, the lex specialis provisions will prevail. |
__________________ |
__________________ |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
Amendment 7
Proposal for a regulation
Recital 11
|
|
Text proposed by the Commission |
Amendment |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected. |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, and to any provisions of national law on copyright and related rights adopted in compliance with Union law so as to ensure the highest level of protection of those rights, which establish specific rules and procedures that should remain unaffected. |
Amendment 8
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined as covering information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant, dangerous or counterfeit products, illegally-traded animals, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. The Commission should provide guidance on how to identify illegal content. |
Amendment 9
Proposal for a regulation
Recital 13
|
|
Text proposed by the Commission |
Amendment |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, livestreaming platforms or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request, or otherwise play an active role in the dissemination of user-generated content. Search engines and equivalent services may also be considered to be online platforms, if these services meet the definition of online platform set out in this Regulation. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms, for the entirety or for part of their service, where the dissemination to the public is merely a minor and purely ancillary feature of the principal service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Similarly, link-sharing options or similar features of cloud-based solutions for storing user-generated content could constitute such a feature, where the possibility of disseminating content to the public is clearly an ancillary feature to the principal service of storing information and content. |
Amendment 10
Proposal for a regulation
Recital 14
|
|
Text proposed by the Commission |
Amendment |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a limited number of pre-determined persons, taking into account the potential for groups to become tools for wide dissemination of content to the public. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of the definition of online platform in this Regulation. To the extent that they qualify as ‘mere conduit’, ‘caching’ or ‘hosting’ services, those services should be able to benefit from liability exemptions provided for by this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
__________________ |
__________________ |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36. |
Amendment 11
Proposal for a regulation
Recital 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(15a) Applying effective end-to-end encryption to data is essential for trust in, and security on, the internet, as it effectively prevents unauthorised third party access and helps to ensure the confidentiality of communications. |
Amendment 12
Proposal for a regulation
Recital 18
|
|
Text proposed by the Commission |
Amendment |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the provider of the intermediary service promotes or references such content. |
Amendment 13
Proposal for a regulation
Recital 20
|
|
Text proposed by the Commission |
Amendment |
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
(20) Where the main purpose of the information society service is to engage in or facilitate illegal activities or where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities, the service should be deemed not to have been provided neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
Amendment 14
Proposal for a regulation
Recital 21
|
|
Text proposed by the Commission |
Amendment |
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. |
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved in the content of the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. |
Amendment 15
Proposal for a regulation
Recital 22
|
|
Text proposed by the Commission |
Amendment |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to assess the grounds for and, when necessary, proceed to removing or disabling access to all copies of that content. The removal or disabling of access should be undertaken in the observance of the principles enshrined in the Charter of Fundamental Rights, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its periodic own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the illegal content. |
Amendment 16
Proposal for a regulation
Recital 23
|
|
Text proposed by the Commission |
Amendment |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online marketplaces, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average consumer. |
Amendment 17
Proposal for a regulation
Recital 24
|
|
Text proposed by the Commission |
Amendment |
(24) The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, even where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it. |
(24) The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it. As a general rule, injunctions addressed to intermediary services should be considered as a last resort, where any other reasonable and proportionate action closer to the content owner is not available. |
Amendment 18
Proposal for a regulation
Recital 25
|
|
Text proposed by the Commission |
Amendment |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
(25) In order to create legal certainty and not to discourage automated or non-automated activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in a diligent manner for the purpose of detecting, identifying and acting against illegal content. Such activities should be accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of the exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
Amendment 19
Proposal for a regulation
Recital 26
|
|
Text proposed by the Commission |
Amendment |
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. |
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Only where that intermediary has not responded to the request should requests or orders be addressed, as a last resort, to intermediaries lower in the internet stack, for removing or blocking access to content, including all the necessary information for localising as precisely as possible the illegal content. |
Amendment 20
Proposal for a regulation
Recital 27
|
|
Text proposed by the Commission |
Amendment |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
(27) New technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, content delivery networks or providers of services deeper in the internet stack, such as IT infrastructure services (on-premise, cloud-based and hybrid hosting solutions), that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. Services deeper in the internet stack acting as online intermediaries could be required to take proportionate action where the customer fails to remove the illegal content, unless this is technically impracticable. |
Amendment 21
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, nor should they be required to use automated tools for content moderation. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Such orders should not consist in requiring a service provider to introduce, exclusively at its own expense, a screening system which entails general and permanent monitoring in order to prevent any future infringement. However, such orders could require a host provider to remove information which it stores, the content of which is identical or equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information, provided that the monitoring of and search for the information concerned is limited to information properly identified in the injunction, such as the name of the person concerned by the infringement determined previously, the circumstances in which that infringement was determined and equivalent content to that which was declared to be illegal, and does not require the host provider to carry out an independent assessment of that content. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation. The Regulation should not be considered to be impeding the ability of providers to undertake proactive measures to identify and remove illegal content and to prevent its reappearance. |
Amendment 22
Proposal for a regulation
Recital 30
|
|
Text proposed by the Commission |
Amendment |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. It should be possible for orders to act against illegal content to require providers of intermediary services to take steps, in specific cases, to remove identical or equivalent illegal content, within the same context. It should also be possible for them to require providers of intermediary services to take steps to prevent the reappearance of the illegal content. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
Amendment 23
Proposal for a regulation
Recital 31
|
|
Text proposed by the Commission |
Amendment |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law, national law or international law and the interests of international comity. |
Amendment 24
Proposal for a regulation
Recital 33
|
|
Text proposed by the Commission |
Amendment |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information under either Union law or national law in compliance with Union law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
Amendment 25
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to reinforce and guarantee legislation and rights, such as the safety and trust of the recipients of the service, including minors, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
Amendment 26
Proposal for a regulation
Recital 36
|
|
Text proposed by the Commission |
Amendment |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location . |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can be used by professional entities and by users of services which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
Amendment 27
Proposal for a regulation
Recital 37
|
|
Text proposed by the Commission |
Amendment |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Nothing in this Regulation prohibits providers of intermediary services from establishing collective representation or obtaining the services of a legal representative by other means, including contractual ones, provided that the legal representative can fulfil the role assigned to it in this Regulation. Providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, should be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation. |
Amendment 28
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easy-to-access and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notification'), pursuant to which that provider should assess the illegality of the identified content and, on the basis of that assessment, can decide whether or not it agrees with the notification of illegal content and wishes to remove or disable access to that content ('action'). In the event that a provider of hosting services considers, on the basis of its assessment, that the notified content is illegal and consequently decides to remove or disable access to it, it should ensure that such content remains inaccessible after takedown. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
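As an illustration only, the notice-and-action sequence described above (notification of specific items, assessment, removal or disabling of access, and keeping taken-down content inaccessible) can be sketched in a few lines of Python. The Regulation prescribes no data model or implementation; every name below, including the hash-based re-upload check, is an assumption.

    # Illustrative sketch of the notice-and-action flow in recital 40 (as
    # amended). All names are hypothetical; the hash-based re-upload check is
    # one possible reading of keeping content "inaccessible after takedown".
    from dataclasses import dataclass, field
    from hashlib import sha256

    @dataclass
    class Notice:
        item_urls: list   # a single notice may identify multiple items
        reason: str

    @dataclass
    class HostingService:
        content: dict = field(default_factory=dict)       # url -> bytes
        removed_hashes: set = field(default_factory=set)  # taken-down items

        def handle_notice(self, notice: Notice, is_illegal) -> None:
            for url in notice.item_urls:
                data = self.content.get(url)
                if data is None:
                    continue
                # Assessment, then 'action': remove or disable access.
                if is_illegal(data, notice.reason):
                    self.removed_hashes.add(sha256(data).hexdigest())
                    del self.content[url]

        def on_upload(self, url: str, data: bytes) -> bool:
            # Reject re-uploads of identical taken-down content.
            if sha256(data).hexdigest() in self.removed_hashes:
                return False
            self.content[url] = data
            return True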
Amendment 29
Proposal for a regulation
Recital 41
|
|
Text proposed by the Commission |
Amendment |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. While an absolute hierarchy between those rights does not exist, freedom of expression should be recognised as a cornerstone of a democratic society. |
Amendment 30
Proposal for a regulation
Recital 42 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(42a) Providers of hosting services should not be subject to the obligation to provide a statement of reasons when doing so would cause unintended safety concerns for the recipient of the service. Specifically in cases of one-to-one interface platforms, such as dating applications and other, similar services, providing a statement of reasons should be considered as likely to cause unintended safety concerns for the reporting party. As a result of this, those services should, by default, refrain from providing statements of reasons. Additionally, other providers of hosting services should make reasonable efforts to assess whether providing a statement of reasons could cause unintended safety concerns to the reporting party and, if this is the case, should refrain from providing a statement of reasons. |
Amendment 31
Proposal for a regulation
Recital 42 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(42b) The service provider should ensure that any member of staff who takes decisions on illegal content, or is otherwise frequently exposed to such content, receives adequate training as well as appropriate working conditions, including, where necessary, the opportunity to seek professional support and qualified psychological assistance. |
Amendment 32
Proposal for a regulation
Recital 43 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(43a) The additional obligations imposed on online platforms under this Regulation should not apply to not-for-profit scientific or educational repositories or to online platforms offering products and services from third-party traders, which are established in the European Union, where the access of those traders is exclusive, curated and entirely controlled by the providers of the online platform and their products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform. |
Amendment 33
Proposal for a regulation
Recital 43 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(43b) To avoid unnecessary regulatory burden, certain obligations should not apply to online marketplaces offering products and services from third-party traders which are established in the European Union, where the access of those traders is exclusive, curated and entirely controlled by the providers of the online marketplace and their products and services are reviewed and pre-approved by the providers of the online marketplace before they are offered on the marketplace. These online platforms are often referred to as closed online platforms. As the products and services offered are reviewed and pre-approved by the online platforms, the prevalence of illegal content and products on these platforms is low, and these platforms cannot, in most cases, benefit from relevant liability exemptions outlined in this Regulation. These online platforms should consequently not be subject to the obligations that are necessary for platforms with different operational models where the prevalence of illegal content is more frequent and the relevant liability exemptions are available. |
Amendment 34
Proposal for a regulation
Recital 44
|
|
Text proposed by the Commission |
Amendment |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise, including on freedom of expression, to carry out their activities in a fair and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
Amendment 35
Proposal for a regulation
Recital 44 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(44a) If an out-of-court dispute settlement body decides the dispute in favour of the recipient of the service, the online platform should reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient should not be required to reimburse fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement, unless the body finds the complaint manifestly unfounded and abusive. |
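The cost-allocation rule of this recital reduces to a simple conditional. The following minimal Python sketch paraphrases it; the function name and signature are hypothetical, not drawn from the text.

    # Minimal sketch of the cost-allocation rule in recital 44a; the names
    # are hypothetical and the rule is paraphrased from the recital.
    def allocate_costs(recipient_won: bool,
                       manifestly_unfounded_and_abusive: bool,
                       recipient_costs: float,
                       platform_costs: float) -> tuple:
        """Return (owed to the recipient, owed by the recipient)."""
        if recipient_won:
            # The platform reimburses the recipient's fees and reasonable
            # expenses related to the dispute settlement.
            return (recipient_costs, 0.0)
        if manifestly_unfounded_and_abusive:
            # Only then may the recipient have to cover the platform's costs.
            return (0.0, platform_costs)
        return (0.0, 0.0)  # otherwise each side bears its own costs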
Amendment 36
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms, having received guidance from public authorities on how to identify illegal content, take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and are known to flag content frequently with a high rate of accuracy, that they represent collective interests and that they work in a diligent, effective and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry representing collective interests and of right-holders specifically created for that purpose could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions, ensure independent collective interest representation and that their assessment of what constitutes an IPR infringement is unbiased and consistent. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
_________________ |
_________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 37
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficiently detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by frequently providing or disseminating illegal content or by frequently submitting unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 38
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers and other users, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling and disseminating products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the provider of the online marketplace, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online marketplaces should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. Occasional traders who are natural persons should not be subject to disproportionate identification requirements on online marketplaces. Providers of online marketplaces should not ask natural persons for information that goes beyond what is needed for mere registration as marketplace users. |
Amendment 39
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumer or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot, as this would be disproportionate. Nor should such providers of online marketplaces, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface should allow traders to provide the information allowing for the unequivocal identification of the product or the service, including labelling requirements, in accordance with legislation on product safety and product compliance. |
__________________ |
__________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
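As a purely illustrative sketch of the 'reasonable efforts' verification described in recital 50: the check_vat helper below stands in for a lookup against the VAT Information Exchange System, since the recital names the database but mandates no particular interface or client; all names are assumptions.

    # Sketch of the trader-verification step in recital 50 (as amended).
    # check_vat and request_supporting_documents are hypothetical helpers.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TraderInfo:
        name: str
        vat_number: str
        trade_register_id: str

    def check_vat(vat_number: str) -> Optional[bool]:
        """Hypothetical lookup against VIES or a national trade register;
        returns True/False, or None when the service is unavailable."""
        return None  # placeholder: no real query is made in this sketch

    def request_supporting_documents(trader: TraderInfo) -> bool:
        # Placeholder for an offline check of identity documents, company
        # certificates, trade register certificates and the like.
        return False

    def reasonable_efforts_check(trader: TraderInfo) -> bool:
        # 'Reasonable efforts': consult freely available official databases;
        # the provider need not verify on the spot and does not guarantee
        # the accuracy of the information (recital 50).
        result = check_vat(trader.vat_number)
        if result is not None:
            return result
        return request_supporting_documents(trader)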
Amendment 40
Proposal for a regulation
Recital 50 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(50a) Without prejudice to relevant exemptions for micro and small enterprises, and to strengthen the obligations of online marketplaces, further ex-ante provisions should be put in place so as to ensure that consumers have the necessary information about product offers, to prevent the offering of unsafe and non-compliant products and product categories, to strengthen ex-ante action against product counterfeiting, and to provide for ex-post cooperation, where necessary, with regard to dangerous products already sold. Providers of online marketplaces should inform recipients of their service when a service or product they have acquired through their services is illegal. Once they have taken a decision to remove an illegal offering from their service, the providers of online marketplaces should take measures to prevent such illegal offerings, as well as identical or equivalent offerings, from being re-uploaded on their marketplace. |
Amendment 41
Proposal for a regulation
Recital 51
|
|
Text proposed by the Commission |
Amendment |
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union. |
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union, in standardised formats and through standardised application programming interfaces. |
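The amendment calls for publication 'in standardised formats and through standardised application programming interfaces' without defining either. The JSON shape below is therefore only an assumption of what such a standardised report could look like.

    # Hypothetical example of a standardised report of average monthly active
    # recipients (recital 51, as amended). Neither the field names nor any
    # endpoint are defined by the Regulation; this shape is assumed.
    import json

    report = {
        "service": "example-platform.eu",      # assumed identifier
        "reporting_period": "2021-H2",
        "average_monthly_active_recipients_eu": 12_500_000,
    }
    print(json.dumps(report, indent=2))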
Amendment 42
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have an impact both on the equal treatment and opportunities of citizens and on the perpetuation of harmful stereotypes and norms. New advertising models have generated changes in the way information is presented and have created new personal data collection patterns and business models that might negatively affect privacy, personal autonomy, democracy and quality news reporting, and facilitate manipulation and discrimination. Therefore, more transparency is needed in online advertising markets, and independent research needs to be carried out to assess the effectiveness of behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that data collection is kept to a minimum, that the maximisation of revenue from advertising does not limit the quality of the service, and that the recipients of the service have extensive individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
Amendment 43
Proposal for a regulation
Recital 52 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(52a) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
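Recital 52a enumerates what an advertisement repository should include: the content of the advertisement, data on the advertiser, and data on delivery, in particular targeting. A sketch of one repository record follows; all field names are assumed, as the recital prescribes no schema.

    # Sketch of a single advertisement-repository record covering the elements
    # recital 52a enumerates: the content of the advertisement, the advertiser,
    # and data on delivery, in particular targeting. Field names are assumed.
    from dataclasses import dataclass

    @dataclass
    class AdRepositoryRecord:
        ad_content: str            # the creative as displayed
        advertiser: str            # on whose behalf the ad was shown
        first_shown: str           # ISO date the ad entered circulation
        last_shown: str            # ISO date the ad was last displayed
        targeting_parameters: dict # e.g. {"age_range": "25-34"} where targeted
        recipients_reached: int    # aggregate delivery data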
Amendment 44
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those challenges to fundamental rights, there being no alternative and less restrictive measures that would effectively achieve the same result. Only in very exceptional cases should users be permanently denied access to a very large online platform. It should always be possible for a competent court to revoke a decision permanently denying access. |
Amendment 45
Proposal for a regulation
Recital 54
|
|
Text proposed by the Commission |
Amendment |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10 % of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. In certain cases, it should also be possible for an online platform whose number of recipients does not exceed the operational threshold set at 10 % of the Union population to be considered as a very large online platform due to its turnover, role in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and because of its influence on how recipients obtain and communicate information online. |
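The arithmetic behind the operational threshold is straightforward: 45 million recipients corresponds to 10 % of a Union population taken as roughly 450 million, a figure the recital implies but does not state. A minimal check follows; it deliberately ignores the amendment's possibility of designation below the threshold on other grounds.

    # Recital 54's arithmetic: the 45 million threshold equals 10 % of a
    # Union population of roughly 450 million (implied, not stated).
    UNION_POPULATION = 450_000_000
    OPERATIONAL_THRESHOLD = UNION_POPULATION // 10   # 45_000_000

    def exceeds_threshold(avg_monthly_recipients_eu: int) -> bool:
        # "Significant reach" exists where recipients exceed the threshold.
        return avg_monthly_recipients_eu > OPERATIONAL_THRESHOLD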
Amendment 46
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they were able to set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures to address, in particular, filter bubbles and their effects. |
Amendment 47
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of systemic risks should be assessed in depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products and illegally traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 48
Proposal for a regulation
Recital 60
|
|
Text proposed by the Commission |
Amendment |
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaking pursuant to codes of conduct and crises protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent external auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. As an indication of independence, at the time of the performance of the audit, auditors should not have provided services, other than auditing services, to the very large online platform over the course of the previous 12 months. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
Amendment 49
Proposal for a regulation
Recital 61
|
|
Text proposed by the Commission |
Amendment |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to the fact that the platform remains solely responsible for its compliance with this Regulation and without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. A disclaimer should be added to an opinion where the auditor does not have enough information to reach a conclusion due to the novelty of the issues being audited. |
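The amended recital distinguishes four audit outcomes: positive, positive with comments, negative, and a disclaimer where no conclusion can be reached. The following is a minimal illustrative sketch of how an audit report record could encode that decision structure; nothing in it is prescribed by the Regulation, and all names are invented.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class AuditOpinion(Enum):
    """The four outcomes contemplated by recital 61 as amended."""
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"
    DISCLAIMER = "disclaimer"  # auditor could not reach a conclusion (novel issues)

@dataclass
class AuditReport:
    platform: str
    compliant: Optional[bool]          # None = no conclusion could be reached
    remarks: List[str] = field(default_factory=list)

    def opinion(self) -> AuditOpinion:
        if self.compliant is None:
            return AuditOpinion.DISCLAIMER
        if not self.compliant:
            return AuditOpinion.NEGATIVE
        # Remarks without substantial effect accompany a positive opinion.
        return AuditOpinion.POSITIVE_WITH_COMMENTS if self.remarks else AuditOpinion.POSITIVE

# Example: a compliant platform, with a remark that does not affect the outcome.
report = AuditReport("ExamplePlatform", True, ["metadata gaps in the ad archive"])
assert report.opinion() is AuditOpinion.POSITIVE_WITH_COMMENTS
```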
Amendment 50
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed on the use of recommender systems, and that recipients can easily control the way information is presented to them. They should clearly and separately present the main parameters for such recommender systems in a clear, concise, accessible and easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. Very large online platforms should ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service. |
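Recital 62 as amended asks very large online platforms to present the main parameters of their recommender systems clearly and to offer at least one option not based on profiling. The sketch below shows one way such user-selectable options could be modelled; the option names and parameters are assumptions for illustration, not taken from the Regulation.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass(frozen=True)
class RecommenderOption:
    """One selectable recommender configuration, with its main parameters
    described in plain language for the recipient of the service."""
    name: str
    main_parameters: Dict[str, str]  # parameter -> easily comprehensible explanation
    uses_profiling: bool

# Hypothetical set of options a very large online platform might expose.
OPTIONS = (
    RecommenderOption(
        name="Personalised",
        main_parameters={"engagement history": "items similar to those you interacted with rank higher"},
        uses_profiling=True,
    ),
    RecommenderOption(
        name="Chronological",
        main_parameters={"recency": "newest items from accounts you follow appear first"},
        uses_profiling=False,
    ),
)

# The recital requires alternative options, including at least one without profiling.
assert any(not option.uses_profiling for option in OPTIONS)
```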
Amendment 51
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
deleted |
Amendment 52
Proposal for a regulation
Recital 63 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(63a) By associating advertisements with content uploaded by users, very large online platforms could indirectly lead to the promotion of illegal content, or content that is in breach of their terms and conditions, and could risk causing considerable damage to the buyers of advertising space. In order to prevent such practices, very large online platforms should take steps, including through contractual guarantees to the purchasers of advertising space, to ensure that the content to which they associate advertisements is legal and compliant with their terms and conditions. These steps could include independent audits entailing a quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions. |
Amendment 53
Proposal for a regulation
Recital 63 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(63b) Very large online platforms should use their best efforts not to permit behavioural and micro-targeted advertising towards children under the age of 18, in accordance with the General Data Protection Regulation. |
Amendment 54
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific information or data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling very large online platforms to provide information and access to data to vetted researchers. All requirements for providing information and for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. Research conducted under this regime should, where possible, be conducted on the basis of open access principles and should use standardised data sets to ensure a high level of transparency and accountability as regards the use of provided data. |
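Recital 64 as amended couples researcher access to data with proportionality, protection of trade secrets and, where possible, open access principles and standardised data sets. The following is a minimal sketch of how such an access request might be screened; all field names are illustrative assumptions, not terms of the Regulation.

```python
from dataclasses import dataclass
from typing import FrozenSet

@dataclass(frozen=True)
class DataAccessRequest:
    """A vetted researcher's request under the data-access framework (illustrative)."""
    vetting_reference: str        # reference issued when the researcher was vetted
    systemic_risk_studied: str    # purpose of the investigation
    fields_requested: FrozenSet[str]
    open_access_planned: bool     # the amended recital favours open access, where possible

def is_proportionate(request: DataAccessRequest, fields_allowed: FrozenSet[str]) -> bool:
    """Grant only requests limited to fields whitelisted for the stated purpose,
    one way to keep requirements proportionate and protect trade secrets."""
    return request.fields_requested <= fields_allowed

request = DataAccessRequest("DSC-2021-001", "disinformation amplification",
                            frozenset({"recommendation_logs"}), True)
assert is_proportionate(request, frozenset({"recommendation_logs", "ad_metadata"}))
```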
Amendment 55
Proposal for a regulation
Recital 65 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(65a) Interoperability with very large online platforms is desirable as it can create new opportunities for the development of innovative services, overcome the lock-in effect and ensure competition and user choice. These possibilities could allow recipients to benefit from cross-platform interaction. Very large online platforms may provide an application programming interface through which third-party platforms and their recipients can interoperate with the main functionalities and recipients of the core services offered by the platform. The main functions could include the ability to receive information from certain accounts, to share provided content and react to it. Additionally, very large online platforms could make the core functionalities of their services interoperable with other online platforms to enable cross-platform communication. This possibility should not limit, hinder or delay the very large online platform’s ability to solve security issues and should be in compliance with all their responsibilities, especially regarding fundamental rights and protection of privacy. The Commission should request European standardisation bodies to develop the necessary technical standards for interoperability, such as protocols for interoperability and for data interoperability and portability. |
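The recital describes an application programming interface through which third parties could receive information from accounts, share content and react to it, while the platform retains control over security. The class below is a rough sketch of such an endpoint under those assumptions only; it does not reflect any actual standard, which the recital leaves to European standardisation bodies, and all names are invented.

```python
class InteropEndpoint:
    """Illustrative interoperability endpoint of the kind recital 65a contemplates."""

    def __init__(self, valid_tokens):
        self._tokens = set(valid_tokens)  # security remains the platform's responsibility
        self._posts = {}                  # post_id -> (account, text)
        self._reactions = {}              # post_id -> list of reactions

    def _authenticate(self, token):
        # The recital preserves the platform's ability to solve security issues,
        # so every cross-platform call is authenticated first.
        if token not in self._tokens:
            raise PermissionError("unauthenticated cross-platform request refused")

    def publish(self, account, post_id, text):
        """Local helper so the sketch is self-contained."""
        self._posts[post_id] = (account, text)

    def fetch_posts(self, token, account):
        """Main functionality: receive information from a certain account."""
        self._authenticate(token)
        return [text for acct, text in self._posts.values() if acct == account]

    def react(self, token, post_id, reaction):
        """Main functionality: react to provided content cross-platform."""
        self._authenticate(token)
        self._reactions.setdefault(post_id, []).append(reaction)

endpoint = InteropEndpoint({"partner-token"})
endpoint.publish("alice", "post-1", "hello")
assert endpoint.fetch_posts("partner-token", "alice") == ["hello"]
```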
Amendment 56
Proposal for a regulation
Recital 65 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(65b) Very large online platforms should ensure the portability of reviews to the reputation system of another platform operator upon the termination of the platform-user contract. For the sake of transparency, information about the processes, technical requirements, timeframes and charges that apply in case a platform user wishes to transfer reviews to the reputation system of another platform operator should be provided beforehand. When displaying reviews imported from another platform, the receiving platform operator should indicate the origin of such reviews, where possible. |
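Recital 65b concerns transferring reviews between reputation systems and labelling their origin on the receiving platform. A sketch of a portable export format follows; the JSON field names are assumptions for illustration only, not anything the Regulation specifies.

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class PortableReview:
    """Illustrative interchange record for moving a review between platforms."""
    author: str
    rating: int            # e.g. on a 1-5 scale
    text: str
    origin_platform: str   # the receiving operator should display this, where possible

def export_reviews(reviews: List[PortableReview], origin: str) -> str:
    """Serialise reviews, stamping their provenance at export time."""
    for review in reviews:
        review.origin_platform = origin
    return json.dumps([asdict(review) for review in reviews])

payload = export_reviews([PortableReview("anna", 5, "Reliable seller", "")], "PlatformA")
imported = json.loads(payload)
assert imported[0]["origin_platform"] == "PlatformA"
```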
Amendment 57
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
Amendment 58
Proposal for a regulation
Recital 70
|
|
Text proposed by the Commission |
Amendment |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. The codes should contain clear and precise consumer protection and human rights objectives and be governed in a transparent manner. The effectiveness of the codes of conduct should be regularly assessed. |
Amendment 59
Proposal for a regulation
Recital 97
|
|
Text proposed by the Commission |
Amendment |
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission has initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. However, it should justify any inaction. Once the Commission has initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
Amendment 60
Proposal for a regulation
Article 1 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) interoperability requirements for very large online platforms. |
Amendment 61
Proposal for a regulation
Article 1 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) contribute to the proper functioning of the internal market for intermediary services; |
(a) contribute to the proper functioning of the internal market for digital services, including by creating a level playing-field; |
Amendment 62
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected; |
Amendment 63
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) facilitate innovation, support the digital transition, encourage economic growth and encourage competition for digital services, while protecting users’ and consumers’ rights. |
Amendment 64
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ia) Directive (EU) 2019/882; |
Amendment 65
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
|
|
Text proposed by the Commission |
Amendment |
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as: |
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union or in the absence of such an establishment, where the provider targets its activities towards one or more Member States. |
Amendment 66
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a significant number of users in one or more Member States; or |
deleted |
Amendment 67
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
|
|
Text proposed by the Commission |
Amendment |
— the targeting of activities towards one or more Member States. |
deleted |
Amendment 68
Proposal for a regulation
Article 2 – paragraph 1 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
(h) ‘online platform’ means a provider of a hosting service which applies specific terms and conditions and, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of the principal service and, for objective and technical reasons cannot be used without that principal service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
Amendment 69
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ha) ‘online marketplace’ means an online platform that allows consumers to conclude distance contracts with other traders or consumers on the platform; |
Amendment 70
Proposal for a regulation
Article 2 – paragraph 1 – point l
|
|
Text proposed by the Commission |
Amendment |
(l) ‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the provider of an intermediary service is established or its legal representative resides or is established; |
(l) ‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the provider of an intermediary service has its main establishment or, in the case that the intermediary service is not established in the European Union, where its legal representative is established; |
Amendment 71
Proposal for a regulation
Article 2 – paragraph 1 – point n
|
|
Text proposed by the Commission |
Amendment |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against indirect and direct forms of remuneration specifically for promoting that information; |
Amendment 72
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to rank, prioritise and suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
Amendment 73
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that: |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that the provider: |
Amendment 74
Proposal for a regulation
Article 4 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the provider does not modify the information; |
(a) does not modify the information; |
Amendment 75
Proposal for a regulation
Article 4 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the provider complies with conditions on access to the information; |
(b) complies with conditions on access to the information; |
Amendment 76
Proposal for a regulation
Article 4 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; |
(c) complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; |
Amendment 77
Proposal for a regulation
Article 4 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and |
(d) does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and |
Amendment 78
Proposal for a regulation
Article 4 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
(e) acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the illegal content at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
Amendment 79
Proposal for a regulation
Article 5 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Paragraph 1 shall not apply when the main purpose of the information society service is to engage in or facilitate illegal activities or when the provider of the information society service deliberately collaborates with a recipient of the service in order to undertake illegal activities. |
Amendment 80
Proposal for a regulation
Article 5 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders on the platform, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
Amendment 81
Proposal for a regulation
Article 6 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take voluntary own-initiative investigation measures for the purpose of detecting, identifying and removing, or disabling access to, illegal content, including through the use of technological tools and instruments, in order to comply with the requirements of Union law, including those set out in this Regulation. |
Amendment 82
Proposal for a regulation
Article 6 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Providers of intermediary services shall ensure that voluntary investigations are accompanied by appropriate safeguards, including, where necessary, human oversight, to ensure they are transparent, fair and non-discriminatory. |
Amendment 83
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider of the service and to the recipient of the service who provided the content; |
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content; |
Amendment 84
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) compliance with the measures in the order is technically feasible taking into account the available technical capabilities of the service provider concerned. |
Amendment 85
Proposal for a regulation
Article 8 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The orders to act against illegal content may require providers of intermediary services to take steps, in the specific case, to remove identical or equivalent illegal content. |
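Amendment 85 would allow orders to cover, in the specific case, content identical or equivalent to the item named in the order. For the identical case, exact matching by cryptographic hash is a common technique; the sketch below illustrates only that much, and notes that merely equivalent content would require fuzzier methods such as perceptual hashing, which are not shown. All names are illustrative, not drawn from the Regulation.

```python
import hashlib

class RemovalOrderIndex:
    """Sketch of re-upload detection for content named in a removal order.
    Exact (identical) matches only; 'equivalent' content would need e.g.
    perceptual hashing plus human review, which this sketch omits."""

    def __init__(self):
        self._digests = set()

    @staticmethod
    def _digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def register(self, data: bytes) -> None:
        self._digests.add(self._digest(data))

    def is_identical_reupload(self, data: bytes) -> bool:
        return self._digest(data) in self._digests

index = RemovalOrderIndex()
index.register(b"item named in the order")
assert index.is_identical_reupload(b"item named in the order")
assert not index.is_identical_reupload(b"slightly altered item")  # 'equivalent', not caught
```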
Amendment 86
Proposal for a regulation
Article 10 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Any requests to providers of intermediary services, made on the basis of this Regulation, shall be transmitted through the Digital Services Coordinator in the Member State of establishment, which is responsible for collecting requests and communication from all relevant sources. |
Amendment 87
Proposal for a regulation
Article 11 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation. |
Amendment 88
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall include information on the activities undertaken by them and any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
Amendment 89
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a transparent, non-discriminatory, coherent, predictable, diligent, non-arbitrary, necessary and proportionate manner in applying and enforcing the terms and conditions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 90
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Those parts of the terms and conditions that do not comply with this Article shall not be binding on recipients of the services. Providers of intermediary services shall inform recipients of their services of all changes in terms and conditions in advance. |
Amendment 91
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Where very large online platforms within the meaning of Article 25 of this Regulation allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with its terms and conditions. |
Amendment 92
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Any restrictions that providers of intermediary services impose in relation to the use of their service and the information provided by the recipients of the service shall be in full compliance with the fundamental rights of the recipients of the services as enshrined in the Charter. |
Amendment 93
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Where possible, the information published shall be broken down per Member State in which services are offered. Those reports shall include, in particular, information on the following, as applicable: |
Amendment 94
Proposal for a regulation
Article 13 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
(a) the number of orders received from Member States’ authorities, categorised, where possible, by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9; |
Amendment 95
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider; |
Amendment 96
Proposal for a regulation
Article 13 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service, as well as measures taken to train content moderators and the safeguards put in place to ensure that non-infringing content is not affected; |
Amendment 97
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, where identifiable, the basis for those complaints, decisions taken in respect of those complaints and the number of instances where content moderation decisions were reversed. |
Amendment 98
Proposal for a regulation
Article 14 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices at scale and exclusively by electronic means. Those mechanisms may not replace a decision of an independent judicial or administrative authority as to whether content is illegal or not. |
Amendment 99
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
2. The notifications referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, on the basis of which a diligent economic operator can identify and assess the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
Amendment 100
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) a clear indication of the electronic identification of that information, such as the URL or URLs where possible, and, where necessary, additional information enabling the identification of the illegal content; |
Amendment 101
Proposal for a regulation
Article 14 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete. |
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are, to the best of their knowledge, accurate and complete. |
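Taken together, Article 14(2) as amended enumerates the elements of a valid notice: an explanation of the alleged illegality, an electronic identification of the content (such as a URL, where possible), the notifier's name and electronic mail address where provided, and a good-faith statement. The sketch below models such a payload and a minimal completeness check; the field names are invented for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """Illustrative payload carrying the Article 14(2) elements of a notice."""
    explanation: str                    # why the notifier considers the content illegal
    electronic_identification: str      # e.g. a URL, where possible (point (b) as amended)
    name: Optional[str] = None          # identity details may be omitted
    email: Optional[str] = None
    good_faith_statement: bool = False  # point (d): accurate and complete to the best of their knowledge

def is_sufficiently_substantiated(notice: Notice) -> bool:
    """Rough proxy for a notice precise and substantiated enough to give rise
    to actual knowledge under Article 14(3) as amended."""
    return bool(notice.explanation.strip()
                and notice.electronic_identification.strip()
                and notice.good_faith_statement)

notice = Notice("counterfeit listing of brand X",
                "https://example.example/item/123",
                name="J. Doe", email="j.doe@example.example",
                good_faith_statement=True)
assert is_sufficiently_substantiated(notice)
```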
Amendment 102
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that include the elements referred to in paragraph 2 and that are thus sufficiently precise and adequately substantiated, and on the basis of which a diligent provider of hosting services can identify the illegality of the specific content, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
Amendment 103
Proposal for a regulation
Article 14 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity. |
4. Where the notification contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notification to that individual or entity. |
Amendment 104
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without undue delay, notify that individual or entity and the content provider of its decision in respect of the information to which the notification relates, providing information on the redress possibilities in respect of that decision. |
Amendment 105
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall, where the information provided is sufficiently clear, act on any notifications that they receive under the mechanisms referred to in paragraph 1, taking into account their technical and operational ability to act against specific items of illegal content, and take their decisions in respect of the information to which the notifications relate, in a timely, diligent and non-arbitrary manner. Where they use automated means for that processing, they shall include information on such use in the notification referred to in paragraph 4. |
Amendment 106
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. Paragraphs 4 and 5 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraphs 4 and 5 shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 107
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides whether or not to remove or disable access to, or otherwise moderate either the form or distribution of, specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient without undue delay and at the latest within 24 hours after such removal or disabling of access or other content moderation and content curation measure, of the decision and provide a clear and specific statement of reasons for that decision. |
Amendment 108
Proposal for a regulation
Article 15 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and the territorial scope of the disabling of access; |
Amendment 109
Proposal for a regulation
Article 15 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means; |
(c) where applicable, information on the means used in taking the decision; |
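Amendments 107 to 109 together shape the statement of reasons: it must reach the recipient within 24 hours at the latest and state, among other things, whether the decision entails removal or disabling of access, the territorial scope of any disabling, and the means used in taking the decision. A sketch of such a record follows; the field names are illustrative assumptions, not prescribed by the text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class StatementOfReasons:
    """Illustrative record of an Article 15 statement of reasons, as amended."""
    decision: str                     # e.g. "removal", "disabling of access", "demotion"
    territorial_scope: Optional[str]  # point (a): scope of any disabling of access
    means_used: str                   # point (c) as amended: means used in taking the decision
    grounds: str                      # the clear and specific reasons for the decision
    decided_at: datetime

    def notification_deadline(self) -> datetime:
        """Amendment 107: inform the recipient at the latest within 24 hours."""
        return self.decided_at + timedelta(hours=24)

statement = StatementOfReasons("removal", None, "automated detection with human review",
                               "incompatible with terms and conditions (illustrative)",
                               datetime(2021, 12, 20, 9, 0))
assert statement.notification_deadline() == datetime(2021, 12, 21, 9, 0)
```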
Amendment 110
Proposal for a regulation
Article 15 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data. |
deleted |
Amendment 111
Proposal for a regulation
Article 15 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where the statement of reasons could give rise to unintended safety concerns for the reporting party. In addition, providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where the provider can demonstrate that the recipient of the service has repeatedly provided illegal content. |
Amendment 112
Proposal for a regulation
Article 15 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. Paragraphs 2 and 3 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, those paragraphs shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 113
Proposal for a regulation
Article 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15a |
|
Protection against repeated misuse and criminal offences |
|
1. Providers of intermediary services shall, after having issued a prior warning and provided a comprehensive explanation, suspend or, in appropriate circumstances, terminate the provision of their services to recipients of the service that frequently provide illegal content. |
|
2. Where a provider of intermediary services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. Where the provider of intermediary services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or legal representative, and shall also transmit this information to Europol for appropriate follow-up. |
Amendment 114
Proposal for a regulation
Article 15 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15b |
|
Market entrance protection |
|
The provisions in this Section shall not be enforced against micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC for a period of one year after their establishment. During this period, such enterprises shall make all reasonable efforts to comply with the provisions in this section and shall act in good faith. |
Amendment 115
Proposal for a regulation
Article 16 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, unless they meet the criteria to qualify as very large online platforms under this Regulation. This Section shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof, unless they meet the criteria to qualify as very large online platforms under this Regulation. |
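The amended Article 16 carve-out combines three tests: micro or small enterprises are exempt, the exemption persists for twelve months after SME status is lost, and neither rule applies where the platform qualifies as very large. A sketch of that decision logic follows, under the simplifying assumption that twelve months equals 365 days; parameter names are invented for illustration.

```python
from datetime import date, timedelta
from typing import Optional

def section_3_applies(is_micro_or_small: bool,
                      sme_status_lost_on: Optional[date],
                      today: date,
                      is_very_large_platform: bool) -> bool:
    """Illustrative reading of the Article 16 exemption as amended."""
    if is_very_large_platform:
        return True   # very large online platforms are never exempt
    if is_micro_or_small:
        return False  # current micro/small enterprises are exempt
    if sme_status_lost_on is not None and today < sme_status_lost_on + timedelta(days=365):
        return False  # twelve-month grace period after losing SME status
    return True

# A mid-sized platform that outgrew SME status six months ago remains exempt:
assert not section_3_applies(False, date(2021, 6, 1), date(2021, 12, 1), False)
# ...but a very large online platform is covered regardless:
assert section_3_applies(True, None, date(2021, 12, 1), True)
```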
Amendment 116
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
1. Online platforms shall provide recipients of the service and qualified entities within the meaning of Article 3(4) of Directive (EU) 2020/1828, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
Amendment 117
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove, disable, demote, demonetise or restrict access to the information or otherwise impose sanctions against it; |
Amendment 118
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints and include human review. |
Amendment 119
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. If the complaining entity so requests, the online platform shall publicly confirm the reversal of the decision. |
Amendment 120
Proposal for a regulation
Article 17 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. Such delay shall not exceed three weeks from the lodging of the complaint. |
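Taken together, Amendments 116 to 120 turn Article 17 into a fairly precise specification: complaints must be lodgeable electronically and free of charge, all the listed decision types must be contestable, handling must include human review, and the complainant must be informed within three weeks. As a purely illustrative aid, the following minimal Python sketch models those constraints; every name is hypothetical and nothing in it is prescribed by the Regulation or by this report.

```python
# Illustrative sketch only: one hypothetical way to model the amended
# Article 17 complaint-handling obligations. No schema is prescribed.
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum


class DecisionType(Enum):
    # Contestable decisions listed in the amended Article 17(1)(a)
    REMOVE = "remove"
    DISABLE = "disable"
    DEMOTE = "demote"
    DEMONETISE = "demonetise"
    RESTRICT = "restrict"


@dataclass
class Complaint:
    complaint_id: str
    decision: DecisionType
    decision_taken_at: datetime
    lodged_at: datetime
    fee_charged: float = 0.0          # must remain 0: lodging is free of charge
    human_reviewed: bool = False      # Article 17(2): systems include human review
    reversed: bool = False
    publicly_confirmed: bool = False  # on request of the complaining entity

    def response_deadline(self) -> datetime:
        # Amendment 120: the complainant is informed within three weeks
        return self.lodged_at + timedelta(weeks=3)

    def within_complaint_window(self) -> bool:
        # Article 17(1): access for at least six months after the decision
        return self.lodged_at <= self.decision_taken_at + timedelta(days=183)
```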
Amendment 121
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
After internal complaint-handling mechanisms are exhausted, recipients of the service addressed by the decisions referred to in Article 17(1) and qualified entities within the meaning of Article 3(4) of Directive (EU) 2020/1828 shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected by the recipient with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Amendment 122
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute; |
(b) it has the necessary legal expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute; |
Amendment 123
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the dispute settlement is easily accessible through electronic communication technology; |
(c) it offers dispute settlement that is easily accessible through electronic communication technology; |
Amendment 124
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union; |
(d) it is capable of settling disputes in a swift, efficient, transparent and cost-effective manner and in at least one official language of the Union; |
Amendment 125
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) it offers dispute settlement that takes place in accordance with clear and fair rules of procedure and sufficient confidentiality safeguards. |
Amendment 126
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) where applicable, it has particular legal expertise in relation to the applicable laws relating to freedom of expression and its limitations and the applicable case law, including the case law of the European Court of Human Rights. |
Amendment 127
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. |
If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement, unless the body finds that the complaint is manifestly unfounded and abusive. |
Amendment 128
Proposal for a regulation
Article 18 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive. |
6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive. Any attempt to reach an out-of-court agreement on the settlement of a dispute in accordance with this Article shall not affect the rights of the providers of online platform services and of the recipients of the service concerned to initiate judicial proceedings at any time before, during or after the out-of-court dispute settlement process. |
Amendment 129
Proposal for a regulation
Article 19 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. Similar priority may be given to other notices, when the trustworthiness of those submitting them and the severity and urgency of the situations concerned are considered to be exceptional. |
Amendment 130
Proposal for a regulation
Article 19 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; |
(a) it has demonstrated particular expertise, accuracy and competence for the purposes of detecting, identifying and notifying illegal content; |
Amendment 131
Proposal for a regulation
Article 19 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner. |
(c) it carries out its activities for the purposes of submitting notices in an objective manner. |
Amendment 132
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) where applicable, it has particular legal expertise in relation to the applicable laws relating to freedom of expression and its limitations and the applicable case law, including the case law of the European Court of Human Rights. |
Amendment 133
Proposal for a regulation
Article 19 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Digital Services Coordinator of the Member State may award the status of trusted flagger to an entity established in another Member State, if the said entity already holds the status of a trusted flagger in the Member State where it is established. Where several Member States have awarded such status to the same entity, the entity may be referred to as a European trusted flagger. |
Amendment 134
Proposal for a regulation
Article 19 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. The Digital Services Coordinator of the Member State of establishment of the platform shall engage in dialogue with platforms and stakeholders for maintaining the accuracy and efficacy of a trusted flagger system. |
Amendment 135
Proposal for a regulation
Article 19 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
Amendment 136
Proposal for a regulation
Article 19 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale. |
Amendment 137
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall suspend, for a reasonable period of time, or in appropriate circumstances terminate, after having issued a prior warning and having provided a comprehensive explanation, the provision of their services to recipients of the service that frequently provide manifestly illegal content. A termination of the service may be issued where the recipient fails to comply with the applicable provisions set out in this Regulation or where the suspension has occurred at least three times following verification of the repeated provision of illegal content. |
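The amended Article 20(1) encodes an escalation rule: a warning and a comprehensive explanation first, suspension next, and termination only where the recipient fails to comply with the Regulation or has already been suspended at least three times for verified repeated provision of illegal content. A hedged sketch of that rule, with hypothetical names throughout, might read:

```python
# Illustrative sketch only, with hypothetical names: a minimal reading of the
# escalation rule in the amended Article 20(1).
def next_enforcement_step(prior_warnings: int, prior_suspensions: int,
                          complies_with_regulation: bool) -> str:
    if prior_warnings == 0:
        # A prior warning and a comprehensive explanation come first
        return "warn_and_explain"
    if not complies_with_regulation or prior_suspensions >= 3:
        # Termination where the recipient fails to comply, or after at least
        # three suspensions following verified repeated provision of illegal content
        return "terminate"
    return "suspend_for_reasonable_period"
```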
Amendment 138
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are unfounded. |
Amendment 139
Proposal for a regulation
Article 20 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
(a) the absolute numbers of items of illegal content or unfounded notices or complaints, submitted in the past year; |
Amendment 140
Proposal for a regulation
Article 20 – paragraph 3 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the intention of the recipient, individual, entity or complainant. |
(d) where identifiable, the intention of the recipient, individual, entity or complainant. |
Amendment 141
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension, and the circumstances in which they will terminate their services. |
Amendment 142
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or legal representative and also transmit this information to Europol for appropriate follow-up. |
Amendment 143
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: |
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, online marketplaces have obtained the following information from the trader: |
Amendment 144
Proposal for a regulation
Article 22 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the bank account details of the trader, where the trader is a natural person; |
deleted |
Amendment 145
Proposal for a regulation
Article 22 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; |
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council, the name, address, telephone number and electronic mail address of the economic operator, established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020 of the European Parliament and the Council 51 or any relevant act of Union law; |
__________________ |
__________________ |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
Amendment 146
Proposal for a regulation
Article 22 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. |
2. The provider of the online marketplace shall, upon receiving that information, take effective steps that would reasonably be taken by a diligent operator in accordance with a high industry standard of professional diligence to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate, current and reliable through the use of independent and reliable sources including any freely accessible official online database or online interface made available by an authorised administrator, Member States or the Union or through requests to the trader to provide supporting documents from reliable sources. |
Amendment 147
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Where the provider of the online marketplace obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Amendment 148
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
Where the trader fails to correct or complete that information, the online platform shall suspend the provision of its service to the trader until the request is complied with. |
deleted |
Amendment 149
Proposal for a regulation
Article 22 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The provider of the online marketplace shall require that traders promptly inform it of any changes to the information referred to in paragraph 1 and, in such cases, repeat the relevant steps referred to in paragraph 2. Where the provider of the online marketplace obtains an indication that an item of information referred to in Article 22 is inaccurate, the provider of the online marketplace shall request the trader to provide evidence of the accuracy of that item of information or to correct it without delay. Where the trader fails to provide evidence of accuracy or to correct or complete that information, the provider of the online marketplace shall suspend the provision of its service to the trader until the request is complied with. |
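Amendments 143 to 149 describe a traceability workflow for online marketplaces: collect the Article 22(1) information before the trader may use the service, verify it diligently against independent and reliable sources, request corrections where it appears inaccurate or incomplete, and suspend the trader where no evidence of accuracy is provided. The sketch below is one hypothetical reading of that cycle; the lookup callable stands in for whatever official database or interface is actually consulted, and no real interface or API is implied.

```python
# Illustrative sketch only: a hypothetical trader-onboarding check against
# the amended Article 22(1) to (3a). All names are assumptions.
from typing import Callable

REQUIRED_FIELDS = ("name", "address", "email")  # subset of Article 22(1)


def verify_trader(info: dict, official_lookup: Callable[[dict], bool]) -> str:
    """Return 'correction_requested', 'suspended' or 'approved'."""
    missing = [f for f in REQUIRED_FIELDS if not info.get(f)]
    if missing:
        # Article 22(3): first ask the trader to correct or complete the data
        return "correction_requested"
    if not official_lookup(info) and not info.get("supporting_documents"):
        # Article 22(3a): suspend until the trader evidences accuracy
        return "suspended"
    # Article 22(7b): approve without undue delay once paragraphs 1-2 are met
    return "approved"
```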
Amendment 150
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The provider of the online marketplace shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned, including the period for redress. They shall subsequently delete the information. |
Amendment 151
Proposal for a regulation
Article 22 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
5. Without prejudice to paragraph 2, the provider of the online marketplace shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
Amendment 152
Proposal for a regulation
Article 22 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
6. The provider of the online marketplace shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
Amendment 153
Proposal for a regulation
Article 22 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. |
7. The provider of the online marketplace shall design and organise its online interface in a fair and user-friendly way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. |
Amendment 154
Proposal for a regulation
Article 22 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. The provider of the online marketplace shall design its service in a way that allows traders to communicate to their customers all relevant information for the identification of the product or the service, and, where applicable, the information concerning labelling, including CE marking. |
Amendment 155
Proposal for a regulation
Article 22 – paragraph 7 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
7b. Online platforms shall ensure that traders are approved without undue delay once the online platform has received the information referred to in paragraph 1 and taken the steps referred to in paragraph 2. |
Amendment 156
Proposal for a regulation
Article 22 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22a |
|
Additional provisions for online marketplaces related to illegal offers |
|
1. Where a provider of an online marketplace becomes aware of the illegal nature of a product or service offered through its services, it shall inform those recipients of the service that had acquired such product or contracted such service. |
|
2. The provider of the online marketplace shall suspend without undue delay the provision of its services to traders that provide, in a repeated manner, illegal offers for a product or a service. It shall immediately notify its decision to the trader. |
|
3. For products, categories or groups of products, which are susceptible to bear a serious risk to the health and safety of consumers, based on accidents registered in the Safety Business Gateway, the Safety Gate statistics, the results of the joint activities on product safety and other relevant indicators or evidence, as outlined in the Regulation (EU) […/…] on general product safety, amending Regulation (EU) No 1025/2012 and repealing Directive 87/357/EEC and Directive 2001/95/EC, the provider of the online marketplace shall verify, with regard to the information referred to in paragraph 7a of Article 22, and before the trader's offer is made available on the online marketplace, if the offer that the trader wishes to propose to consumers located in the Union is mentioned in the list, or the lists, of products or categories of products identified as non-compliant, as classified in any freely accessible official online database or online interface, and shall not authorise the trader to provide the offer if the product is on such a list. |
|
4. The provider of the online marketplace shall ensure that content identified as illegal remains inaccessible after take-down, and shall take steps, in the specific case, to remove identical or equivalent illegal content. |
Amendment 157
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of unfounded notices and the submission of unfounded complaints; |
Amendment 158
Proposal for a regulation
Article 24 – title
|
|
Text proposed by the Commission |
Amendment |
Online advertising transparency |
Online advertising transparency requirements |
Amendment 159
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
1. Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, meaningful and unambiguous manner and at all times: |
Amendment 160
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed, or parts thereof, is an advertisement; |
Amendment 161
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the natural or legal person who finances the advertisement, if different from the natural or legal person identified pursuant to point (b); |
Amendment 162
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) clear, meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed, and how to change those parameters; |
Amendment 163
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) whether the advertisement was selected using an automated system and, in that case, the identity of the natural or legal person responsible for the system. |
Amendment 164
Proposal for a regulation
Article 24 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Online platforms shall offer users the possibility to opt out easily from micro-targeted tracking and advertisements that are based on their behavioural data or other profiling techniques, within the meaning of Article 4(4) of Regulation (EU) 2016/679. |
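Read together, Amendments 158 to 164 define a per-advertisement disclosure record (advertisement flag, sponsor, funder where different, targeting parameters and how to change them, and whether an automated system selected the ad) plus an opt-out from profiling-based targeting. The following is a hypothetical illustration only; none of the field names are prescribed by the text.

```python
# Illustrative sketch only: metadata a platform might attach to each displayed
# advertisement under the amended Article 24. All names are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class AdDisclosure:
    is_advertisement: bool                 # point (a)
    sponsor: str                           # point (b): on whose behalf displayed
    funder: Optional[str]                  # point (ba): who finances, if different
    targeting_parameters: dict             # point (c): main parameters used
    parameters_edit_url: str               # point (c): how to change them
    automated_selection: bool              # point (ca)
    system_operator: Optional[str] = None  # point (ca): who runs the system


def visible_ads(ads: list, user_opted_out: bool) -> list:
    # Paragraph 1a: honour an easy opt-out from profiling-based targeting
    if user_opted_out:
        return [a for a in ads if not a.targeting_parameters.get("profiling")]
    return ads
```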
Amendment 165
Proposal for a regulation
Article 25 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or where the platform has been designated as a very large online platform in accordance with paragraph 4a. |
Amendment 166
Proposal for a regulation
Article 25 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features. |
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union and for determining whether the turnover, operating model and nature of the platform constitute a systemic risk, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features, as well as how to determine whether the turnover, operating model and size of the platform constitute a systemic risk. |
Amendment 167
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication. |
deleted |
Amendment 168
Proposal for a regulation
Article 25 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The Commission shall adopt delegated acts in accordance with Article 69 to designate online platforms, which provide their services to a number of average monthly active recipients of the service in the Union lower than 45 million but pose a very high systemic risk, as very large online platforms. The assessment of a systemic risk shall be based on the following criteria: |
|
(a) the annual turnover of the online platform, with EUR 50 million as a threshold that shall be exceeded for an online platform to qualify for further assessment based on points (b) to (e); |
|
(b) the role of the online platform as a public forum; |
|
(c) the role, nature and volume of economic transactions on the online platform; |
|
(d) the role of the online platform in disseminating information, opinions and ideas and in influencing how recipients of the service obtain and communicate information online; and |
|
(e) the depth and scope of the systemic risks stemming from the functioning and use made of the services of the online platform, as defined in Article 26, as well as the historical prevalence of illegal content on the service. |
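The amended Article 25 thus combines a quantitative gate (45 million average monthly active recipients) with a designation route for smaller platforms: an annual turnover above EUR 50 million, followed by the Commission's qualitative assessment under points (b) to (e). The sketch below captures only the two numeric thresholds stated in the text; the qualitative assessment is collapsed into a single hypothetical flag.

```python
# Illustrative sketch only: the two numeric thresholds in Article 25 as
# amended. The criteria in points (b) to (e) require the Commission's
# assessment and are represented here by one hypothetical flag.
VLOP_USER_THRESHOLD = 45_000_000   # average monthly active recipients in the Union
TURNOVER_GATE_EUR = 50_000_000     # paragraph 4a, point (a)


def qualifies_as_vlop(avg_monthly_users: int,
                      annual_turnover_eur: float,
                      systemic_risk_assessed: bool) -> bool:
    if avg_monthly_users >= VLOP_USER_THRESHOLD:
        return True  # paragraph 1: the size threshold alone suffices
    # Paragraph 4a: below the user threshold, the turnover gate must be
    # exceeded before points (b) to (e) are even assessed
    return annual_turnover_eur > TURNOVER_GATE_EUR and systemic_risk_assessed
```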
Amendment 169
Proposal for a regulation
Article 25 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication. |
Amendment 170
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks: |
Amendment 171
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of fundamental rights, including the rights to respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter, as well as any other fundamental rights enshrined in the Charter that can be negatively affected by such risks now or in the future; |
Amendment 172
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) intentional manipulation of their service, with actual or foreseeable systemic negative effects on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security, including the risk of manipulation of their service by means of inauthentic use, deep fakes or automated exploitation of the service. |
Amendment 173
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
Amendment 174
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) ensuring appropriate staffing to deal with notices and complaints; |
Amendment 175
Proposal for a regulation
Article 27 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19; |
(d) adjusting cooperation with trusted flaggers in accordance with Article 19; |
Amendment 176
Proposal for a regulation
Article 27 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) best practices for very large online platforms to mitigate the systemic risks identified. |
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified. |
Amendment 177
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those recommendations the Commission shall organise public consultations. |
Amendment 178
Proposal for a regulation
Article 27 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The reports referred to in paragraph 2 shall be disseminated to the public and include standardised, open data describing the systemic risks, especially risks to fundamental rights. |
Amendment 179
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: |
Amendment 180
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which: |
2. Audits performed pursuant to paragraph 1 shall be performed by organisations or associations which have been constituted in accordance with the law of a Member State and which: |
Amendment 181
Proposal for a regulation
Article 28 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) are independent from the very large online platform concerned; |
(a) are independent from the very large online platform concerned and have not provided any service other than audits or relevant ancillary services to that platform in the previous 12 months; |
Amendment 182
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) have not audited the very large online platform concerned for more than three consecutive years. |
Amendment 183
Proposal for a regulation
Article 28 – paragraph 3 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. |
(f) where the audit opinion is negative, recommendations on specific measures to achieve compliance and risk-based remediation timelines, with a focus on rectifying issues that have the potential to cause most harm to users of the service as a priority; |
Amendment 184
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) where the organisation that performs the audit does not have enough information to conclude the audit opinion due to the novelty of the issues audited, a relevant disclaimer. |
Amendment 185
Proposal for a regulation
Article 28 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified. |
4. Very large online platforms receiving an audit report that contains evidence that the platform is not properly assessing or mitigating systemic risks stemming from the functioning and use made of their services in the Union shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified. |
Amendment 186
Proposal for a regulation
Article 28 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle. |
Amendment 187
Proposal for a regulation
Article 28 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. The audits shall be submitted to the Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission. A summary of the audit findings, not including sensitive information, shall be made public. The Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission may provide a public comment on the audits. |
Amendment 188
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. Very large online platforms shall not make the recipients of their services subject to recommender systems based on profiling, unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main technical parameters used in their recommender systems, and they shall provide options for the recipients of the service to modify or influence those main technical parameters that they shall make available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679, and, where possible, keep a log of the significant changes implemented to the recommender system. |
Amendment 189
Proposal for a regulation
Article 29 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible and user-friendly functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
Amendment 190
Proposal for a regulation
Article 29 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The parameters referred to in paragraph 1 shall include: |
|
(a) whether the recommender system is an automated system and, in that case, the identity of the natural or legal person responsible for the recommender system, if different from the platform provider; |
|
(b) clear information about the main criteria used by recommender systems; |
|
(c) where possible, the relevance and weight of each main criterion which leads to the information recommended; |
|
(d) the goals the system has been optimised for; |
|
(e) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs. |
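Amendments 188 to 190 make the recommender-system obligations concrete: profiling-based recommendation requires freely given, specific, informed and unambiguous consent; at least one non-profiling option must exist; the main technical parameters must be disclosed and modifiable; and significant changes should, where possible, be logged. A hypothetical sketch follows (consent handling in practice must of course comply with Regulation (EU) 2016/679; no schema here is prescribed).

```python
# Illustrative sketch only: exposing the amended Article 29 parameters and
# options. All identifiers are hypothetical.
from dataclasses import dataclass, field


@dataclass
class RecommenderOption:
    name: str
    uses_profiling: bool
    main_criteria: dict          # criterion -> relative weight, where possible
    optimisation_goals: list     # paragraph 2a, point (d)
    operated_by: str = "platform"  # point (a): responsible person, if different


@dataclass
class RecommenderSettings:
    options: list
    selected: str
    profiling_consent: bool = False  # must be freely given, specific, informed
    change_log: list = field(default_factory=list)

    def __post_init__(self) -> None:
        # Article 29(1): at least one option must not be based on profiling
        assert any(not o.uses_profiling for o in self.options)

    def select(self, name: str) -> None:
        option = next(o for o in self.options if o.name == name)
        if option.uses_profiling and not self.profiling_consent:
            raise PermissionError("profiling-based option requires prior consent")
        self.selected = name
        self.change_log.append(f"user selected {name}")  # log significant changes
```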
Amendment 191
Proposal for a regulation
Article 29 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 29a |
|
Portability of data and reviews |
|
1. Very large online platforms shall provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation (EU) 2016/679, including by the provision of continuous and real-time access. |
|
2. Very large online platforms shall ensure the portability of reviews to the reputation system of another platform operator upon the termination of the platform-user contract. |
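Article 29a asks for two distinct capabilities: continuous, real-time portability of user-generated data in line with Regulation (EU) 2016/679, and portability of reviews to another operator's reputation system on contract termination. A minimal sketch follows, assuming a JSON-lines export format and field names that the provision does not itself prescribe.

```python
# Illustrative sketch only: a hypothetical export helper for Article 29a.
# The JSON-lines format and the review fields are assumptions.
import json
from typing import Iterable, Iterator


def export_user_data(records: Iterable[dict]) -> Iterator[str]:
    """Stream each record as one JSON line so a recipient or a successor
    platform can consume the export continuously (paragraph 1)."""
    for record in records:
        yield json.dumps(record)


def export_reviews(reviews: Iterable[dict]) -> list:
    # Paragraph 2: keep only the fields another reputation system plausibly
    # needs; these names are assumptions, not a prescribed schema.
    return [{"rating": r["rating"], "text": r.get("text", ""), "date": r["date"]}
            for r in reviews]
```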
Amendment 192
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the natural or legal person who finances the advertisement, if different from the natural or legal person identified pursuant to point (b); |
Amendment 193
Proposal for a regulation
Article 30 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically. |
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically, and whether other groups have been explicitly excluded. |
Amendment 194
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, information and access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
Amendment 195
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to data to vetted researchers, not-for-profit bodies, organisations or associations which have been constituted in accordance with the law of a Member State, who meet the requirements in paragraph 4 of this Article, for the sole purpose of facilitating and conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to enable verification of the effectiveness and proportionality of the mitigation measures as set out in Article 27(1). |
Amendment 196
Proposal for a regulation
Article 31 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. The period for which information and data is to be provided pursuant to paragraphs 1 and 2 shall be specified in the request. The data provided to vetted researchers shall be as disaggregated as possible, unless the researcher requests it otherwise. |
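As amended, Article 31 frames data access as a reasoned, period-bound request served through online databases or application programming interfaces, with data provided to vetted researchers as disaggregated as possible unless they request otherwise. The in-process stand-in below illustrates those request fields only; vetting, security and confidentiality safeguards are deliberately out of scope, and every name is hypothetical.

```python
# Illustrative sketch only: a minimal in-process stand-in for an Article 31
# data-access interface. Nothing here is a prescribed API.
from dataclasses import dataclass
from datetime import date


@dataclass
class AccessRequest:
    requester: str              # DSC, the Commission, or a vetted researcher/body
    reasons: str                # requests must be reasoned
    period_start: date          # the period covered must be specified
    period_end: date
    disaggregated: bool = True  # as disaggregated as possible, unless asked otherwise


def serve_request(req: AccessRequest, records: list) -> list:
    """Filter records to the requested period; aggregate only on request."""
    in_period = [r for r in records
                 if req.period_start <= r["date"] <= req.period_end]
    if req.disaggregated:
        return in_period
    return [{"count": len(in_period)}]  # aggregate only if the researcher asks
```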
Amendment 197
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, disclose the sources of funding for the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 198
Proposal for a regulation
Article 31 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing personal data. |
Amendment 199
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) relevant information on content moderation broken down per Member State in which the services are offered and in the Union as a whole. |
Amendment 200
Proposal for a regulation
Article 33 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports. |
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports. In such cases, the platform shall indicate that information was removed from the report, the scope of the information removed and the reason for the removal. |
Amendment 201
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the different types of data that can be used. |
Amendment 202
Proposal for a regulation
Article 37 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The Commission shall be responsible for the drafting and scrutiny of the crisis protocols referred to in paragraph 1. It shall report annually thereon to the European Parliament. |
Amendment 203
Proposal for a regulation
Article 38 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The competent authorities referred to in the first subparagraph shall have relevant expertise in the field of data protection, consumer protection or the regulation of user-generated content. |
Amendment 204
Proposal for a regulation
Article 38 – paragraph 1 – subparagraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Supervisory authorities designated under Regulation (EU) 2016/679 shall be tasked with the application and enforcement of measures related to data processing set out in this Regulation. |
Amendment 205
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for the application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of matters related to this Regulation and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
Amendment 206
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator. |
For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator, including sharing information on cross-border cases and providing support for each other during ongoing interventions and investigations. |
Amendment 207
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Board shall create a publicly accessible list of all Digital Services Coordinators and competent authorities. It shall regularly update and monitor this list. |
Amendment 208
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Commission shall provide guidance to Member States to ensure a consistent approach on how national, local and regional authorities should relate to their Digital Services Coordinator. |
Amendment 209
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Commission shall publish and update a register containing the name and contact information of the Digital Services Coordinator responsible in each Member State. |
Amendment 210
Proposal for a regulation
Article 39 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, independent, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have all necessary technical, financial and human resources, including skills and competence building, as well as infrastructure to carry out their tasks. Such resources may include access to training and regular exchanges with service providers to understand the specificities of their business model. |
Amendment 211
Proposal for a regulation
Article 39 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party. |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall not take instructions from any other public authority or any private party. |
Amendment 212
Proposal for a regulation
Article 39 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Digital Services Coordinators may seek information from a public authority or private party if they deem it necessary to carry out their duties, without compromising their independence and neutrality. |
Amendment 213
Proposal for a regulation
Article 41 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to adequate safeguards laid down in the applicable national law in conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties. |
6. Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to the highest safeguards laid down in the applicable national law, in absolute conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties. |
Amendment 214
Proposal for a regulation
Article 42 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover of the provider concerned. |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or global turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1 % of the annual income or global turnover of the provider concerned. |
Amendment 215
Proposal for a regulation
Article 42 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily global turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
Amendment 216
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Recipients of the service, as well as other parties with a legitimate interest and who are independent of any provider of intermediary service, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Amendment 217
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) issue own-initiative opinions. |
Amendment 218
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently: |
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6 % of its total global turnover in the preceding financial year where it finds that that platform, intentionally or negligently: |
Amendment 219
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: |
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1 % of the total global turnover in the preceding financial year, where they intentionally or negligently: |
Amendment 220
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily global turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
Amendment 221
Proposal for a regulation
Article 64 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed. |
1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed, along with, where possible, non-confidential documents or other forms of information on which the decision is based. |
Amendment 222
Proposal for a regulation
Article 73 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources. |
3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources, and pay specific attention to small and medium-sized enterprises and the position of new competitors. |
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
ITRE 8.2.2021 |
|||
Associated committees - date announced in plenary |
20.5.2021 |
|||
Rapporteur for the opinion Date appointed |
Henna Virkkunen 15.12.2020 |
|||
Discussed in committee |
17.6.2021 |
15.7.2021 |
|
|
Date adopted |
27.9.2021 |
|
|
|
Result of final vote |
+: 46 –: 22 0: 6 |
||
Members present for the final vote |
Nicola Beer, François-Xavier Bellamy, Hildegard Bentele, Tom Berendsen, Vasile Blaga, Michael Bloss, Paolo Borchia, Marc Botenga, Markus Buchheit, Cristian-Silviu Buşoi, Jerzy Buzek, Carlo Calenda, Maria da Graça Carvalho, Ignazio Corrao, Ciarán Cuffe, Josianne Cutajar, Nicola Danti, Pilar del Castillo Vera, Martina Dlabajová, Valter Flego, Niels Fuglsang, Lina Gálvez Muñoz, Claudia Gamon, Nicolás González Casares, Christophe Grudler, András Gyürk, Henrike Hahn, Robert Hajšel, Ivo Hristov, Ivars Ijabs, Romana Jerković, Eva Kaili, Izabela-Helena Kloc, Łukasz Kohut, Zdzisław Krasnodębski, Andrius Kubilius, Miapetra Kumpula-Natri, Thierry Mariani, Marisa Matias, Joëlle Mélin, Dan Nica, Angelika Niebler, Ville Niinistö, Aldo Patriciello, Mauri Pekkarinen, Mikuláš Peksa, Tsvetelina Penkova, Morten Petersen, Markus Pieper, Clara Ponsatí Obiols, Manuela Ripa, Robert Roos, Sara Skyttedal, Maria Spyraki, Jessica Stegrud, Beata Szydło, Riho Terras, Grzegorz Tobiszowski, Patrizia Toia, Evžen Tošenovský, Marie Toussaint, Isabella Tovaglieri, Viktor Uspaskich, Henna Virkkunen, Pernille Weiss, Carlos Zorrinho |
|||
Substitutes present for the final vote |
Erik Bergkvist, Izaskun Bilbao Barandica, Cornelia Ernst, Valérie Hayer, Elena Lizzi, Jutta Paulus, Sandra Pereira, Angelika Winzig |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
46 |
+ |
NI |
Viktor Uspaskich |
PPE |
François-Xavier Bellamy, Hildegard Bentele, Tom Berendsen, Vasile Blaga, Cristian-Silviu Buşoi, Jerzy Buzek, Maria da Graça Carvalho, Pilar del Castillo Vera, Andrius Kubilius, Angelika Niebler, Aldo Patriciello, Markus Pieper, Sara Skyttedal, Maria Spyraki, Riho Terras, Henna Virkkunen, Pernille Weiss, Angelika Winzig |
Renew |
Nicola Beer, Izaskun Bilbao Barandica, Nicola Danti, Martina Dlabajová, Valter Flego, Claudia Gamon, Christophe Grudler, Valérie Hayer, Ivars Ijabs, Mauri Pekkarinen, Morten Petersen |
S&D |
Erik Bergkvist, Carlo Calenda, Josianne Cutajar, Niels Fuglsang, Lina Gálvez Muñoz, Nicolás González Casares, Robert Hajšel, Ivo Hristov, Romana Jerković, Eva Kaili, Łukasz Kohut, Miapetra Kumpula-Natri, Dan Nica, Tsvetelina Penkova, Patrizia Toia, Carlos Zorrinho |
22 |
- |
ECR |
Izabela-Helena Kloc, Zdzisław Krasnodębski, Robert Roos, Jessica Stegrud, Beata Szydło, Grzegorz Tobiszowski, Evžen Tošenovský |
ID |
Paolo Borchia, Markus Buchheit, Elena Lizzi, Isabella Tovaglieri |
The Left |
Marc Botenga, Sandra Pereira |
Verts/ALE |
Michael Bloss, Ignazio Corrao, Ciarán Cuffe, Henrike Hahn, Ville Niinistö, Jutta Paulus, Mikuláš Peksa, Manuela Ripa, Marie Toussaint |
6 |
0 |
ID |
Thierry Mariani, Joëlle Mélin |
NI |
András Gyürk, Clara Ponsatí Obiols |
The Left |
Cornelia Ernst, Marisa Matias |
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON LEGAL AFFAIRS (11.10.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9‑0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Geoffroy Didier
(*) Associated committee – Rule 57 of the Rules of Procedure
SHORT JUSTIFICATION
The DSA should cover all digital services that play an important role in the dissemination of illegal content, in order to subject their content moderation practices to adequate regulation. That is why I clarified the scope of the DSA to explicitly target three types of services that play a major role in the dissemination of content: search engines, live-streaming services of user-generated content and messaging services.
These three categories of services should be subject, firstly, to the obligations currently provided for all intermediary services, and secondly, to the risk assessment and mitigation obligations applied to very large platforms, when they exceed the relevant thresholds. Live-streaming services and messaging services should also fall under certain obligations applicable to hosting services and online platforms, to the extent that these obligations can be applied to them. For example, these services can and should comply with obligations related to suspension of accounts and to guarantees offered to users in case of sanctions.
Against the background of their rapid expansion in recent years, and in particular during the Covid-19 pandemic, online marketplaces have given rise to a number of threats to consumer protection, both in terms of the enforcement of consumers’ rights and in terms of product safety and product compliance. Furthermore, these marketplaces raise growing concerns as to industrial property rights and counterfeiting, and more generally about the creation of an uneven playing field, whereby compliant companies increasingly face unfair competition from non-compliant ones.
For example, investigations of the 10 main online marketplaces showed that, on average, 63 % of the products offered to European consumers were non-compliant and 28 % of these products were actually dangerous, rates significantly higher than those found for “brick and mortar” retailers.
Such a situation is undoubtedly linked to a loophole in the current legal framework, which allows online marketplaces to escape a number of basic requirements, without which it is impossible to ensure a reasonable and satisfactory level of protection for European consumers purchasing online. The wider the market share of online marketplaces, the greater this risk becomes, and the more worrisome it is.
Given the foregoing, in order to close this loophole and thus eliminate this growing risk, it appears indispensable to add to the DSA a number of additional specific provisions for online platforms offering marketplace services.
Another problem is the application of the so-called “country of origin principle”, which could result, given the current establishment of content platforms in the EU, in a few national authorities being the only authorities empowered to enforce the DSA. These authorities might not be able to fulfil their roles. Furthermore, the proposed scheme would not allow national specificities to be properly taken into account in the regulation of content. The DSA must therefore be adjusted in order to explicitly confer prerogatives of intervention upon the competent authorities of the country of destination (e.g. the power of access to data, involvement in the investigation and decision-making, the power to take action on a problem affecting its territory, and direct intervention in the event of unjustified inaction by the authority of the country of establishment).
AMENDMENTS
The Committee on Legal Affairs calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 1
|
|
Text proposed by the Commission |
Amendment |
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole. |
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel and innovative ways, transforming their communication, connection, consumption and business habits on the one hand, and bringing about societal and economic transformations in the Union on the other. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users, for example in the form of financial fraud and scams on social networks, and for society as a whole. |
_________________ |
_________________ |
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). |
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). |
Amendment 2
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws create regulatory fragmentation and negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, striking a proper balance between support for innovation on the one hand and protection for consumers and other service users on the other. |
Amendment 3
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, privacy and personal data protection, the right to non-discrimination and access to justice. |
Amendment 4
Proposal for a regulation
Recital 4
|
|
Text proposed by the Commission |
Amendment |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated and fundamental rights respected. |
Amendment 5
Proposal for a regulation
Recital 8
|
|
Text proposed by the Commission |
Amendment |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
_________________ |
_________________ |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
Amendment 6
Proposal for a regulation
Recital 9
|
|
Text proposed by the Commission |
Amendment |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. This Regulation should also respect the competences of Member States to adopt laws promoting freedom and pluralism of the media as well as cultural and linguistic diversity. This Regulation should not affect Member States’ freedom to regulate more strongly issues on which those other acts leave Member States the possibility of adopting certain measures at national level. In the event of a conflict between Directive 2010/13/EU as amended and this Regulation, Directive 2010/13/EU as well as the national measures taken in accordance with that Directive should prevail. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. |
_________________ |
_________________ |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
Amendment 7
Proposal for a regulation
Recital 10
|
|
Text proposed by the Commission |
Amendment |
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council,31 , Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34 , Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36 , as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37 , and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions. |
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council,31 , Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34 , Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36 , as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37 , Directive 2013/11/EU of the European Parliament and of the Council37a, Directive 2006/123/EC of the European Parliament and of the Council37b, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions. |
_________________ |
_________________ |
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). |
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). |
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57). |
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57). |
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37. |
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37. |
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. |
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. |
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. |
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. |
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts. |
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts. |
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules |
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules |
|
37a Directive 2013/11/EU of the European Parliament and of the Council of 21 May 2013 on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Directive on consumer ADR) (OJ L 165, 18.6.2013, p. 63). |
|
37b Directive 2006/123/EC of the European Parliament and of the Council of 12 December 2006 on services in the internal market (OJ L 376, 27.12.2006, p. 36). |
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1). |
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1). |
Amendment 8
Proposal for a regulation
Recital 11
|
|
Text proposed by the Commission |
Amendment |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected. |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 of the European Parliament and of the Council1a, which establish specific rules and procedures that should remain unaffected and are lex specialis, prevailing over this Regulation. |
|
________________ |
|
1a Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (OJ L 130, 17.5.2019, p. 92). |
Amendment 9
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, accessible, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant, dangerous or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material, the provision of illegal services such as hosting services on short-term accommodation rental platforms which do not conform to Union or national law, or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
Amendment 10
Proposal for a regulation
Recital 12 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(12a) Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, should not be considered as illegal content. Similarly, material, such as an eye-witness video of a potential crime, should not be considered as illegal, merely because it depicts an illegal act. An assessment should determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes. |
Amendment 11
Proposal for a regulation
Recital 13
|
|
Text proposed by the Commission |
Amendment |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, search engines, social networks, content-sharing platforms, online marketplaces, live streaming platforms or instant messaging services providers should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of the principal service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
Amendment 12
Proposal for a regulation
Recital 14
|
|
Text proposed by the Commission |
Amendment |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a large or potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
_________________ |
_________________ |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
Amendment 13
Proposal for a regulation
Recital 16
|
|
Text proposed by the Commission |
Amendment |
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union. |
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity, consistency, predictability, accessibility and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union. |
Amendment 14
Proposal for a regulation
Recital 18
|
|
Text proposed by the Commission |
Amendment |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The provider of intermediary services is considered to play an active role when it organises and references the content, regardless of whether this is automated or not. |
Amendment 15
Proposal for a regulation
Recital 20
|
|
Text proposed by the Commission |
Amendment |
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
(20) A provider of intermediary services the main purpose of which is to engage in or facilitate illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
Amendment 16
Proposal for a regulation
Recital 21
|
|
Text proposed by the Commission |
Amendment |
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. |
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, such as network management, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. |
Amendment 17
Proposal for a regulation
Recital 22
|
|
Text proposed by the Commission |
Amendment |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights of the European Union, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
Amendment 18
Proposal for a regulation
Recital 23
|
|
Text proposed by the Commission |
Amendment |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, including online financial transactions, providers of hosting services, online platforms and other service providers such as marketplaces that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue, including online financial transactions, in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
Amendment 19
Proposal for a regulation
Recital 25
|
|
Text proposed by the Commission |
Amendment |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
(25) In order to create legal certainty, ensuring that the regulatory framework provisions are applied in a proportional manner, and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken in order to detect, identify and act against illegal content on a voluntary basis should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
Amendment 20
Proposal for a regulation
Recital 27
|
|
Text proposed by the Commission |
Amendment |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, although they do not fall within the obligations under this Regulation, can also benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
Amendment 21
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Member States are prevented from imposing a monitoring obligation on service providers only with respect to obligations of a general nature, imposing constant content identification from the entirety of available content. This does not concern monitoring obligations in a specific case, where set down in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implements Union acts, in accordance with the conditions established in this Regulation and other Union lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content, or as an obligation to use automated content-filtering tools. Equally, nothing in this Regulation should prevent providers from enacting end-to-end encryption of their services. |
Amendment 22
Proposal for a regulation
Recital 28 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(28a) Providers of intermediary services should not be obliged to use automated tools for content moderation, as such tools have difficulties in effectively understanding the subtlety of context and meaning in human communication, which is necessary to determine whether assessed content violates the law or terms of service. |
Amendment 23
Proposal for a regulation
Recital 29
|
|
Text proposed by the Commission |
Amendment |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the Charter of Fundamental Rights of the European Union, on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations, often leading to fragmentation of the internal market. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain uniform conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders. The applicable rules on the mutual recognition of court decisions should be unaffected. |
Amendment 24
Proposal for a regulation
Recital 30
|
|
Text proposed by the Commission |
Amendment |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the Charter of Fundamental Rights of the European Union and in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
Amendment 25
Proposal for a regulation
Recital 31
|
|
Text proposed by the Commission |
Amendment |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. In this context and to maintain proportionality, orders addressed to a provider that has its main establishment in another Member State or outside the Union should be limited to the Member State issuing the order, unless the legal basis for the order is Union law. |
Amendment 26
Proposal for a regulation
Recital 32
|
|
Text proposed by the Commission |
Amendment |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information should include legally collected information, such as the relevant e-mail addresses, telephone numbers, and other contact details necessary to ensure such compliance. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. |
Amendment 27
Proposal for a regulation
Recital 33
|
|
Text proposed by the Commission |
Amendment |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law in accordance with Union law, including the Charter of Fundamental Rights of the European Union, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
Amendment 28
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure an accessible, safe and transparent online environment, it is necessary to establish a clear, predictable and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities, ensuring the right balance between support for innovation on the one hand and protection for consumers and users on the other. |
Amendment 29
Proposal for a regulation
Recital 35
|
|
Text proposed by the Commission |
Amendment |
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online. |
(35) In order to make sure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, in whole or in part, to those providers of intermediary services that are not-for-profit or pursue a public interest mission and are SMEs without any systemic risk related to illegal content. Providers should present justified reasons why they should be issued a waiver. The Commission should examine such an application and has the authority to issue or revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions, containing a description of why the provider qualifies for a waiver. |
Amendment 30
Proposal for a regulation
Recital 36
|
|
Text proposed by the Commission |
Amendment |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
Amendment 31
Proposal for a regulation
Recital 37
|
|
Text proposed by the Commission |
Amendment |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In order to avoid disproportionate burden, micro and small enterprises as defined in Commission Recommendation 2003/361/EC1a should be exempt from the obligation to designate a legal representative. |
|
____________ |
|
1a Commission Recommendation of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 32
Proposal for a regulation
Recital 38
|
|
Text proposed by the Commission |
Amendment |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are fair, non-discriminatory and transparent, and are drafted in a clear and unambiguous language in line with applicable Union law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, human review, the legal consequences to be faced by the users for knowingly storing or uploading illegal content as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available. |
Amendment 33
Proposal for a regulation
Recital 38 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(38a) Providers can take voluntary measures for general assessments of potential risks related to their services, for example in relation to minors. Those measures should not lead to any new profiling, tracking or identification obligations on providers of intermediary services. |
Amendment 34
Proposal for a regulation
Recital 38 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(38b) The exemptions from liability established in this Regulation should not be available to providers of intermediary services that do not comply with the obligations set out in this Regulation. The non-compliance may affect the possibility of benefiting from the liability exemption, as the aim of this Regulation is to ensure that the standards to qualify for such exemptions contribute to a high level of safety and trust in the online environment. |
Amendment 35
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Recommendation 2003/361/EC. In any public versions of such reports providers of intermediary services should remove any information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. |
_________________ |
|
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
|
Amendment 36
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Online platforms should try to prevent content that has already been identified as illegal and removed on the basis of a prior notice from reappearing. The application of that requirement should not lead to any general obligation and should be subject to human review. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Furthermore, the notice and action mechanism should be complemented by ‘stay down’ provisions, whereby providers of hosting services should demonstrate their best efforts to prevent the reappearance of content which is identical to another piece of content that they have already identified and removed as illegal. The application of this requirement should not lead to any general monitoring obligation. |
Amendment 37
Proposal for a regulation
Recital 40 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(40a) Notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as to an online platform and not to the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact. |
Amendment 38
Proposal for a regulation
Recital 40 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(40b) Hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual, or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or the provider has reason to believe that the recipient has failed to act or is otherwise acting in bad faith, the hosting provider may suspend its service to that recipient in line with its terms and conditions. |
Amendment 39
Proposal for a regulation
Recital 41 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(41a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. |
Amendment 40
Proposal for a regulation
Recital 42
|
|
Text proposed by the Commission |
Amendment |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have proven to be efficient, proportionate and reliable, that provider may prevent the reappearance of the notified or equivalent illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. However, the information to the recipient should not be required where it relates to spam or to the removal of content similar or identical to content already removed from the same recipient, who has already received a statement. |
Amendment 41
Proposal for a regulation
Recital 43
|
|
Text proposed by the Commission |
Amendment |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation or are held or controlled by entities established outside the Union. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
_________________ |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 42
Proposal for a regulation
Recital 43 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(43a) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
Amendment 43
Proposal for a regulation
Recital 43 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(43b) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
Amendment 44
Proposal for a regulation
Recital 44
|
|
Text proposed by the Commission |
Amendment |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. Dispute resolution proceedings should be concluded within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
Amendment 45
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests or those of individual rightholders and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The same should be granted to applicants within the meaning of Regulation (EU) No 608/2013 or in case of complaints pursuant to Regulation (EU) 2019/1020, so as to ensure that existing rules regarding customs enforcement or consumer protection are effectively implemented in online sales. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
__________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 46
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by repeatedly providing illegal content, facilitating the repeated uploading of illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend or terminate their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 47
Proposal for a regulation
Recital 48
|
|
Text proposed by the Commission |
Amendment |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing upon request all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
_________________ |
_________________ |
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
Amendment 48
Proposal for a regulation
Recital 48 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(48a) Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all available relevant information. |
Amendment 49
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential and accurate information to the providers of online marketplaces, including for purposes of promoting messages on products or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online marketplaces should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
Amendment 50
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online marketplaces covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. Additionally, the information provided by the trader should be sufficiently specific and supported, where possible. However, the online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online marketplaces, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. |
_________________ |
_________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
Amendment 51
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
Amendment 52
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic and financial transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, including regarding misleading information or any other types of illegal content, there being no alternative and less restrictive measures that would effectively achieve the same result. |
Amendment 53
Proposal for a regulation
Recital 54
|
|
Text proposed by the Commission |
Amendment |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses may have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
Amendment 54
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous and counterfeit products or the display of copyright-infringing content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the misuse of the platforms’ terms and conditions, including content moderation policies, when enforced. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 55
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary and proportionate means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and prevent the intentional manipulation and exploitation of the service, including by the amplification of illegal content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies and the way in which they are enforced fully transparent for users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
Amendment 56
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and relevant public actors. |
Amendment 57
Proposal for a regulation
Recital 60
|
|
Text proposed by the Commission |
Amendment |
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
(60) Given the need to ensure verification by independent experts, large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by researchers vetted by the competent authorities. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement. |
Amendment 58
Proposal for a regulation
Recital 61
|
|
Text proposed by the Commission |
Amendment |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
(61) The audit report should be independent and substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
Amendment 59
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have an impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
Amendment 60
Proposal for a regulation
Recital 62 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(62a) The practice of very large online platforms of associating advertisements with content uploaded by users could indirectly lead to the monetisation and promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, the very large online platforms should ensure, including through standard contractual guarantees to the buyers of advertising space, that the content to which they associate advertisements is legal and compliant with their terms and conditions. Furthermore, the very large online platforms should allow advertisers to have direct access to the results of audits carried out independently and evaluating the commitments and tools of platforms for protecting the brand image of the buyers of advertising space ('brand safety'). |
Amendment 61
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement. In addition, very large online platforms should label any known deep fake video, audio or other files. |
Amendment 62
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination or amplification of illegal content brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for providing information or compelling access to data from very large online platforms to vetted researchers, which meet the conditions set out in this Regulation, where relevant to a research project. All requests for providing information or access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
Amendment 63
Proposal for a regulation
Recital 65 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(65a) Minimum interoperability requirements for very large online platforms could create new opportunities for the development of innovative services, limit the lock-in effects of existing platforms due to network effects and could therefore improve competition and users’ choice. In order to facilitate free choice of recipients between different services, interoperability for industry-standard features of very large online platforms should be considered. Such interoperability could empower recipients to choose a service based on its functionality and features. |
Amendment 64
Proposal for a regulation
Recital 67
|
|
Text proposed by the Commission |
Amendment |
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, as well as the compliance of online platforms with the provisions of these codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
Amendment 65
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation, illegal content or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
Amendment 66
Proposal for a regulation
Recital 69
|
|
Text proposed by the Commission |
Amendment |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan. |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods and the Code of Conduct against illegal hate speech. |
Amendment 67
Proposal for a regulation
Recital 70
|
|
Text proposed by the Commission |
Amendment |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. |
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. |
Amendment 68
Proposal for a regulation
Recital 76
|
|
Text proposed by the Commission |
Amendment |
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. |
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Articles 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. |
Amendment 69
Proposal for a regulation
Recital 77
|
|
Text proposed by the Commission |
Amendment |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Member States should also consider specialised training, in cooperation with Union bodies, offices and agencies, for relevant national authorities, in particular administrative authorities, that are responsible for issuing orders to act against illegal content and provide information. |
Amendment 70
Proposal for a regulation
Recital 78
|
|
Text proposed by the Commission |
Amendment |
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. |
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators. |
Amendment 71
Proposal for a regulation
Recital 91
|
|
Text proposed by the Commission |
Amendment |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, respect for intellectual property, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
Amendment 72
Proposal for a regulation
Recital 97 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(97a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation. |
Amendment 73
Proposal for a regulation
Recital 99
|
|
Text proposed by the Commission |
Amendment |
(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. |
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. |
Amendment 74
Proposal for a regulation
Recital 106 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(106a) In order to promote the freedom of expression and media pluralism online, the importance of editorial content and services must be recognised, requiring intermediary service providers to refrain from removing, suspending or disabling access to such content and services. It follows that intermediary service providers should be exempt from liability for editorial content and services. Intermediary service providers should put mechanisms in place to facilitate the practical application of this principle, for example the flagging of lawful editorial content and services by content providers. Providers of editorial content and services should be identified by the Member State in which the provider is established. Those providers should be understood as performing an economic activity within the meaning of Articles 56 and 57 TFEU. |
Amendment 75
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes: |
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes: |
Amendment 76
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out uniform, proportionate, harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
Amendment 77
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) facilitate innovation, support digital transition, encourage economic growth and create a level playing field for digital services within the internal market; |
Amendment 78
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(bb) protect consumers making use of services falling under this Regulation. |
Amendment 79
Proposal for a regulation
Article 1 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. This Regulation shall apply to instant messaging services used for purposes other than private or non-commercial. |
Amendment 80
Proposal for a regulation
Article 1 – paragraph 5 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) Directive 2010/13/EC; |
(b) Directive 2010/13/EC as amended by Directive 2018/1808/EU; |
Amendment 81
Proposal for a regulation
Article 1 – paragraph 5 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) Union law on copyright and related rights; |
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790; |
Amendment 82
Proposal for a regulation
Article 1 – paragraph 5 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394; |
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation); |
Amendment 83
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ia) Directive (EU) 2019/882; |
Amendment 84
Proposal for a regulation
Article 1 – paragraph 5 – point i b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ib) Directive 2006/123/EC. |
Amendment 85
Proposal for a regulation
Article 1 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. The Commission shall by ... [within one year of the adoption of this Regulation] publish guidelines with regard to the relations between this Regulation and the legislative acts listed in paragraph 5. Those guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, specify which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and identify which regulatory authority is competent. |
Amendment 86
Proposal for a regulation
Article 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 1a |
|
Contractual provisions |
|
Any contractual provisions between an intermediary service provider and a trader, business user or a recipient of its service which are contrary to this Regulation shall be unenforceable. |
Amendment 87
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) ‘active end user’ means an individual successfully accessing an online interface and having significant interaction with it, its product or service; |
Amendment 88
Proposal for a regulation
Article 2 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession; |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession; |
Amendment 89
Proposal for a regulation
Article 2 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession; |
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person marketing products and/or services in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession, or any natural or legal person that is offering goods, digital content or services on a commercial scale; |
Amendment 90
Proposal for a regulation
Article 2 – paragraph 1 – point f – introductory part
|
|
Text proposed by the Commission |
Amendment |
(f) ‘intermediary service’ means one of the following services: |
(f) ‘intermediary service’ means one of the following information society services: |
Amendment 91
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
|
|
Text proposed by the Commission |
Amendment |
‒ a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service; |
‒ a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service, and which does not have any active role in data processing; |
Amendment 92
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 4 (new)
|
|
Text proposed by the Commission |
Amendment |
|
‒ an online platform within the meaning of point (h); |
Amendment 93
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 5 (new)
|
|
Text proposed by the Commission |
Amendment |
|
‒ an online search engine as defined in point (5) of Article 2 of Regulation (EU) 2019/1150; |
Amendment 94
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) ‘live streaming platform services’ means information society services of which the main or one of the main purposes is to give the public access to audio or video material that is broadcast live by its users, which it organises and promotes for profit-making purposes; |
Amendment 95
Proposal for a regulation
Article 2 – paragraph 1 – point g
|
|
Text proposed by the Commission |
Amendment |
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
(g) ‘illegal content’ means any information which, in itself or by its reference to illegal content, products, services or activity, including financial fraud, is not in compliance with Union law or the criminal, administrative or civil legal framework of a Member State, irrespective of the precise subject matter or nature of that law; |
Amendment 96
Proposal for a regulation
Article 2 – paragraph 1 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
(h) ‘online platform’ means a provider of a hosting service which stores and disseminates to the public information and optimises its content, unless that activity is a minor and purely ancillary feature or functionality of the principal service and, for objective and technical reasons, cannot be used without that principal service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation; |
Amendment 97
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ha) ‘online marketplace’ means a service using software, including a website or an application, operated by or on behalf of a trader which allows consumers to conclude distance contracts with other traders or consumers; |
Amendment 98
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(hb) ‘editorial platform’ means an intermediary service which is in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content and which is under the supervision of the editorial team of the publication or other editorial media; |
Amendment 99
Proposal for a regulation
Article 2 – paragraph 1 – point h c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(hc) ‘online social networking service’ means a platform that enables end users to connect, share, discover and communicate with each other across multiple devices and, in particular, via chats, posts, videos and recommendations; |
Amendment 100
Proposal for a regulation
Article 2 – paragraph 1 – point i
|
|
Text proposed by the Commission |
Amendment |
(i) ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties; |
(i) ‘dissemination to the public’ means taking an active role in making information available, at the request of the recipient of the service who provided the information, to a significant and potentially unlimited number of third parties; |
Amendment 101
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ia) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful; |
Amendment 102
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system, used by a very large online platform to suggest, classify, prioritise or organise in its online interface specific information for recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
Amendment 103
Proposal for a regulation
Article 2 – paragraph 1 – point p
|
|
Text proposed by the Commission |
Amendment |
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services, whether automated or processed by a person, aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
Amendment 104
Proposal for a regulation
Article 2 – paragraph 1 – point q
|
|
Text proposed by the Commission |
Amendment |
(q) ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services. |
(q) ‘terms and conditions’ means all terms and conditions or specifications provided by the provider of intermediary services, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services; |
Amendment 105
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making or choice. |
Amendment 106
Proposal for a regulation
Article 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 2a |
|
Digital privacy |
|
1. Where technically possible, a provider of an information society service shall enable the use of and payment for that service without collecting personal data of the recipient. |
|
2. A provider of an information society service shall process personal data concerning the use of the service by a recipient only to the extent strictly necessary to enable the recipient to use the service or to charge the recipient for the use of the service. |
Amendment 107
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, the service provider shall not be liable for the information transmitted, on condition that the provider: |
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, or an improvement of the security of that transmission, the service provider shall not be liable for the information transmitted, on condition that the provider: |
Amendment 108
Proposal for a regulation
Article 3 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
3. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 109
Proposal for a regulation
Article 4 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
2. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 110
Proposal for a regulation
Article 5 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content. |
(b) upon obtaining such knowledge or awareness, expeditiously, decisively and permanently removes or disables access to the illegal content if the content or activity is deemed illegal within the meaning of Article 2(g); |
Amendment 111
Proposal for a regulation
Article 5 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Without prejudice to specific deadlines, set out in Union law or within administrative or legal orders, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event: |
|
(a) within 30 minutes where the illegal content pertains to the broadcast of a live sports or entertainment event; |
|
(b) within 24 hours where the illegal content can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety; |
|
(c) within 72 hours in all other cases where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety. |
Amendment 112
Proposal for a regulation
Article 5 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider. |
2. Paragraph 1 shall not apply: |
|
(a) where the recipient of the service is acting under the authority or the control of the provider; |
|
(b) when the main purpose of the information society service is to engage in or facilitate illegal activities or when the provider of the information society service deliberately collaborates with a recipient of the services in order to undertake illegal activities; |
|
(c) where the provider of intermediary services plays an active role in, for instance, providing, controlling, optimising, classifying, organising, referencing or promoting the content. |
Amendment 113
Proposal for a regulation
Article 5 – Paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Paragraph 1 shall not apply with respect to the liability of an online marketplace, where such a marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
Amendment 114
Proposal for a regulation
Article 5 – Paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
4. This Article shall not affect the possibility for a court or a functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 115
Proposal for a regulation
Article 6 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
1. Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, or national law, in accordance with Union law, including the Charter of Fundamental Rights of the European Union, and the requirements set out in this Regulation. |
Amendment 116
Proposal for a regulation
Article 6 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Paragraph 1 shall apply only when intermediary services are compliant with due diligence obligations laid down in this Regulation. |
Amendment 117
Proposal for a regulation
Article 6 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Voluntary own-initiative investigations shall not lead to ex-ante control measures based on automated content moderation tools. |
Amendment 118
Proposal for a regulation
Article 6 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. Providers of intermediary services shall ensure that measures taken pursuant to paragraph 1 are effective, specific and targeted. Such measures should be accompanied by appropriate safeguards, such as human oversight, documentation, traceability or any additional measures, to ensure that own-initiative investigations are accurate, non-discriminatory, proportionate and transparent and do not lead to the over-removal of content. |
Amendment 119
Proposal for a regulation
Article 7 – title
|
|
Text proposed by the Commission |
Amendment |
No general monitoring or active fact-finding obligations |
No general monitoring or automated content moderation or active fact-finding obligations |
Amendment 120
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
1. No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. Providers of intermediary services shall not be obliged to use automated tools for content moderation. |
Amendment 121
Proposal for a regulation
Article 7 – paragraph 2 (new)
|
|
Text proposed by the Commission |
Amendment |
|
2. This Regulation shall not prevent providers from offering end-to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability. |
Amendment 122
Proposal for a regulation
Article 8 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, received from and issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, including the Charter of Fundamental Rights of the European Union, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
Amendment 123
Proposal for a regulation
Article 8 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order. |
Amendment 124
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
– identification of the competent judicial or administrative authority; |
Amendment 125
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
– reference to the legal basis for the order; |
Amendment 126
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
– information about redress available to the provider of the service and to the recipient of the service who provided the content; |
– information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content; |
Amendment 127
Proposal for a regulation
Article 8 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider; |
Amendment 128
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the order is issued only where no other effective means are available to bring about the cessation or the prohibition of the infringement; |
Amendment 129
Proposal for a regulation
Article 8 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. |
Amendment 130
Proposal for a regulation
Article 8 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Member States shall ensure that providers have a right to appeal and object to implementing the order and shall facilitate the use and access to that right. |
Amendment 131
Proposal for a regulation
Article 8 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
4. The conditions and requirements laid down in this article shall be without prejudice to civil court decisions and requirements under national criminal procedural law in conformity with Union law. |
Amendment 132
Proposal for a regulation
Article 8 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The conditions and requirements laid down in this article shall be without prejudice to data confidentiality and commercial secrecy requirements, in conformity with Union law, including the Charter of Fundamental Rights of the European Union. |
Amendment 133
Proposal for a regulation
Article 8 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. The Commission shall adopt implementing acts, organising a European information exchange system, allowing for secure communication and authentication of authorised orders between relevant authorities, Digital Services Coordinators and providers, as referred to in Articles 8(1), 8a(1) and 9(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. |
Amendment 134
Proposal for a regulation
Article 8 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 8a |
|
Orders to restore lawful content |
|
1. Providers of intermediary services shall, upon the receipt of an order via a secure communications channel to restore a specific item or multiple items of removed content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders without undue delay, specifying the action taken and the moment when the action was taken. |
|
2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions: |
|
(a) the orders contain the following elements: |
|
(i) a statement of reasons explaining why the content in question is legal, by reference to the specific provision of Union or national law or court ruling; |
|
(ii) one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the legal content concerned; |
|
(iii) information about redress available to the provider of the service who removed the content and to the recipient of the service who notified the content; |
|
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; and |
|
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. |
Amendment 135
Proposal for a regulation
Article 9 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. Where no effect has been given to the order, a statement of the provider shall explain the reasons why the information cannot be provided to the national judicial or administrative authority that issued the order. |
Amendment 136
Proposal for a regulation
Article 9 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order. |
Amendment 137
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
– a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences; |
– a statement of reasons according to which the information is required and why this requirement is necessary to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for official reasons related to the prevention, investigation, detection and prosecution of criminal offences; |
Amendment 138
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
– identification of the competent judicial or administrative authority; |
Amendment 139
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
– reference to the legal basis for the order; |
Amendment 140
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
– information about redress available to the provider and to the recipients of the service concerned; |
– information about redress mechanisms available to the provider and to the recipients of the service concerned; |
Amendment 141
Proposal for a regulation
Article 9 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control; |
(b) the order only requires the provider to provide information already legally collected for the purposes of providing the service and which lies within its control, such as e-mail addresses, telephone numbers and other contact details necessary to determine the compliance referred to in point (a); |
Amendment 142
Proposal for a regulation
Article 9 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10; |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider; |
Amendment 143
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the order is issued only where no other effective means are available to receive the same specific item of information; |
Amendment 144
Proposal for a regulation
Article 9 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set out in the Annex to [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters]. |
Amendment 145
Proposal for a regulation
Article 9 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
4. The conditions and requirements laid down in this article shall be without prejudice to civil court decisions and requirements under national criminal procedural law in conformity with Union law. |
Amendment 146
Proposal for a regulation
Article 9 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The conditions and requirements laid down in this Article shall be without prejudice to data confidentiality and commercial secrecy requirements, in conformity with Union law including the Charter of Fundamental Rights of the European Union. |
Amendment 147
Proposal for a regulation
Article 9 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. The obligations under this Article shall not oblige providers of intermediary services to introduce new tracking or profiling techniques for recipients of the service in order to comply with orders to provide information. |
Amendment 148
Proposal for a regulation
Article -10 (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article -10 |
|
Waiver |
|
1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, if they prove that they are: |
|
(a) micro, small and medium enterprises within the meaning of the Annex to Recommendation 2003/361/EC, including when carrying out their activities on a not-for-profit basis or pursuant to a public interest mission; or |
|
(b) a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content. The providers shall present justified reasons for their request; or |
|
(c) editorial platforms within the meaning of Article 2(ha) of this Regulation. |
|
2. The providers of intermediary services carrying out their activities on a not-for-profit basis or pursuant to a public interest mission shall, for the purposes of this Article, be independent from any entity that operates on a for-profit basis. |
|
3. The Commission shall examine such an application and, after consulting the Board, may issue a waiver, in whole or in part, from the requirements of this Chapter. |
|
4. Upon the request of the Board or the provider, or on its own initiative, the Commission may review a waiver issued and revoke it in whole or in part. |
|
5. The Commission shall maintain a list of all waivers issued and their conditions and shall make that list available to the public. |
Amendment 149
Proposal for a regulation
Article 11 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services; providers already offering services shall do so as soon as possible, and providers yet to be established shall do so prior to their establishment. The Member States may require very large online platforms to designate a legal representative in their Member State. |
Amendment 150
Proposal for a regulation
Article 11 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resource to cooperate with the Member States’ authorities, the Commission and the Board and comply with those decisions. |
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources in order to guarantee their proper and timely cooperation with the Member States’ authorities, the Commission and the Board and compliance with those decisions. |
Amendment 151
Proposal for a regulation
Article 11 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of intermediary services shall notify the name, address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. |
4. Providers of intermediary services shall notify the name, postal address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. The Digital Service Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity. |
Amendment 152
Proposal for a regulation
Article 11 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. Providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Service Coordinator of the Member State where the enterprise intends to obtain a legal representative facilitate further cooperation and recommend possible solutions, including possibilities for collective representation. |
Amendment 153
Proposal for a regulation
Article 11 – paragraph 5 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
5b. Providers of online social networking services designated as a very large online platform according to Article 25 shall designate a legal representative to be bound by the obligations laid down in this Article at the request of the Digital Services Coordinator of the Member States where this provider offers its services. |
Amendment 154
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall ensure that their terms and conditions prohibit the recipients of their services from providing content that is not in compliance with Union law or the law of the Member State where such information is made available. |
|
The terms and conditions shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format, in the languages in which the service is offered, and include a searchable archive of previous versions of the provider’s terms and conditions, with their dates of application. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions, including information on the available remedies and the possibilities for opt-out, where relevant. |
Amendment 155
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall ensure that any additional restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service are designed with due regard to the fundamental rights as enshrined in the Charter. |
|
Providers of intermediary services shall enforce the restrictions referred to in the first subparagraph in a diligent, objective and proportionate manner, with due regard to the rights and legitimate interests of all parties involved. |
Amendment 156
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Where very large online platforms within the meaning of Article 25 of this Regulation otherwise allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790 and of audiovisual media services within the meaning of Article 1(a) of Directive (EU) 2018/1808, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with their terms and conditions, unless it is illegal content. |
Amendment 157
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. The Digital Services Coordinator of each Member State has the right to request very large online platforms to apply measures and tools of content moderation, including algorithmic decision-making and human review, reflecting the Member State’s socio-cultural context. The framework for this cooperation as well as specific measures related thereto may be laid down in national legislation and shall be notified to the Commission. |
Amendment 158
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Providers of intermediary services shall refrain from using dark patterns or other techniques to encourage the acceptance of terms and conditions, including giving consent to sharing personal and non-personal data. |
Amendment 159
Proposal for a regulation
Article 12 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. The Digital Services Coordinator of each Member State, by means of national legislation, may request a very large online platform to cooperate with the Digital Services Coordinator of the Member State in question in handling cases involving the removal of lawful content online that is taken down erroneously. |
Amendment 160
Proposal for a regulation
Article 12 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 12a |
|
General Risk Assessment and Mitigation Measures |
|
1. Providers of intermediary services shall identify, analyse and assess, at least once a year, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to each of their services and shall include at least risks related to the dissemination of illegal content through their services and to any content that might have a negative effect on potential recipients of the service. |
|
2. Providers of intermediary services shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures for the risks identified, in line with applicable law and their terms and conditions. |
|
3. Providers of intermediary services shall, upon request, explain to the competent Digital Services Coordinator how they undertook the risk assessment and which mitigation measures they put in place. |
|
4. Providers of intermediary services shall give special consideration, in the design, functioning and use of their services, to any actual, potential or foreseeable negative impact on fundamental rights, gender equality, and the protection of minors and people with disabilities. |
Amendment 161
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish, at least once a year, clear, easily accessible, comprehensible, and detailed reports on any content moderation they engaged in during the relevant period. The reports shall be available in searchable archives. Those reports shall include, in particular, information on the following, as applicable: |
Amendment 162
Proposal for a regulation
Article 13 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
(a) the number of orders received from Member States’ authorities, categorised, where possible, by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the time for taking the action specified in those orders; |
Amendment 163
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider; |
Amendment 164
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, where identifiable, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
Amendment 165
Proposal for a regulation
Article 13 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Providers of intermediary services shall ensure that the identities, such as the trademark/logo or other characteristic traits, of trade users providing goods or services on intermediary services are clearly visible alongside the goods or services provided. |
Amendment 166
Proposal for a regulation
Article 13 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider's terms and conditions. |
Amendment 167
Proposal for a regulation
Article 13 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 13a |
|
Online interface design |
|
1. Providers of intermediary services shall refrain from subverting or impairing autonomous decision-making or free choice of a recipient of a service through the design, functioning or operation of online interfaces or a part thereof. In particular, providers shall refrain from: |
|
(a) according visual prominence to one option when asking the recipient of the service for consent or a decision; |
|
(b) repeatedly requesting consent to data processing or requesting a change to a setting or configuration of the service after the recipient of the service has already made its choice; |
|
(c) making the refusal of consent to data processing more difficult or time-consuming for the recipient of the service than giving consent; |
|
(d) making the procedure of cancelling a service more difficult than signing up to it. |
|
2. A choice or decision made by the recipient of the service using an online interface that does not comply with the requirements of paragraph 1 of this Article shall not constitute consent in the sense of Regulation (EU) 2016/679. |
|
3. The Commission shall publish guidelines with a list of specific design patterns that qualify as subverting or impairing the autonomy, decision-making, or choice of the recipients of the service. |
Amendment 168
Proposal for a regulation
Article 13 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 13b |
|
Compliance with obligations for online marketplaces |
|
Online marketplaces shall ensure compliance with the obligations laid down in this Regulation, in order to achieve the objectives of the relevant obligations in an effective manner. |
|
Non-compliance with the obligations laid down in this Regulation may affect the possibility for online marketplaces to benefit from the liability exemption laid down in Article 5(1). |
Amendment 169
Proposal for a regulation
Chapter III – Section 2 – title
|
|
Text proposed by the Commission |
Amendment |
Additional provisions applicable to providers of hosting services, including online platforms |
Additional provisions applicable to providers of hosting services, including online platforms, and to providers of livestreaming platform services and of instant messaging services used for purposes other than private or non-commercial |
Amendment 170
Proposal for a regulation
Article 14 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
1. Instant messaging services used for purposes other than private or non-commercial and providers of hosting services, including online platforms, shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content, or content that is in breach of their terms and conditions. Those mechanisms shall be easy to access, user-friendly, clearly visible on the hosting service’s interface and located close to the content in question, and shall allow for the submission of notices exclusively by electronic means in the language of the individual or entity submitting a notice. |
Amendment 171
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
2. Notices submitted under the mechanisms referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, such that a diligent reviewer can identify the illegality of the content in question or its breach of the terms and conditions. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
Amendment 172
Proposal for a regulation
Article 14 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content; |
(a) a sufficiently substantiated explanation of the reasons why the individual or entity considers the information in question to be illegal content, or content that is in breach of the provider’s terms and conditions; |
Amendment 173
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) a clear indication of the electronic location of that information, enabling the identification of the illegal content, or an indication of why the content, such as the trademark/logo or other characteristic traits, is in breach of the provider’s terms and conditions; |
Amendment 174
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that include the elements referred to in paragraph 2, on the basis of which a diligent economic operator can identify the illegality of the content in question, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
Amendment 175
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Instant messaging services used for purposes other than private or non-commercial and providers of hosting services, including online platforms, without prejudice to Article 5(1), point (b), shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a swift, non-discriminatory and objective manner and in any case within a maximum of 72 hours. Where decisions on the removal or deactivation of access to content are taken, providers of hosting services may take all measures necessary to prevent the same illegal content or equivalent illegal content from reappearing on their service. The application of this paragraph shall not lead to any general monitoring obligation and shall be subject to human review. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. This means, in particular, key information on the procedure followed, the technology used, the criteria and reasoning underpinning the decision and the rationale behind any automated decision-making. |
Amendment 176
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. Providers of hosting services shall, without undue delay and at the latest within the deadlines set out in Article 5 following receipt of the notification, inform consumers who have purchased illegal products between the moment those products were uploaded on the provider’s website and the moment the listing was taken down by the platform following a valid notice. Those measures shall not lead to any new profiling, tracking or identification obligation for providers. |
Amendment 177
Proposal for a regulation
Article 14 – paragraph 6 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
6b. Where providers of hosting services, of live streaming platform services and of instant messaging services used for purposes other than private or non-commercial have previously taken down, removed or deactivated access to illegal content as a result of a notice and a valid claim procedure which did not lead to a successful appeal, they may take all reasonable and proportionate action to block, deactivate or permanently take down the illegal content or any identical content. |
Amendment 178
Proposal for a regulation
Article 14 – paragraph 6 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
6c. The taking down, removal or deactivation of access as defined in paragraph 6 may be annulled by the following measures: a successful appeal, or a judicial ruling by a court with jurisdiction in a Member State, the General Court or the Court of Justice of the European Union. |
Amendment 179
Proposal for a regulation
Article 14 – paragraph 6 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
6d. This Article shall not apply to editorial content provided by a trader assuming editorial responsibility for that content and complying with rules which are in line with Union and national law. |
Amendment 180
Proposal for a regulation
Article 19
|
|
Text proposed by the Commission |
Amendment |
Article 19 |
Article 14a |
Trusted flaggers |
Trusted flaggers |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
1. Online platforms and providers of hosting services shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are immediately processed and decided upon, without prejudice to the implementation of a complaint and redress mechanism. |
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions: |
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions, without prejudice to the implementation of a complaint and redress mechanism: |
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; |
(a) it has particular expertise and competence that could be exercised in one or more Member States, for the purposes of detecting, identifying and notifying illegal content, as well as intentional manipulation and exploitation of the service in the sense of Article 26(1)(c); |
(b) it represents collective interests and is independent from any online platform; |
(b) it represents collective interests or is an individual right holder, and is independent from any online platform, law enforcement, or other governmental or relevant commercial entity; |
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner. |
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner, and in full respect of fundamental rights such as the freedom of expression and information, and it is independent; |
|
(ca) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of alleged illegal or terms-and-conditions-violating content concerned, and the action taken by the provider. In addition, the reports shall identify relationships between the trusted flagger and any online platform, law enforcement, or other governmental or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence. |
|
2a. The conditions set out in paragraph 2 shall allow trusted flaggers’ notifications to be sufficient for the immediate removal or disabling of the content notified by them. |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. This communication shall include the geographical scope within which the trusted flagger competence was recognised based on the approval of a particular Digital Services Coordinator and information on expertise and competence declared by the trusted flagger. |
|
3a. Member States may recognise entities that were awarded the status of trusted flaggers in another Member State as a trusted flagger on their own territory. Upon request by a Member State, trusted flaggers can be awarded the status of European trusted flagger by the Board, in accordance with Article 48(2). The Commission shall keep a register of European trusted flaggers. |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform or a provider of hosting services has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inadequately substantiated or incorrect notices, or notices violating recipients’ fundamental rights or aiming at distorting competition, through the mechanisms referred to in Article 14, including information gathered in connection with the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform or a provider of hosting services pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. The Digital Services Coordinator may also take into account any evidence according to which the entity has used its status to distort competition. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
7. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6. |
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 4a, 6 and 7. |
(Article 19 is placed after Article 14 and is amended.)
Amendment 181
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing, disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access or the restriction of visibility, of the decision and provide a clear and specific statement of reasons for that decision. |
Amendment 182
Proposal for a regulation
Article 15 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where the removal of or disabling of access to specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the requirement to inform the recipient set out in paragraph 1 of this Article may be postponed by a period of six weeks in order to avoid interfering with potential ongoing criminal investigations. The period of six weeks may be renewed only following a reasoned decision of the competent authority to which the specific items of information have been transmitted. |
Amendment 183
Proposal for a regulation
Article 15 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; |
(a) whether the decision entails the removal of, the disabling of access to, or the restriction of the visibility of the information and, where relevant, the territorial scope of the disabling of access or of the restriction of visibility; |
Amendment 184
Proposal for a regulation
Article 15 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not include personal data. |
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a database managed by the Commission which is accessible to national and European authorities. That information shall not include personal data. |
Amendment 185
Proposal for a regulation
Article 15 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Paragraphs 2, 3 and 4 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, those paragraphs shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status. |
Amendment 186
Proposal for a regulation
Article 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15a |
|
Preservation of content and related data, and mandatory transmission of specific items of information |
|
1. Providers of hosting services shall store the illegal content which has been removed or access to which has been disabled as a result of content moderation, or of an order to act against a specific item of illegal content as referred to in Article 8, as well as any related data removed as a consequence of the removal of such illegal content, which are necessary for: |
|
(a) administrative or judicial review or out-of-court dispute settlement against a decision to remove or disable access to illegal content and related data; or |
|
(b) the prevention, detection, investigation and prosecution of criminal offences. |
|
2. Illegal content referred to in this Article means content related to human trafficking and child pornography, as well as content publicly inciting to violence directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin, in accordance with Council Framework Decision 2008/913/JHA1a and Directive 2011/36/EU of the European Parliament and of the Council1b. |
|
3. Providers of hosting services shall store the illegal content and related data referred to in paragraph 1 for six months from the date of removal or disabling of access to it. The illegal content shall, upon request from the competent authority or court, be stored for a further specified period only if and for as long as necessary for ongoing administrative or judicial review as referred to in paragraph 1, point (a). |
|
4. Providers of hosting services shall ensure that the illegal content and related data stored pursuant to paragraph 1 are subject to appropriate technical and organisational safeguards. Those technical and organisational safeguards shall ensure that the illegal content and related data stored are accessed and processed only for the purposes referred to in paragraph 1 and shall ensure a high level of security of personal data concerned. Providers of hosting services shall review and update those safeguards where necessary. |
|
5. Providers of hosting services shall transmit to the competent authorities of the Member States the illegal content which has been removed or access to which has been disabled, whether such removal or disabling of access is the result of voluntary content moderation or of the use of the notice and action mechanism referred to in Article 14. They shall transmit that illegal content under the following conditions: |
|
(a) the illegal content is content referred to in paragraph 2 of this Article; and |
|
(b) the competent law enforcement authority to receive such illegal content is that of the Member State of the residence or establishment of the person who made the illegal content available, or, failing that, the law enforcement authority is that of the Member State in which the provider of hosting services is established or has its legal representative, or, failing that, the provider of hosting services shall inform Europol; |
|
(c) when the provider of hosting services is a very large online platform in accordance with Section 4 of Chapter III, it shall, when transmitting the illegal content, add a flag indicating that the illegal content involves a threat to the life or safety of persons. |
|
6. Each Member State shall notify to the Commission the list of its competent law enforcement authorities for the purposes of paragraph 5. |
|
_______________ |
|
1a Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law (OJ L 328, 6.12.2008, p. 55). |
|
1b Directive 2011/36/EU of the European Parliament and of the Council of 5 April 2011 on preventing and combating trafficking in human beings and protecting its victims, and replacing Council Framework Decision 2002/629/JHA (OJ L 101, 15.4.2011, p. 1). |
Amendment 187
Proposal for a regulation
Article 15 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15b |
|
Notification of suspicions of serious criminal offences |
|
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. |
|
2. Where a provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or shall inform Europol. |
|
3. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified. |
|
4. For the purpose of this Article, the Member State concerned shall be the Member State where the serious criminal offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected serious criminal offence resides or is located. |
|
5. For the purpose of this Article, each Member State shall notify to the Commission the list of its competent law enforcement or judicial authorities. |
Amendment 188
Proposal for a regulation
Article 15 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15c |
|
Principles for content management |
|
1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, proportionate to the type and volume of content, relevant and limited to what is necessary in relation to the purposes for which the content is managed. Content hosting platforms shall be accountable for ensuring that their content management practices are fair, transparent and proportionate. |
|
2. Users shall not be subjected to discriminatory practices, exploitation or exclusion for the purposes of content moderation by the content hosting platforms, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class. |
|
3. Content hosting platforms shall provide the users with sufficient information on their content curation profiles and the individual criteria according to which content hosting platforms curate content for them, including information as to whether algorithms are used and their objectives. |
|
4. Content hosting platforms shall provide users with an appropriate degree of influence over the curation of content made visible to them, including the choice of opting out of content curation altogether. In particular, users shall not be subject to content curation without their freely given, specific, informed and unambiguous prior consent. |
Amendment 189
Proposal for a regulation
Article 16 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
This Section shall not apply to online platforms that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC, or to online platforms that no longer qualify as micro or small enterprises, and that are not owned by entities having their establishment outside the Union. |
Amendment 190
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
Amendment 191
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions whether or not to remove, suspend the possibility of purchase or rental, disable access to or restrict the visibility of the information; |
Amendment 192
Proposal for a regulation
Article 17 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients; |
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients; |
Amendment 193
Proposal for a regulation
Article 17 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) decisions to suspend or terminate the recipients’ account. |
(c) decisions whether or not to suspend or terminate the recipients’ account. |
Amendment 194
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) decisions whether or not to restrict the ability to monetize content provided by the recipients. |
Amendment 195
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cb) decisions of online marketplaces to suspend the provision of their services to traders; |
Amendment 196
Proposal for a regulation
Article 17 – paragraph 1 – point c c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cc) decisions that adversely affect the recipient’s access to significant features of the platform’s regular services; |
Amendment 197
Proposal for a regulation
Article 17 – paragraph 1 – point c d (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cd) decisions not to act upon a notice; |
Amendment 198
Proposal for a regulation
Article 17 – paragraph 1 – point c e (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ce) decisions whether or not to apply additional labels or additional information to content provided by the recipients. |
Amendment 199
Proposal for a regulation
Article 17 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months as set out in paragraph 1 of this Article shall be considered to start from the day on which the recipient was informed in accordance with Article 15(2). |
Amendment 200
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner and without undue delay. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
Amendment 201
Proposal for a regulation
Article 17 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants and the individuals or bodies which submitted a referral linked to the complainant’s request of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. The decision mentioned in this paragraph shall also include: |
|
– information on whether the decision referred to in paragraph 1 was taken as a result of human review; |
|
– in case the decision referred to in paragraph 1 is upheld, a detailed explanation of how the information to which the complaint relates is in breach of the platform’s terms and conditions or why the online platform considers the information to be unlawful. |
Amendment 202
Proposal for a regulation
Article 17 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. |
5. Online platforms shall ensure that recipients of the service may contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. |
Amendment 203
Proposal for a regulation
Article 17 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. Online platforms shall ensure that any relevant information in relation to decisions taken by the internal complaint-handling mechanism is available to recipients of the service for the purpose of seeking redress through an out-of-court dispute settlement body pursuant to Article 18 or before a court. |
Amendment 204
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
1. After internal complaint-handling mechanisms are exhausted, recipients of the service and individuals or entities that have submitted notices, addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Amendment 205
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. |
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. Judicial redress against a decision by an out-of-court dispute settlement body shall be directed against the online platform, not the settlement body. |
Amendment 206
Proposal for a regulation
Article 18 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where a recipient seeks resolution of multiple complaints, either party may request that the out-of-court dispute settlement body treat and resolve those complaints in a single dispute decision. |
Amendment 207
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions: |
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall certify the body, where the body has demonstrated that it meets all of the following conditions: |
Amendment 208
Proposal for a regulation
Article 18 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms; |
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms, including aspects such as financial resources and personnel, and is legally distinct from and functionally independent of the government of the Member State or any other public or private body as well as of individuals or entities that have submitted notices; |
Amendment 209
Proposal for a regulation
Article 18 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the dispute settlement is easily accessible through electronic communication technology; |
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology; |
Amendment 210
Proposal for a regulation
Article 18 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union; |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner that is accessible for persons with disabilities, and in at least one official language of the Union; |
Amendment 211
Proposal for a regulation
Article 18 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible. |
Amendment 212
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. |
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. |
|
Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time. |
Amendment 213
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. |
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. Out-of-court dispute settlement procedures shall preferably be free of charge for the recipient. |
Amendment 214
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 3
|
|
Text proposed by the Commission |
Amendment |
Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the services and the online platform concerned before engaging in the dispute settlement. |
Certified out-of-court dispute settlement bodies shall make information on the fees, or the mechanisms used to determine the fees, publicly available. |
Amendment 215
Proposal for a regulation
Article 18 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Decisions reached by an out-of-court dispute settlement body shall not be disputable by another out-of-court dispute settlement body and the resolution of a particular dispute may only be discussed in one out-of-court dispute settlement body. |
Amendment 216
Proposal for a regulation
Article 19 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 19a |
|
Accessibility requirements for online platforms |
|
1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I to Directive (EU) 2019/882. |
|
2. Providers of online platforms shall prepare the necessary information in accordance with Annex V to Directive (EU) 2019/882 as well as information, forms and measures provided pursuant to this Regulation and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public including in a manner which is accessible to persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation. |
|
3. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services. |
|
4. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. |
|
5. Providers of online platforms shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements. |
|
6. Online platforms which are in conformity with harmonised standards or parts thereof the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements. |
|
7. Online platforms which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements. |
Amendment 217
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Providers of hosting services and online platforms shall, after having issued a prior warning, enable, suspend, for a specified period of time, or terminate the provision of their services to recipients of the service that repeatedly provide illegal content. The online platform may request support from the Digital Services Coordinator to establish the frequency at which account suspension is deemed necessary and to set the duration of the suspension. |
Amendment 218
Proposal for a regulation
Article 20 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Online marketplaces shall publish the information on traders suspended pursuant to paragraph 1 of this Article gathered in accordance with Article 22(1) in the database as referred to in Article 15(4). When the suspension expires, the data should be deleted from that database. |
Amendment 219
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Providers of hosting services and online platforms shall, after having issued at least three prior warnings, suspend, for a specified period of time, or terminate the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
Amendment 220
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following: |
3. Providers of hosting services and online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to them. Those circumstances shall include at least the following: |
Amendment 221
Proposal for a regulation
Article 20 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
(a) the absolute numbers of items of illegal content or unfounded notices or complaints, submitted in the past year; |
Amendment 222
Proposal for a regulation
Article 20 – paragraph 3 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the intention of the recipient, individual, entity or complainant. |
(d) where identifiable, the intention of the recipient, individual, entity or complainant. |
Amendment 223
Proposal for a regulation
Article 20 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where: |
|
(a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient; |
|
(b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or |
|
(c) the items removed were related to content covered by Directive 2011/93/EU or Directive (EU) 2017/541. |
Amendment 224
Proposal for a regulation
Article 20 – paragraph 3 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
3b. The assessment must be carried out by qualified staff provided with dedicated training on the applicable legal framework. |
Amendment 225
Proposal for a regulation
Article 20 – paragraph 3 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
3c. Without prejudice to Article 4 of Regulation (EU) 2019/1150, providers of hosting services shall do all in their power to ensure that users which have been suspended from the service cannot use it again until such time as the suspension is lifted. |
|
Where an online platform stops providing its services to a trade user, it shall provide that user, at least 15 days before the termination comes into force, with the reasons for its decision and shall inform it of the possibility to challenge the decision under Article 17. |
Amendment 226
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Providers of hosting services and online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
Amendment 227
Proposal for a regulation
Article 21 – paragraph 2 – second subparagraph
|
|
Text proposed by the Commission |
Amendment |
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. |
deleted |
Amendment 228
Proposal for a regulation
Article 21 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Unless instructed otherwise by the informed authority, the provider shall remove or disable the content. It shall store all content and related data for at least six months. |
Amendment 229
Proposal for a regulation
Article 21 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified. |
Amendment 230
Proposal for a regulation
Article 22 – title
|
|
Text proposed by the Commission |
Amendment |
Traceability of traders |
Traceability of traders on online marketplaces |
Amendment 231
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: |
1. The online marketplace shall ensure that professional traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online marketplace has obtained the following information: |
Amendment 232
Proposal for a regulation
Article 22 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the bank account details of the trader, where the trader is a natural person; |
(c) the payment account details of the trader; |
Amendment 233
Proposal for a regulation
Article 22 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation] or any relevant act of Union law; |
_________________ |
_________________ |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
Amendment 234
Proposal for a regulation
Article 22 – paragraph 1 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law. |
(f) a self-certification by the trader committing to only offer products, services or content, including advertisements, that comply with the applicable rules of Union law. |
Amendment 235
Proposal for a regulation
Article 22 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. |
2. The online marketplace shall, upon receiving that information, make reasonable efforts to assess whether that information is reliable through requests to the trader to provide supporting documents from reliable sources. |
Amendment 236
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
3. Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
3. Where the online marketplace obtains indications that the information under paragraph 1, point (f), is inaccurate, it shall remove the product or service directly from its online platform, and if any other item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that online marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Amendment 237
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
Where the trader fails to correct or complete that information, the online platform shall suspend the provision of its service to the trader until the request is complied with. |
Where the trader fails to correct or complete that information, the providers of online marketplaces shall suspend the provision of their services to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with. |
Amendment 238
Proposal for a regulation
Article 22 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The providers of online marketplaces shall ensure that traders are given the ability to discuss any information viewed as inaccurate or incomplete directly with the provider before any suspension of services. This may take the form of the internal complaint-handling system under Article 17. |
Amendment 239
Proposal for a regulation
Article 22 – paragraph 3 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation. |
Amendment 240
Proposal for a regulation
Article 22 – paragraph 3 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online marketplace without delay of any changes to that information. |
Amendment 241
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The online marketplace shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information. |
Amendment 242
Proposal for a regulation
Article 22 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
5. Without prejudice to paragraph 2, the online marketplaces shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
Amendment 243
Proposal for a regulation
Article 22 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
6. The online marketplaces shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
Amendment 244
Proposal for a regulation
Article 22 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. |
7. The online marketplace shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. |
Amendment 245
Proposal for a regulation
Article 22 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22a |
|
Obligation to provide information |
|
1. The online interface made available to the trader shall allow access to at least the following information: |
|
(a) the information referred to in Article 22(6); |
|
(b) the information requirements provided for in Articles 6 and 8 of Directive 2011/83/EU; |
|
(c) the information allowing for the unequivocal identification of the product or the service, and, where applicable, the CE marking and the warnings, information and labels, which are mandatory under applicable legislation on product safety and product compliance. |
|
The Commission shall adopt an implementing act listing the items of information required in accordance with the first subparagraph. That implementing act shall be adopted no later than ... [one year after entry into force of this Regulation]. |
|
2. The online marketplace shall check that the information provided by the trader is complete with regard to the lists of information items referred to in points (a) and (b) of Article 22(2) before the offer for the product or service is made available online and shall not authorise the trader to make available such an offer for as long as the information remains incomplete. |
|
3. Where the online marketplace establishes that the information provided by the trader for an offer that has already been published online is no longer relevant and must be completed, it shall suspend the offer without delay or make it inaccessible and ask the trader to complete that information as soon as possible. |
Amendment 246
Proposal for a regulation
Article 22 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22b |
|
Additional obligations of online marketplaces |
|
1. Where an online marketplace becomes aware of the illegal nature of a product or service offered by a trader on its interface, it shall: |
|
(a) immediately remove the illegal product or service from its interface and inform the authorities thereof; |
|
(b) maintain an internal database of content removed and/or recipients suspended pursuant to Article 20 to be used by internal content moderation systems tackling the identified risks; |
|
(c) where the online marketplace has the contact details of the recipients of its services, inform such recipients of the service that have purchased said product or service during the past twelve months about the illegality, the identity of the trader and options for seeking redress; |
|
(d) compile and make publicly available through application programming interfaces a repository containing information about illegal products and services removed from its platform in the past six months, along with information about the concerned trader and options for seeking redress. |
Amendment 247
Proposal for a regulation
Article 22 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22c |
|
Obligations relating to illegal offers from traders |
|
1. The online marketplaces shall take adequate measures in order to prevent the dissemination by traders using their services of offers for products or services which do not comply with Union law or the law of any Member State on the territory of which those offers are made available. |
|
2. Where the online marketplace obtains an indication, including the elements listed in points (a) and (b) of Article 14(2), that an item of information referred to in Article 22a(1) is inaccurate, that online marketplace shall request the trader to give evidence of the accuracy of that item of information or to correct it, without delay. Where the trader does not provide evidence that the item of information is accurate or that the correction made is valid, the online marketplace shall suspend the offer for the product or service until the trader has complied with the request. |
|
3. Before the trader’s offer is made available on the online marketplace, the online marketplace shall verify, with regard to the information allowing for the unequivocal identification of the product, including the information referred to in point (b) of Article 22a(1), whether the offer that the trader wishes to propose to consumers located in the Union is mentioned in the list, or the lists, of products or categories of products identified as not compliant, as classified in any freely accessible official online database or online interface whose reference is established by the Commission by means of an implementing act adopted no later than ... [one year after entry into force of this Regulation], and shall not authorise the trader to provide the offer if that verification determines that the product is so listed. |
Amendment 248
Proposal for a regulation
Article 22 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22d |
|
Obligations relating to illegal offers from traders with regard to the applicable law on product safety and product compliance |
|
1. As soon as a market surveillance authority, a customs authority, a rights owner or a consumer organisation informs the online marketplace that an offer for a product or service is illegal under applicable law on product safety and product compliance, that online marketplace shall remove the offer or disable access to it. |
|
The online marketplace shall inform the trader who has published the illegal offer of the decision taken pursuant to this paragraph in accordance with Articles 15 and 17. |
|
When it informs the trader of the decision to remove the offer or disable access to it, and where the illegality of the offer relates to a default of the product or service which may endanger the health or the safety of consumers, the online marketplace shall request the trader to provide all information able to demonstrate that it has taken appropriate corrective action in accordance with Article 16(3) of Regulation (EU) 2019/1020. |
|
2. Where the online marketplace receives no reply from the trader within 48 hours from the date of the request referred to in paragraph 1 of this Article, it shall take the necessary corrective action referred to in points (c), (d) and (g) of Article 16(3) of Regulation (EU) 2019/1020 without undue delay. |
|
3. The online marketplace shall inform without delay the market surveillance authority or the customs authority of the action taken by the trader or on its own initiative for the application of paragraphs 1 and 2. As soon as a market surveillance authority or a customs authority orders the trader to undertake alternative or additional measures and informs the online marketplace accordingly, that online marketplace shall request the trader to provide all information proving that it has given due effect to the order. |
|
Where the online marketplace does not receive within 48 hours information that the trader has fully complied with the order, it shall directly implement the alternative measures ordered by the market surveillance authority or the customs authority without undue delay. |
|
4. The online marketplace may charge the trader with the costs of the measures it has taken in accordance with this Article, by any appropriate means. It shall immediately notify the trader of any such measure and inform the trader of its right to contest that decision in accordance with Articles 17 and 18 or by legal action. |
|
The online marketplace shall not require from traders using its services any advance payments of costs related to the measures it may take in accordance with this Article, nor shall it make access to its services conditional on the acceptance of such payments. |
Amendment 249
Proposal for a regulation
Article 22 e (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22e |
|
Suspension of access of traders to the online marketplace services |
|
1. In accordance with Article 20, the online marketplace shall suspend without undue delay the provision of its services to traders that provide, in a repeated manner or continuously, illegal offers for a product or a service. It shall immediately notify the trader of its decision. |
|
2. Where the online marketplace adopts a decision pursuant to paragraph 1, it shall continue to meet its obligations under this Section, in particular regarding consumers who have concluded a contract with the suspended traders. |
|
3. The online marketplace shall inform without delay the competent authority about the decision taken pursuant to paragraph 1. |
Amendment 250
Proposal for a regulation
Article 22 f (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 22f |
|
Right to redress |
|
The online marketplace shall be entitled to redress from the trader who benefited from its services in case of a failure by the trader to comply with its obligations towards the online marketplace or towards consumers, unless the online marketplace has already charged the trader for the costs of the measures it had to take as a consequence. |
|
The consumer shall be entitled to redress from the online marketplace for any failure by the online marketplace to comply with its obligations under this Section. |
Amendment 251
Proposal for a regulation
Article 23 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures; |
(a) the number of disputes submitted to the certified out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures; |
Amendment 252
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of unfounded notices and the submission of unfounded complaints; |
Amendment 253
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions; |
Amendment 254
Proposal for a regulation
Article 23 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. |
4. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. |
Amendment 255
Proposal for a regulation
Article 23 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Where published to the general public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. |
Amendment 256
Proposal for a regulation
Article 24 – title
|
|
Text proposed by the Commission |
Amendment |
Online advertising transparency |
Online advertising transparency and control |
Amendment 257
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
1. Online platforms that directly or indirectly display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
Amendment 258
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed on the interface or parts thereof is an online advertisement; |
Amendment 259
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) clear, meaningful and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed. |
Amendment 260
Proposal for a regulation
Article 24 paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform. |
Amendment 261
Proposal for a regulation
Article 24 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Online platforms that display advertising on their online interfaces shall ensure that advertisers: |
|
(a) can request and obtain information on where their advertisements have been placed; |
|
(b) can request and obtain information on which broker treated their data; |
|
(c) can indicate the specific locations where their advertisements cannot be placed. In the case of non-compliance with this provision, advertisers shall have the right to judicial redress. |
Amendment 262
Proposal for a regulation
Article 24 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. Advertisements that are targeted toward individuals or segments of individuals who are below the age of 18 on the basis of their personal data, behaviour, the tracking of their activities or profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 shall not be permitted. |
Amendment 263
Proposal for a regulation
Chapter III – Section 4 – title
|
|
Text proposed by the Commission |
Amendment |
Additional obligations for very large online platforms to manage systemic risks |
Additional obligations for very large online platforms, live streaming platforms, instant messaging services used for purposes other than private or non-commercial and search engines to manage systemic risks |
Amendment 264
Proposal for a regulation
Article 25 – title
|
|
Text proposed by the Commission |
Amendment |
Very large online platforms |
Very large online platforms, live streaming platforms, instant messaging services used for purposes other than private or non-commercial and search engines |
Amendment 265
Proposal for a regulation
Article 25 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. |
1. This Section shall apply to online platform services, live streaming platform services, instant messaging services used for purposes other than private or non-commercial and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. |
Amendment 266
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platform services, live streaming platform services, instant messaging services used for purposes other than private or non-commercial and search engine services shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis and at least once a year thereafter, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services and activities in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
Amendment 267
Proposal for a regulation
Article 26 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the dissemination of illegal content through their services; |
(a) the dissemination and amplification of illegal content through their services, including unsafe and non-compliant products and services in the case of online marketplaces; |
Amendment 268
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, human dignity, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively, caused by an illegal activity; |
Amendment 269
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) intentional manipulation of their service by means of inauthentic use, such as ‘deep fakes’, or automated exploitation of the service, with an actual or foreseeable negative or illegal effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
Amendment 270
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting risk assessments, very large online platforms shall also take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content. |
Amendment 271
Proposal for a regulation
Article 26 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The outcome of the risk assessment and supporting documents shall be communicated to the Board of Digital Services Coordinators and the Digital Services Coordinator of establishment. |
Amendment 272
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platform services, live streaming platform services, instant messaging services used for purposes other than private or non-commercial and search engine services shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any significant systemic risks stemming from the functioning and use made of their services, identified pursuant to Article 26. Such measures may include, where applicable: |
Amendment 273
Proposal for a regulation
Article 27 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions; |
(a) adapting content moderation or recommender systems, their decision-making processes, design, the features or functioning of their services, or their terms and conditions; |
Amendment 274
Proposal for a regulation
Article 27 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19; |
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 14a; |
Amendment 275
Proposal for a regulation
Article 27 – paragraph 1 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) in the case of very large online marketplaces, taking into account the information on repeat infringers as referred to in Article 20(1a) when starting a contractual relationship with a trader; |
Amendment 276
Proposal for a regulation
Article 27 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively. |
deleted |
Amendment 277
Proposal for a regulation
Article 27 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Very large online platforms shall take adequate measures to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible to the internet user. |
Amendment 278
Proposal for a regulation
Article 27 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation describing the reasons why those measures were not put in place to the Board, in view of issuing specific recommendations, and to independent auditors for the purposes of the audit report. |
|
Following the written explanation of the reasons why the very large online platform did not put in place mitigating measures, and where necessary, the Board shall issue specific recommendations as to the mitigation measures that very large online platforms shall implement instead of those listed in Article 27(1). Very large online platforms shall, within one month of receiving those recommendations, implement the recommended measures. |
|
In the case of repeated failure of a very large online platform to take effective mitigating measures and of repeated non-compliance with the recommendations, the Board may advise the Commission and the Digital Services Coordinators to impose sanctions in accordance with Chapter IV. |
Amendment 279
Proposal for a regulation
Article 27 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
(a) identification and assessment of all systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
Amendment 280
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. |
Amendment 281
Proposal for a regulation
Article 27 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 27a |
|
Mitigation of risks for the freedom of expression and freedom and pluralism of the media |
|
1. Very large online platforms shall ensure that the exercise of the fundamental rights of freedom of expression and freedom and pluralism of the media is always adequately and effectively protected. |
|
2. Where very large online platforms allow for the dissemination of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, of audiovisual media services within the meaning of Article 1(1)(a) of Directive 2010/13/EU (AVMS) or of other editorial media, which are published in compliance with applicable Union and national law under the editorial responsibility and control of a press publisher, audiovisual or other media service provider, who can be held liable under the laws of a Member State, the platforms shall be prohibited from removing, disabling access to, suspending or otherwise interfering with such content or services or suspending or terminating the service providers’ accounts on the basis of the alleged incompatibility of such content with their terms and conditions, as well as on the basis of any self-regulatory or co-regulatory standard or measure, including Codes of Conduct pursuant to Article 35 of this Regulation. The same shall apply to books and films or other expressions of opinion or statements of fact for the purpose of exercising the right to freedom of expression as enshrined in Article 11 of the Charter. |
|
3. Very large online platforms shall ensure that their content moderation, their decision-making processes, the features or functioning of their services, their terms and conditions and recommender systems are objective, fair and non-discriminatory. |
Amendment 282
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: |
Amendment 283
Proposal for a regulation
Article 28 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the obligations set out in Chapter III; |
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27; |
Amendment 284
Proposal for a regulation
Article 29 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The main parameters referred to in paragraph 1 shall include at least the following elements: |
|
(a) the main recommendation criteria; |
|
(b) how these criteria are prioritised; |
|
(c) the optimisation goal of the relevant recommender system; and |
|
(d) the role of recipient behaviour in determining recommender system outputs, if applicable. |
Amendment 285
Proposal for a regulation
Article 29 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
deleted |
Amendment 286
Proposal for a regulation
Article 29 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The parameters used in recommender systems shall always be fair and non-discriminatory. |
Amendment 287
Proposal for a regulation
Article 29 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service. |
Amendment 288
Proposal for a regulation
Article 30 – title
|
|
Text proposed by the Commission |
Amendment |
Additional online advertising transparency |
Additional online advertising transparency and protection |
Amendment 289
Proposal for a regulation
Article 30 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. |
1. Very large online platforms that display advertising on their online interfaces shall compile and make available to relevant authorities and vetted researchers meeting the requirements of Article 31(4), through application programming interfaces, an easily accessible and searchable repository containing the information referred to in paragraph 2, until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. |
Amendment 290
Proposal for a regulation
Article 30 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed or financed; |
Amendment 291
Proposal for a regulation
Article 30 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; |
deleted |
Amendment 292
Proposal for a regulation
Article 30 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically. |
deleted |
Amendment 293
Proposal for a regulation
Article 30 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Very large online platforms selling advertising for display on their online interface shall ensure via standard contractual clauses with the purchasers of advertising space that the content with which the advertisement is associated is compliant with the terms and conditions of the platform, or with the law of the Member States where the recipients of the service to whom the advertisement will be displayed are located. |
Amendment 294
Proposal for a regulation
Article 30 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Very large online platforms shall be prohibited from profiling or targeting minors with personalised advertising, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679. |
Amendment 295
Proposal for a regulation
Article 30 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. The very large online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular: |
|
(a) to withdraw consent or to object to processing; |
|
(b) to obtain access to the personal data concerning the data subject; |
|
(c) to obtain rectification of inaccurate personal data concerning the data subject; |
|
(d) to obtain erasure of personal data without undue delay; |
|
(e) where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) of this paragraph have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679. |
Amendment 296
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall, upon reasoned request of the Digital Services Coordinator of establishment or the Commission, within a reasonable period specified in the request and within a maximum of 72 hours, provide information and full and continuous access to data that are necessary to properly monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. With regard to moderation and recommender systems, very large online platforms shall provide the Digital Services Coordinator or the Commission, upon request, with access to algorithms and associated data that allow the detection of possible biases which could lead to the dissemination of illegal content, content that is in breach of their terms and conditions, or content that presents threats to fundamental rights, including freedom of expression. Where a bias is detected, very large online platforms should expeditiously correct it following the recommendations of the Digital Services Coordinator or the Commission. Very large online platforms should be able to demonstrate their compliance at every step of the process pursuant to this Article. |
Amendment 297
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to relevant data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks as set out in Articles 26 and 27. |
Amendment 298
Proposal for a regulation
Article 31 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 for a limited time and through online databases or application programming interfaces, as appropriate, in an easily accessible and user-friendly format. This shall include personal data only where it is lawfully accessible by the public and without prejudice to Regulation (EU) 2016/679. |
Amendment 299
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted, scientific researchers shall be affiliated with academic institutions, be independent from the commercial interests of the very large online platform from which they seek data and of its competitors, disclose the sources of funding financing their research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 300
Proposal for a regulation
Article 31 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
Amendment 301
Proposal for a regulation
Article 31 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Within 15 days following receipt of a request as referred to in paragraph 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because one of following two reasons: |
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for the following reasons: |
(a) it does not have access to the data; |
(a) in the case of a request under paragraph 1, the very large online platform does not have, and cannot obtain with reasonable effort, access to the data; |
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. |
(b) in the case of a request under paragraph 2, the very large online platform does not have access to the data or providing access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. |
Amendment 302
Proposal for a regulation
Article 31 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. |
deleted |
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request. |
|
Amendment 303
Proposal for a regulation
Article 31 – paragraph 7 (new)
|
|
Text proposed by the Commission |
Amendment |
|
7. Upon completion of the research envisaged in Article 31(2), the vetted researchers shall make their research publicly available, taking into account the rights and interests of the recipients of the service concerned, in compliance with Regulation (EU) 2016/679. |
Amendment 304
Proposal for a regulation
Article 31 – paragraph 8 (new)
|
|
Text proposed by the Commission |
Amendment |
|
8. Digital Services Coordinators and the Commission shall, once a year, report the following information: |
|
(a) the number of requests made to them as referred to in paragraphs 1 and 2; |
|
(b) the number of such requests that have been declined by the Digital Services Coordinator or the Commission and the reasons for which they have been declined; |
|
(c) the number of such requests that have been declined by the Digital Services Coordinator or the Commission, including the reasons for which they have been declined, following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2. |
Amendment 305
Proposal for a regulation
Article 32 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Very large online platforms shall communicate the name and contact details of the compliance officer to the Digital Services Coordinator of establishment and the Commission. |
5. Very large online platforms shall communicate the name and contact details of the compliance officer. |
Amendment 306
Proposal for a regulation
Article 33 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The reports shall include information on content moderation and shall be published in the official languages of the Member States of the Union. |
Amendment 307
Proposal for a regulation
Article 34 – paragraph 1 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) transmission of data between advertising intermediaries in support of transparency obligations pursuant to points (b) and (c) of Article 24. |
deleted |
Amendment 308
Proposal for a regulation
Article 34 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The absence of agreement on voluntary industry standards shall not prevent the applicability or implementation of any measures outlined in this Regulation. |
Amendment 309
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content, as defined in Union and national law, and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
Amendment 310
Proposal for a regulation
Article 35 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
2. Where significant systemic risk within the meaning of Article 26(1) in relation to the dissemination of illegal content emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
Amendment 311
Proposal for a regulation
Article 35 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall ensure that the codes of conduct clearly set out their objectives in relation to the dissemination of illegal content, contain a set of harmonised key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all relevant stakeholders, including citizens, at Union level. The Commission and the Board shall also ensure that participants report regularly to the Commission and their respective Digital Services Coordinators on any measures taken and their outcomes, as measured against the key performance indicators that they contain, in order to facilitate effective cross-platform monitoring. |
Amendment 312
Proposal for a regulation
Article 35 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, shall regularly monitor and evaluate the achievement of their objectives, and shall publish their conclusions. Furthermore, they shall ensure that a common alert mechanism is managed at Union level to allow for real-time and coordinated responses. |
Amendment 313
Proposal for a regulation
Article 35 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. |
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In the case of systematic and repeated failure to comply with the codes of conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the codes of conduct, after prior warning. |
Amendment 314
Proposal for a regulation
Article 36 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30. |
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Article 30 of this Regulation and Article 6 of Directive 2000/31/EC. |
Amendment 315
Proposal for a regulation
Article 36 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30. |
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24; |
|
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30. |
|
Amendment 316
Proposal for a regulation
Article 36 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. |
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those Codes two years after the application of this Regulation. |
Amendment 317
Proposal for a regulation
Article 37 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Member States shall ensure that their Digital Services Coordinators are informed by the relevant national, local and regional authorities of the diversity of platform sectors and issues covered by this Regulation. |
Amendment 318
Proposal for a regulation
Article 37 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it may request the participants to revise the crisis protocol, including by taking additional measures. |
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to review and, where necessary, revise the crisis protocol, including by taking additional measures. |
Amendment 319
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. Those competent authorities shall have the same powers to carry out the tasks or supervise the sectors assigned to them as those attributed to the Digital Services Coordinator for the application and enforcement of this Regulation. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
Amendment 320
Proposal for a regulation
Article 38 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Member States shall ensure that the competent authorities have adequate financial and human resources, as well as legal and technical expertise to fulfil their tasks under this Regulation. |
Amendment 321
Proposal for a regulation
Article 39 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law. |
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law or the allocation of additional powers under other applicable law. |
Amendment 322
Proposal for a regulation
Article 40 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation. |
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapter III and final jurisdiction as to disputes on orders issued under Articles 8 and 9. |
Amendment 323
Proposal for a regulation
Article 40 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. By way of derogation from paragraph 1, the Member State in which the end users have their residence shall have jurisdiction for the purposes of Articles 22, 22a and 22b and the Member State in which the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9. |
Amendment 324
Proposal for a regulation
Article 40 – paragraph 1 b(new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. The Member State where the consumers have their habitual residence shall have jurisdiction for the purposes of Chapter III, Section 3. |
Amendment 325
Proposal for a regulation
Article 40 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Paragraphs 1, 2 and 3 are without prejudice to the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3. |
4. Paragraphs 1, 2 and 3 are without prejudice to Article 43(2), the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3. |
Amendment 326
Proposal for a regulation
Article 41 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the power to adopt interim measures to avoid the risk of serious harm. |
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm. |
Amendment 327
Proposal for a regulation
Article 41 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Following a request to the Commission, and in cases of infringements that persist, could cause serious harm to recipients of the service or could seriously affect their fundamental rights, the Digital Services Coordinator in the Member State where the end users have their residence may be entitled to additional powers in the framework of joint investigations, as referred to in Article 46. |
Amendment 328
Proposal for a regulation
Article 42 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Member States shall lay down the rules on penalties applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are implemented in accordance with Article 41. |
1. Member States shall lay down the rules on penalties, including administrative fines, applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are properly and effectively implemented in accordance with Article 41. |
Amendment 329
Proposal for a regulation
Article 42 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them. |
2. Penalties shall be effective, proportionate and dissuasive. They shall take into particular account the interests of small-scale providers and start-ups and their economic viability. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them. |
Amendment 330
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
1. Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority and shall inform the person who submitted the complaint. |
Amendment 331
Proposal for a regulation
Article 43 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The Digital Services Coordinator of establishment, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established as provided for in paragraph 1, shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established on how the complaint has been handled. |
Amendment 332
Proposal for a regulation
Article 43 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 43a |
|
Rights to effective judicial remedies |
|
1. Without prejudice to any other administrative or non-judicial remedy, any recipient of the service or its representative organisation shall have the right to an effective judicial remedy against a legally binding decision of a Digital Services Coordinator concerning them. |
|
2. In determining whether the very large online platform has complied with its obligations under Article 27(1), and in light of the principle of proportionality, the availability of suitable and effective measures shall be taken into account. |
|
3. Without prejudice to any other administrative or non-judicial remedy, any recipient of the service or its representative organisation shall have the right to an effective judicial remedy where the Digital Service Coordinator which is competent pursuant to Articles 40 and 43 does not handle a complaint or does not inform the recipient of the service within three months of the progress or outcome of the complaint lodged pursuant to Article 43. |
|
Proceedings against a Digital Services Coordinator under this paragraph shall be brought before the courts of the Member State where the Digital Services Coordinator is established. |
Amendment 333
Proposal for a regulation
Article 44 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board. |
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission, to the European Parliament and to the Board. |
Amendment 334
Proposal for a regulation
Article 44 – paragraph 1a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Based on the annual reports communicated by the Digital Services Coordinators, the Commission shall submit to the European Parliament and to the Council a dedicated biennial report analysing the aggregated data on orders referred to in Articles 8, 8a and 9 and issued by the Digital Services Coordinators, with special attention paid to potential abusive use of those Articles. The report shall provide a comprehensive overview of the orders to act against illegal content and shall make it possible, for a specific period of time, to assess the activities of Digital Services Coordinators. |
Amendment 335
Proposal for a regulation
Article 44 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned; |
(a) the number and subject matter of orders to act against illegal content and orders to provide information, including at least information on the name of the issuing authority, the name of the provider and the type of action specified in the order, issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned; |
Amendment 336
Proposal for a regulation
Article 45 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. A request or recommendation pursuant to paragraph 1 shall not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service. |
Amendment 337
Proposal for a regulation
Article 45 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. |
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1 and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance without undue delay. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. |
Amendment 338
Proposal for a regulation
Article 45 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto, as well as a statement of reasons where it decides, following its investigation, not to take measures to ensure compliance with this Regulation. |
Amendment 339
Proposal for a regulation
Article 45 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Commission shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board. |
6. The Commission, in cooperation with the Digital Services Coordinators, shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board. |
Amendment 340
Proposal for a regulation
Article 45 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. |
7. Where, pursuant to paragraph 6, the Commission, in cooperation with the Digital Services Coordinators, concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. |
|
This information should also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1. |
Amendment 341
Proposal for a regulation
Article 46 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where the Digital Services Coordinator of the country of destination considers that an alleged infringement exists and causes serious harm to a large number of recipients of the service in that Member State, or could seriously affect their fundamental rights, it may request the Commission to set up a joint investigation between the Digital Services Coordinator of the country of establishment and the requesting Digital Services Coordinator of the country of destination. |
Amendment 342
Proposal for a regulation
Article 46 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. The Commission, in cooperation with the Digital Services Coordinators, shall assess such a request and, following a positive opinion of the Board, shall set up a joint investigation in which the Digital Services Coordinator of the country of destination may be entitled to exercise the following additional powers with respect to the provider of intermediary services concerned by the alleged infringement: |
|
(a) to obtain access to the confidential version of the reports published by the intermediary service providers referred to in Article 13 and, where applicable, in Articles 23 and 24, as well as to the annual reports drawn up by the other competent authorities pursuant to Article 44; |
|
(b) to obtain access to data collected by the Digital Services Coordinator of the country of establishment for the purpose of supervision of that provider on the territory of the Digital Services Coordinator of the country of destination, without prejudice to Regulation (EU) 2016/679; |
|
(c) to initiate proceedings and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance, where the suspected seriousness of the infringement would require an immediate response that would not allow for the provisions of Article 45 to apply; |
|
(d) to request interim measures, as referred to in Article 41(2)(e). |
Amendment 343
Proposal for a regulation
Article 46 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
1c. The Commission decision setting up the joint investigation shall define a deadline by which the Digital Services Coordinator of the country of establishment and the Digital Services Coordinator launching the request pursuant to paragraph 2 shall agree on a common position on the joint investigation and, where applicable, on the enforcement measures to be adopted. If no agreement is reached within that deadline, the case shall be referred to the Commission pursuant to Article 45(5). |
Amendment 344
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC, in order to prevent fragmentation of the digital single market, and of the obligations of very large platforms referred to in Article 5 of Regulation (EU) 2019/1150; |
Amendment 345
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) give recommendations for the implementation of Article 27 and advise on the possible application of sanctions in cases of repeated non-compliance; |
Amendment 346
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) issue opinions, recommendations or advice on matters related to Article 34. |
Amendment 347
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay. |
Amendment 348
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
Amendment 349
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. |
2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. |
Amendment 350
Proposal for a regulation
Article 52 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
Amendment 351
Proposal for a regulation
Article 55 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement. |
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement. |
Amendment 352
Proposal for a regulation
Article 57 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide, where necessary, access to its databases and algorithms, and to provide explanations relating to them. |
Amendment 353
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall adopt a non-compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following: |
1. The Commission shall adopt a non-compliance decision, after consulting the Board, where it finds that the very large online platform concerned does not comply with one or more of the following: |
Amendment 354
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: |
2. The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: |
Amendment 355
Proposal for a regulation
Article 73 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. |
1. By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation. |
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
JURI 8.2.2021 |
|||
Associated committees - date announced in plenary |
20.5.2021 |
|||
Rapporteur for the opinion Date appointed |
Geoffroy Didier 10.5.2021 |
|||
Discussed in committee |
27.5.2021 |
13.7.2021 |
9.9.2021 |
|
Date adopted |
30.9.2021 |
|
|
|
Result of final vote |
+: 15 –: 9 0: 0 |
||
Members present for the final vote |
Pascal Arimont, Gunnar Beck, Geoffroy Didier, Pascal Durand, Ibán García Del Blanco, Jean-Paul Garraud, Mislav Kolakušić, Sergey Lagodinsky, Gilles Lebreton, Karen Melchior, Jiří Pospíšil, Marcos Ros Sempere, Stéphane Séjourné, Raffaele Stancanelli, Adrián Vázquez Lázara, Axel Voss, Marion Walsmann, Tiemo Wölken, Lara Wolters |
|||
Substitutes present for the final vote |
Patrick Breyer, Daniel Buda, Emmanuel Maurel, Nacho Sánchez Amor, Kosma Złotowski |
|||
Substitutes under Rule 209(7) present for the final vote |
Isabel Benjumea Benjumea |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
15 |
+ |
PPE |
Pascal Arimont, Isabel Benjumea Benjumea, Daniel Buda, Geoffroy Didier, Axel Voss, Marion Walsmann |
Renew |
Pascal Durand, Stéphane Séjourné, Adrián Vázquez Lázara |
ID |
Jean‑Paul Garraud, Gilles Lebreton |
ECR |
Raffaele Stancanelli, Kosma Złotowski |
The Left |
Emmanuel Maurel |
NI |
Mislav Kolakušić |
9 |
- |
S&D |
Ibán García Del Blanco, Marcos Ros Sempere, Nacho Sánchez Amor, Lara Wolters, Tiemo Wölken |
Renew |
Karen Melchior |
ID |
Gunnar Beck |
Verts/ALE |
Patrick Breyer, Sergey Lagodinsky |
0 |
0 |
|
|
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON CIVIL LIBERTIES, JUSTICE AND HOME AFFAIRS (28.7.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9‑0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Patrick Breyer
(*) Associated committees – Rule 57 of the Rules of Procedure
SHORT JUSTIFICATION
Background
Following three resolutions adopted by Parliament, the Commission presented its proposal for a Digital Services Act in December 2020. The proposal aims to ensure harmonised conditions for digital cross-border services to develop in the EU.
The LIBE Opinion
The Opinion focuses on better protecting fundamental rights and addressing illegal content in the digital age, in line with the competence of the LIBE committee. Most amendments implement reports and opinions on the Digital Services Act that have already been supported in Committee or Plenary. Key proposals are:
1. The Digital Services Act should provide for the right to use and pay for digital services anonymously wherever reasonably feasible, in line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data.
2. End-to-end encryption should not be restricted as it is essential for Internet safety.
3. Behavioural and personalised targeting for non-commercial and political advertising should be phased out to protect users and ensure the existence of traditional media, and be replaced by contextual advertising. The same should apply to targeting people based on sensitive data, or to targeting minors. Behavioural and personalised targeting for commercial advertising should only be possible where users have freely opted in, without exposure to “dark” patterns or the risk of being excluded from services, and without being fatigued by consent banners if they have already made a clear choice in their browser/device settings.
4. In the spirit of the case law on communications metadata, public authorities shall be given access to records of personal online activity only to investigate suspects of serious crimes or prevent serious threats to public safety with prior judicial authorisation.
5. Mere conduit intermediaries should not be required to block access to content. Illegal content should be removed where it is hosted.
6. To protect freedom of expression and media freedom, the decision on the legality of content shall rest with the independent judiciary, not with administrative authorities.
7. Intermediaries should not be required to remove information that is legal in the Member State that they are established in (their country of origin). The effect of cross-border removal orders should be limited to the territory of the issuing Member State.
8. A special regime should apply to addressing traders unlawfully promoting or offering products or services in the Union.
9. Online platforms’ terms and conditions shall respect fundamental rights and permit interferences with the free exchange of lawful information only where that information is incompatible with the declared purpose of the service.
10. Adverse decisions by online platforms should be subject to judicial redress.
11. Where allegedly illegal content is notified, qualified staff should take a decision after hearing the publisher.
12. Complaints procedures should be available also to notifiers, such as victims of crime, whose notification has not been acted upon.
13. Automated tools for content moderation and content filters should not be mandatory. They should only exceptionally be used by online platforms for ex-ante control to temporarily block manifestly illegal and context-insensitive content, subject to human review of every automated decision. Algorithms cannot reliably identify illegal content and routinely result in the suppression of legal content, including journalistic content.
14. Providers should not be obliged to sanction users for providing illegal content by temporarily "de-platforming" them, since such an obligation would fail to ensure a decision by the judiciary and bypass the legally defined sanctions.
15. The algorithm-driven spreading of problematic content should be contained by giving users control over the algorithms prioritising the information that is presented to them (recommender systems).
16. “Co-regulatory” instruments (“soft law”) such as codes of conduct and crisis protocols should be subject to a special procedure to safeguard transparency, participation, democratic oversight and fundamental rights.
AMENDMENTS
The Committee on Civil Liberties, Justice and Home Affairs calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) So far, the regulatory approach has relied on voluntary cooperation with a view to addressing the new risks and challenges. As this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. Moreover, a fragmentation of rules can have negative consequences for the freedom of expression. |
Amendment 2
Proposal for a regulation
Recital 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(2 a) Complex regulatory requirements at both Union and Member State level have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially for small and medium-sized companies. |
Amendment 3
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the rights to privacy, to protection of personal data, to freedom of expression including the freedom to receive and impart information and ideas without interference from public authority and regardless of frontiers, and to non-discrimination, as well as the freedom of media, the freedom to conduct a business and consumer protection. Children have particular rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child (UNCRC). The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world. |
Amendment 4
Proposal for a regulation
Recital 8
|
|
Text proposed by the Commission |
Amendment |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website, of an email address or of other contact details from the Union cannot, on that ground alone, be considered as sufficient to constitute a substantial connection to the Union. |
__________________ |
__________________ |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
Amendment 5
Proposal for a regulation
Recital 9
|
|
Text proposed by the Commission |
Amendment |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) 2021/784 of the European Parliament and of the Council29. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
__________________ |
__________________ |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (Text with EEA relevance), OJ L 172, 17.5.2021, p. 79. |
Amendment 6
Proposal for a regulation
Recital 11
|
|
Text proposed by the Commission |
Amendment |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected. |
deleted |
Amendment 7
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online, while ensuring that what is legal offline should also be legal online. The concept of “illegal content” should be defined appropriately and also cover information relating to illegal content, products, services and activities where such information is itself not in compliance with applicable Union or Member State law. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that refers in an illegal manner to activities that are illegal, such as the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, in particular the Charter, and what the precise nature or subject matter is of the law in question. |
Amendment 8
Proposal for a regulation
Recital 13
|
|
Text proposed by the Commission |
Amendment |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
Amendment 9
Proposal for a regulation
Recital 14
|
|
Text proposed by the Commission |
Amendment |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on whom to grant access. Information exchanged using interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, is not considered to have been disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
__________________ |
__________________ |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
Amendment 10
Proposal for a regulation
Recital 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(15a) The online activities of a person allow for deep insights into their personality as well as their past and future behaviour, making it possible to manipulate them. The high sensitivity of such information and its potential for abuse requires special protection. In accordance with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the right to use and pay for information society services anonymously wherever reasonable efforts can make this possible. This should apply without prejudice to the obligations in Union law on the protection of personal data. Providers can enable anonymous use of their services by refraining from collecting personal data regarding the recipient and their online activities and by not preventing recipients from using anonymising networks for accessing the service. Anonymous payment can take place, for example, by paying in cash, by using cash-paid vouchers or prepaid payment instruments. The general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy and the protection of personal data. According to Regulation (EU) 2016/679 users have a right not to be subject to pervasive tracking when using information society services. Following the jurisprudence on communications meta-data, providers should not be required to indiscriminately retain personal data concerning the use of the service by all recipients. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third party access. The fact that encryption technology is abused by some for illegal purposes does not justify generally weakening encryption. |
Amendment 11
Proposal for a regulation
Recital 15 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(15b) Targeting individuals based on personal data, including behavioural data, should not be permitted for non-commercial and political purposes. Misleading or obscure advertising for non-commercial and political purposes is a special class of online threat because it influences the core mechanisms that enable the functioning of our democratic society. Targeting minors on the basis of their personal data or targeting individuals on the basis of special categories of data which allow for targeting vulnerable groups should not be permitted. Targeting recipients for commercial purposes should require the recipients’ consent. To ensure that recipients have a real choice, refusing consent should be no more complicated than giving consent, “dark patterns” should not be used to undermine the recipient’s choice and refusing consent should not result in access to the functionalities of the platform being disabled. In order to avoid fatiguing recipients who refuse to consent, terminal equipment settings that signal an objection to processing of personal data should be respected. Displaying contextual advertisements does not require processing personal data and is thus less intrusive. |
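In practice, a ‘terminal equipment setting that signals an objection to processing of personal data’ of the kind the amendment asks providers to respect already exists in the form of the Global Privacy Control signal, which user agents transmit as the HTTP request header `Sec-GPC: 1`. The following is a minimal illustrative sketch, not part of the amendment text, of how a service could honour such a signal before showing any consent prompt; it assumes a Flask application, and the response strings are placeholders:

```python
# Illustrative sketch only: honouring a machine-readable objection signal
# from terminal equipment, here the Global Privacy Control (GPC) signal,
# which is sent as the HTTP request header `Sec-GPC: 1`.

from flask import Flask, request

app = Flask(__name__)

def objects_to_processing(req) -> bool:
    # GPC-capable browsers transmit `Sec-GPC: 1` with every request.
    return req.headers.get("Sec-GPC") == "1"

@app.route("/")
def landing_page():
    if objects_to_processing(request):
        # Respect the signal: no consent prompt, no profiling; serve
        # contextual (non-personalised) advertising instead.
        return "contextual ads; no consent prompt shown"
    # No objection signalled: a prompt may be shown, but refusing consent
    # must remain no more complicated than giving it.
    return "contextual ads until consent is given; consent prompt shown"
```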
Amendment 12
Proposal for a regulation
Recital 18
|
|
Text proposed by the Commission |
Amendment |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
(18) The exemptions from liability established in this Regulation should not apply where the provider of intermediary services has knowledge of, or control over, information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The exemptions from liability established by this Regulation should not depend on uncertain notions such as an ‘active’, ‘neutral’ or ‘passive’ role of providers. |
Amendment 13
Proposal for a regulation
Recital 22
|
|
Text proposed by the Commission |
Amendment |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, after having become aware of the unlawful nature of content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression, including the right to receive and impart information and ideas without interference by public authority. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
Amendment 14
Proposal for a regulation
Recital 25
|
|
Text proposed by the Commission |
Amendment |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
deleted |
Amendment 15
Proposal for a regulation
Recital 27
|
|
Text proposed by the Commission |
Amendment |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services, cloud infrastructure providers and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
Amendment 16
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, neither de jure nor de facto. A de facto obligation would occur if the non-implementation of a general or preventive monitoring infrastructure were uneconomical, for instance due to the significant extra cost of alternative human oversight or due to the threat of significant damage payments. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
Amendment 17
Proposal for a regulation
Recital 28 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(28a) Providers of intermediary services should not be obliged to use automated tools for content moderation because such tools are incapable of effectively understanding the subtlety of context and meaning in human communication, which is necessary to determine whether assessed content violates the law or terms of service. |
Amendment 18
Proposal for a regulation
Recital 29
|
|
Text proposed by the Commission |
Amendment |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. |
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. |
Amendment 19
Proposal for a regulation
Recital 30
|
|
Text proposed by the Commission |
Amendment |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
(30) Orders to act against illegal content or to provide information should be issued by designated competent authorities in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
Amendment 20
Proposal for a regulation
Recital 30 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(30a) In order to avoid conflicting interpretations of what constitutes illegal content and to ensure the accessibility of information that is legal in the Member State in which the provider is established, orders to act against illegal content should in principle be issued by judicial authorities of the Member State in which the provider has its main establishment, or, if not established in the Union, its legal representative. The judicial authorities of other Member States should be able to issue orders the effect of which is limited to the territory of the Member State where the judicial authority issuing the order is based. A special regime should apply to acting against unlawful commercial offers of goods and services. |
Amendment 21
Proposal for a regulation
Recital 31
|
|
Text proposed by the Commission |
Amendment |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union or Member State law or international law and the interests of international comity. Providers of intermediary services should not be legally required to remove content which is legal in their country of establishment. Competent authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Member State where those competent authorities are established. This should be without prejudice to the right of providers to assess the compliance of specific content with their terms and conditions and subsequently remove non-compliant content even if it is not unlawful in their country of establishment. |
Amendment 22
Proposal for a regulation
Recital 32
|
|
Text proposed by the Commission |
Amendment |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. |
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about non-personal information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. |
Amendment 23
Proposal for a regulation
Recital 33
|
|
Text proposed by the Commission |
Amendment |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information under Union or Member State law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. |
Amendment 24
Proposal for a regulation
Recital 36
|
|
Text proposed by the Commission |
Amendment |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. Such information should be notified to the Digital Services Coordinator in the Member State of establishment. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
Amendment 25
Proposal for a regulation
Recital 38
|
|
Text proposed by the Commission |
Amendment |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. A summary of the terms and conditions should also be made publicly available. In order to safeguard the fundamental right to freedom of expression, providers should not be allowed to arbitrarily suppress legal content or act against those providing it. Acting against legal information is justifiable only where that information is incompatible with the declared purpose of the service. For example, where the purpose of an online forum is to discuss a certain issue, supplying information on unrelated topics may be incompatible with the purpose of the service. |
Amendment 26
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
__________________ |
__________________ |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 27
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and accordingly remove or disable access to that content ('action'). The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
Amendment 28
Proposal for a regulation
Recital 41
|
|
Text proposed by the Commission |
Amendment |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent, non-arbitrary and non-discriminatory processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
Amendment 29
Proposal for a regulation
Recital 42
|
|
Text proposed by the Commission |
Amendment |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. |
(42) Where a hosting service provider decides to remove, disable access to, or restrict proposals by recommender systems of information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, that provider should in a clear and user-friendly manner inform the recipient and, where possible, the notifier of its decision, the reasons for its decision and the available redress possibilities for the recipient to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. That obligation should not apply if the recipient has repeatedly provided manifestly illegal content in the past or if the removal is based on an order to act against illegal content and the competent authority issuing the order had decided not to disclose information for reasons of public security. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The restriction of proposals by recommender systems can take place, for example, by practices of ‘shadow-banning’ content. |
Amendment 30
Proposal for a regulation
Recital 42 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(42a) When moderating content, mechanisms voluntarily employed by platforms should in principle not lead to ex-ante control measures based on automated tools or upload filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in over-blocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to staff of private contractors that lack sufficient independence, qualification and accountability. Ex-ante control measures based on automated tools or upload filtering of content should be understood to mean making publishing subject to an automated decision. It should exceptionally be permitted if the automated decision is effective for a limited period of time, subject to human review and reliably limited to information previously classified as manifestly illegal, irrespective of its context, the identity and the intention of the recipient providing it. Filtering automated content submissions such as spam should be permitted. Where automated tools are used for ex-post content moderation, the provider should ensure human decisions on any action to be taken and the protection of legal content. |
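Read narrowly, the one automated ex-ante measure the amendment tolerates is withholding exact re-uploads of material already classified as manifestly illegal irrespective of context, with every automated hold time-limited and routed to a human reviewer. The sketch below is illustrative only, not part of the amendment text; the hash list and review queue are hypothetical stand-ins for whatever vetted classification process a provider actually operates:

```python
# Illustrative sketch only: an ex-ante check reliably limited to material
# previously classified as manifestly illegal irrespective of context.
# KNOWN_ILLEGAL_HASHES and review_queue are hypothetical stand-ins.

import hashlib

KNOWN_ILLEGAL_HASHES: set[str] = set()   # vetted prior classifications
review_queue: list[bytes] = []           # automated holds awaiting a human

def may_publish_immediately(upload: bytes) -> bool:
    digest = hashlib.sha256(upload).hexdigest()
    if digest in KNOWN_ILLEGAL_HASHES:
        # The automated decision is a time-limited hold, not a final
        # removal: a human reviewer takes the actual decision.
        review_queue.append(upload)
        return False
    # Everything else is published without automated gating;
    # context-dependent assessments are left to ex-post human moderation.
    return True
```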
Amendment 31
Proposal for a regulation
Recital 43
|
|
Text proposed by the Commission |
Amendment |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they have more than 4,5 million users in the Union or meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
__________________ |
__________________ |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 32
Proposal for a regulation
Recital 44
|
|
Text proposed by the Commission |
Amendment |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
(44) Recipients of the service and organisations or public bodies representing consumers’ interests designated by a Member State as qualified to bring representative actions should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. These systems should also be available to notifiers. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the applicable law. Online platforms concerned should also be able to seek judicial redress against these decisions in accordance with the applicable law. |
Amendment 33
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
__________________ |
__________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 34
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation might undermine trust and harm the rights and legitimate interests of the parties concerned. Therefore, online platforms should be entitled to put in place appropriate, proportionate and reliable safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should be entitled to temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of manifestly illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 35
Proposal for a regulation
Recital 48
|
|
Text proposed by the Commission |
Amendment |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that a serious criminal offence involving a threat to the life of a person, including vulnerable recipients such as children, is imminent, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing the information that has given rise to its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
__________________ |
__________________ |
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
Amendment 36
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
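For illustration only, the per-advertisement information the recital contemplates (who is shown an ad, on whose behalf, and on the basis of which main parameters) could be represented as a simple structured record. The sketch below is not part of the amendment text, and every field name and value is hypothetical:

```python
# Illustrative sketch only: one possible structure for the per-ad
# transparency information the recital contemplates. All field names
# and values are hypothetical, not prescribed by the text.

ad_transparency_record = {
    "is_advertisement": True,                # clear, real-time disclosure
    "displayed_on_behalf_of": "Example Retailer Ltd",
    "paid_for_by": "Example Retailer Ltd",
    "main_targeting_parameters": [           # meaningful explanation of the
        "language: Danish",                  # logic used to select the ad
        "approximate location: Denmark",
        "context: sports news article",      # contextual, not profiling-based
    ],
    "profiling_used": False,
}

print(ad_transparency_record["main_targeting_parameters"])
```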
Amendment 37
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no proportionate alternative and less restrictive measures that would effectively achieve the same result. |
Amendment 38
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures where mitigation is possible without adversely impacting fundamental rights. |
Amendment 39
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of systemic risks should be assessed in depth. A first category concerns the risks associated with the misuse of their service through the dissemination of manifestly illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of manifestly illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the use of bots and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is manifestly illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 40
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment where mitigation is possible without adversely impacting fundamental rights. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations. The decision as to the choice of measures should remain with the very large online platform. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, without adversely impacting the fundamental rights of the recipients of the service. |
Amendment 41
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their impact assessments and design their mitigation measures related to any adverse impact with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. The outcome of their impact assessments should be communicated to the Board of Digital Services Coordinators and the Digital Services Coordinator of their Member State of establishment. |
Amendment 42
Proposal for a regulation
Recital 61
|
|
Text proposed by the Commission |
Amendment |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. Where the auditor could not reach a conclusion on specific elements that fall within the scope of the audit, a statement of reasons for the failure to reach such a conclusion should be included in the audit opinion. |
Amendment 43
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, provided in a clear and user-friendly manner, including options that are not based on profiling of the recipient. |
Amendment 44
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the dissemination of illegal content using the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including personal data, trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. Researchers should be independent and not subject to any conflict of interest. Neither the researchers nor the entities or institutions they work for should, in the five years prior to the start of the research activities, have received financing from a company that is affected by, or has a direct interest in, the findings of the research. Researchers should respect a minimum cool-off period of five years between the publication of their findings and working for any company that is affected by, or has a direct interest in, the findings of that research. |
Amendment 45
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. |
Amendment 46
Proposal for a regulation
Recital 69
|
|
Text proposed by the Commission |
Amendment |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan. |
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. |
Amendment 47
Proposal for a regulation
Recital 71 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(71a) "Soft law" instruments such as codes of conduct and crisis protocols may pose a risk to fundamental rights because, unlike legislation, they are not subject to democratic scrutiny and their compliance with fundamental rights is not subject to judicial review. In order to enhance accountability, participation and transparency, procedural safeguards for drawing up codes of conduct and crisis protocols are needed. |
Amendment 48
Proposal for a regulation
Recital 89
|
|
Text proposed by the Commission |
Amendment |
(89) The Board should contribute to achieving a common Union perspective on the consistent application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis à vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union. |
(89) The Board should contribute to achieving a common and consistent Union application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis à vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union. |
Amendment 49
Proposal for a regulation
Article 1 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities. |
(c) rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities. |
Amendment 50
Proposal for a regulation
Article 1 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. |
3. This Regulation shall apply to intermediary services provided to recipients of the service in the Union, irrespective of the place of establishment of the providers of those services. |
Amendment 51
Proposal for a regulation
Article 1 – paragraph 5 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) Union law on copyright and related rights; |
deleted |
Amendment 52
Proposal for a regulation
Article 1 – paragraph 5 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted]; |
(d) Regulation (EU) 2021/784; |
Amendment 53
Proposal for a regulation
Article 1 – paragraph 5 – point i
|
|
Text proposed by the Commission |
Amendment |
(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. |
deleted |
Amendment 54
Proposal for a regulation
Article 1 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. This Regulation shall not apply to questions relating to information society services covered by Regulation (EU) 2016/679 and Directive 2002/58/EC. |
Amendment 55
Proposal for a regulation
Article 2 – paragraph 1 – point n
|
|
Text proposed by the Commission |
Amendment |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; |
(n) ‘advertisement’ means information designed to directly or indirectly promote information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against direct or indirect remuneration specifically for promoting that information, product or service; |
Amendment 56
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank, prioritise or curate in its online interface specific information, products or services to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
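By way of purely editorial illustration, and not as part of the proposal or of the amendments, the sketch below shows what the amended definition covers: one and the same system may suggest, rank, prioritise or curate items, and the alternative option not based on profiling contemplated elsewhere in the text can be as simple as a reverse-chronological ordering. All identifiers in the sketch are invented for the example.

    # Editorial illustration only; all names are hypothetical, not drawn from the Regulation.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Item:
        item_id: str
        published_at: float   # Unix timestamp of publication
        topic: str
        popularity: float     # aggregate, non-personal engagement signal

    def rank_with_profiling(items: List[Item], interests: Dict[str, float]) -> List[Item]:
        """Ordering based on a profile of the recipient (interest weights per topic)."""
        return sorted(items, key=lambda i: interests.get(i.topic, 0.0) * i.popularity, reverse=True)

    def rank_without_profiling(items: List[Item]) -> List[Item]:
        """Alternative main parameter not based on profiling: newest first."""
        return sorted(items, key=lambda i: i.published_at, reverse=True)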
Amendment 57
Proposal for a regulation
Article 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 2a |
|
Digital privacy |
|
1. Without prejudice to Regulation (EU) 2016/679 and Directive 2002/58/EC, providers of information society services shall make reasonable efforts to enable the use of and payment for that service without collecting personal data of the recipient. |
|
Member States shall not impose a general obligation on providers of information society services to limit the anonymous or pseudonymous use of their services. |
|
2. Operators of online platforms may process personal data concerning the use of the service by a recipient for the sole purpose of operating a recommender system only where the recipient has given his or her explicit consent as defined in Article 4, point (11), of Regulation (EU) 2016/679. |
|
3. Member States shall not oblige providers of information society services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or Member State law. |
|
4. Providers of information society services shall have the right to provide and support encryption services of their choice. Member States shall not impose an obligation on providers of information society services to limit the level of their security and encryption measures. |
Amendment 58
Proposal for a regulation
Article 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 2b |
|
Targeting of digital advertising |
|
1. Providers of information society services shall not collect or process personal data as defined in Article 4, point (1), of Regulation (EU) 2016/679 for the purpose of targeting the recipients to whom advertisements are displayed. |
|
2. By way of derogation from paragraph 1, for the purpose of targeting the recipients to whom advertisements for commercial purposes are displayed, providers of information society services may only collect and use the personal data of recipients who have explicitly given their consent, as defined in Article 4, point (11), of Regulation (EU) 2016/679, to such collection and use. Refusing consent shall be no more difficult or time-consuming for the recipient than giving consent. Providers shall not use a method that is designed with the purpose, or has the effect, of subverting or impairing a recipient’s free decision on whether to consent. Recipients whose terminal equipment signals that they object to the processing of personal data when using information society services pursuant to Article 21(5) of Regulation (EU) 2016/679 shall not be asked for consent. |
|
3. Where access to a service requires consent as referred to in paragraph 2 and a recipient has refused to give such consent, the recipient shall be given other fair and reasonable options to access the service. |
|
4. The personal data referred to in paragraph 2 shall not be collected or used for the purpose of |
|
(a) targeting recipients based on the actual or likely racial or ethnic origin, the political opinions, the religious or philosophical beliefs, the trade union membership, the health, the sex life or the sexual orientation of a recipient, or |
|
(b) targeting recipients below the age of 18. |
|
5. This Article shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as the editorial content in which the advertisement is displayed, keywords, or the geographical region of the recipients to whom an advertisement is displayed. |
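As a purely illustrative sketch of how a provider might operationalise paragraphs 2 and 5 of the proposed Article 2b (the Regulation prescribes no technical signal; the "Sec-GPC" header below is one existing browser signal used only as an example, and all function names are invented):

    # Editorial illustration only; hypothetical names. Paragraph 2, last sentence:
    # recipients whose terminal equipment signals an objection are not asked for consent.
    def may_request_consent(headers: dict) -> bool:
        return headers.get("Sec-GPC") != "1"   # example objection signal from terminal equipment

    def choose_targeting(headers: dict, consent_given: bool, page_keywords: list) -> str:
        if consent_given and may_request_consent(headers):
            return "personalised"              # allowed only with explicit consent (paragraph 2)
        # Paragraph 5: contextual targeting without personal data remains permitted.
        return "contextual:" + ",".join(page_keywords)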
Amendment 59
Proposal for a regulation
Article 3 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
deleted |
Amendment 60
Proposal for a regulation
Article 4 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
2. This Article shall not affect the possibility for a court, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 61
Proposal for a regulation
Article 5 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
4. This Article shall not affect the possibility for a court, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement. |
Amendment 62
Proposal for a regulation
Article 6
|
|
Text proposed by the Commission |
Amendment |
Article 6 |
deleted |
Voluntary own-initiative investigations and legal compliance |
|
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
|
Amendment 63
Proposal for a regulation
Article 7 – title
|
|
Text proposed by the Commission |
Amendment |
No general monitoring or active fact-finding obligations |
No general monitoring or active fact-finding or automated content moderation obligations |
Amendment 64
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
No general obligation, whether de jure or de facto, to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity or to prevent it, shall be imposed on those providers. |
Amendment 65
Proposal for a regulation
Article 7 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Providers of intermediary services shall not be obliged to use automated tools for content moderation or for monitoring the behaviour of a large number of natural persons. |
Amendment 66
Proposal for a regulation
Article 8 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
1. Providers of intermediary services shall, upon the receipt via a secure communications channel of an order to act against one or more specific items of illegal content, issued by a national judicial authority, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken. |
Amendment 67
Proposal for a regulation
Article 8 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
This Article shall apply mutatis mutandis in respect of competent administrative authorities ordering online platforms to act against traders unlawfully promoting or offering products or services in the Union. |
Amendment 68
Proposal for a regulation
Article 8 – paragraph 2 – point a – introductory part
|
|
Text proposed by the Commission |
Amendment |
(a) the orders contains the following elements: |
(a) the order contains the following elements: |
Amendment 69
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
— the identification details of the judicial authority issuing the order, including the date, time stamp and electronic signature of the authority, that allows the recipient to authenticate the order; |
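Purely by way of editorial illustration of this authentication requirement (date, time stamp and electronic signature allowing the recipient to authenticate the order), a minimal sketch follows; the field layout and the choice of Ed25519 keys are assumptions of the example, not of the text:

    # Editorial illustration only; field names and key scheme are hypothetical.
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def authenticate_order(order: dict, signature: bytes, authority_key: Ed25519PublicKey) -> bool:
        """Return True only if the order was signed by the issuing judicial authority."""
        payload = json.dumps(order, sort_keys=True).encode()   # canonical form of the signed fields
        try:
            authority_key.verify(signature, payload)           # raises if the order was altered or forged
            return True
        except InvalidSignature:
            return False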
Amendment 70
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent -1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a reference to the legal basis for the order; |
Amendment 71
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed; |
— a sufficiently detailed statement of clear reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed; |
Amendment 72
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider of the service and to the recipient of the service who provided the content; |
— clear and user-friendly information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including information about effective remedies, as well as the deadlines for appeal; |
Amendment 73
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— where necessary and proportionate, the decision not to disclose information about the removal of or disabling of access to the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, for as long as necessary, but not exceeding six weeks from that decision. |
Amendment 74
Proposal for a regulation
Article 8 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; |
(b) the territorial scope of an order addressed to a provider that has its main establishment in the Member State issuing the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; |
Amendment 75
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the territorial scope of an order addressed to a provider that has its main establishment in another Member State is limited to the territory of the Member State issuing the order; |
Amendment 76
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(bb) the territorial scope of an order addressed to a provider or its representative that has its main establishment outside the Union is limited to the territory of the Member State issuing the order; |
Amendment 77
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
First subparagraph, points (ba) and (bb), shall not apply where online platforms are ordered to act against traders established in the same Member State as the issuing authority who are unlawfully promoting or offering products or services in the Union. |
Amendment 78
Proposal for a regulation
Article 8 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67. |
3. The Digital Services Coordinator from the Member State of the authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67. |
Amendment 79
Proposal for a regulation
Article 8 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Member States shall ensure that the judicial authorities may, at the request of an applicant whose rights are infringed by illegal content being accessible, issue against the relevant provider of hosting services an order in accordance with this Article to remove or disable access to that content, including by way of an interlocutory injunction. |
Amendment 80
Proposal for a regulation
Article 9 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority of issuing the order of its receipt and the effect given to the order. |
1. Providers of intermediary services shall, upon receipt via a secure communications channel of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by a national judicial authority on the basis of the applicable Union or national law, in conformity with Union law, for the purposes of preventing, investigating, detecting and prosecuting serious crime or preventing serious threats to public security, inform via a secure communications channel and without undue delay the authority issuing the order of the effect given to the order and, where no effect has been given to the order, provide a statement explaining the reasons. |
Amendment 81
Proposal for a regulation
Article 9 – paragraph 2 – point -a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(-a) the order is issued for the purpose of preventing, investigating, detecting and prosecuting serious crime or preventing serious threats to public security; |
Amendment 82
Proposal for a regulation
Article 9 – paragraph 2 – point -a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(-aa) the order seeks information on a suspect or suspects of serious crime or of a serious threat to public security; |
Amendment 83
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
— the identification details of the judicial authority issuing the order and authentication of the order by that authority, including the date, time stamp and electronic signature of the authority issuing the order to provide information; |
Amendment 84
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a reference to the legal basis for the order; |
Amendment 85
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences; |
— a sufficiently detailed statement of clear reasons explaining the objective for which the information is required, setting out why the order is necessary and proportionate, taking due account of the impact of the order on the fundamental rights of the specific recipient of the service whose data is sought and of the seriousness of the offence; |
Amendment 86
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
— a unique identifier of the recipients of the service on whom information is sought; |
Amendment 87
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
— where the information sought constitutes personal data within the meaning of Article 4, point (1), of Regulation (EU) 2016/679 or Article 3, point (1), of Directive (EU) 2016/680, a justification that the order is in accordance with applicable data protection law; |
Amendment 88
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider and to the recipients of the service concerned; |
— information about redress mechanisms available to the provider and to the recipients of the service concerned; |
Amendment 89
Proposal for a regulation
Article 9 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control; |
(b) the order only requires the provider to provide information already legally collected for the purposes of providing the service and which lies within its control; |
Amendment 90
Proposal for a regulation
Article 9 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Digital Services Coordinator from the Member State of the national judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67. |
3. The Digital Services Coordinator from the Member State of the national judicial authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67. |
Amendment 91
Proposal for a regulation
Article 9 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
4. Where information is sought for the purpose of preventing, investigating, detecting or prosecuting serious crime, the conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. |
Amendment 92
Proposal for a regulation
Article 9 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The provider shall inform, without undue delay, the recipient whose data is being sought. As long as this is necessary and proportionate for the purpose of protecting the fundamental rights of another person, the issuing judicial authority, taking due account of the impact of the order on the fundamental rights of the person whose data is sought, may decide that the provider shall delay informing the recipient. Such a decision shall be duly justified and shall specify the duration of the delay, which shall not exceed six weeks. |
Amendment 93
Proposal for a regulation
Article 9 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. This Article shall apply, mutatis mutandis, in respect of competent administrative authorities ordering online platforms to provide the information listed in Article 22 for purposes other than those set out in paragraph 1. |
Amendment 94
Proposal for a regulation
Article 9 – paragraph 4 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
4c. Providers of intermediary services shall disclose personal data on recipients of their service requested by public authorities only where the conditions set out in this Article are met. |
Amendment 95
Proposal for a regulation
Article 9 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 9a |
|
Common European information exchange system |
|
The Commission shall adopt implementing acts pursuant to Article 291 of the Treaty on the Functioning of the European Union, establishing a common European information exchange system with secure channels for the handling of authorised cross-border communications, authentication and transmission of the orders referred to in Articles 8 and 9 of this Regulation and, where applicable, of the requested data between the competent judicial authority and the provider. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70 of this Regulation. |
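By way of editorial illustration only: the proposed Article 9a leaves the technical design of the exchange system to implementing acts, so the sketch below merely shows one plausible secure-channel arrangement (mutual TLS over HTTPS), with an invented endpoint and invented file names:

    # Editorial illustration only; the endpoint and certificate files are hypothetical.
    import requests

    def transmit_order(order: dict) -> int:
        """Send an Article 8 or Article 9 order over an authenticated, encrypted channel."""
        response = requests.post(
            "https://exchange.example.europa.eu/orders",   # invented system endpoint
            json=order,
            cert=("authority.crt", "authority.key"),       # client certificate authenticates the sender
            verify="exchange-ca.pem",                      # CA bundle authenticates the system
            timeout=30,
        )
        return response.status_code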
Amendment 96
Proposal for a regulation
Article 10 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. |
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact, and shall ensure that that information is up to date. Providers of intermediary services shall provide that information, including the name, electronic mail address and telephone number of their single point of contact, to the Digital Services Coordinator in the Member State where they are established. |
Amendment 97
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous and easily comprehensible language and shall be publicly available in an easily accessible format. A summary of the terms and conditions, setting out the most important points in concise, clear and unambiguous language, shall be made publicly available. Providers of intermediary services shall offer recipients the possibility of easily opting out from optional clauses and shall inform them of the remedies available. |
Amendment 98
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 99
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The terms and conditions of providers of intermediary services may exclude the hosting of lawful information from those services, or otherwise limit the access to information that is lawful, or suspend or terminate the provision of the service to recipients for providing lawful information, only where the information is incompatible with the declared purpose of the service. |
Amendment 100
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights enshrined in the Charter. |
Amendment 101
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Terms that do not comply with this Article shall not be binding on recipients. |
Amendment 102
Proposal for a regulation
Article 12 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. Very large online platforms as defined in Article 25 shall publish their terms and conditions in the official languages of all Member States in which they offer their services. |
Amendment 103
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish in an easily accessible manner, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall be searchable and archived for further use. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable: |
Amendment 104
Proposal for a regulation
Article 13 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, the action taken, and the average time needed for taking the action; |
Amendment 105
Proposal for a regulation
Article 13 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) the complete number of content moderators allocated for each official language per Member State, and a qualitative description of whether and how automated tools for content moderation are used in each official language; |
Amendment 106
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action; |
Amendment 107
Proposal for a regulation
Article 13 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as the measures taken to qualify content moderators and to ensure that non-infringing content is not affected; |
Amendment 108
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed. |
Amendment 109
Proposal for a regulation
Article 14 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, clearly visible, user-friendly, and located close to the content in question. They shall allow for the submission of notices on a case-by-case basis, exclusively by non-automated electronic means. |
Amendment 110
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can unambiguously, without reasonable doubt, identify the manifest illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
Amendment 111
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) a clear indication of the electronic location of that information, in particular, where applicable, the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
Amendment 112
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) where an alleged infringement of an intellectual property right is notified, evidence that the entity submitting the notice is the rights holder of the intellectual property right that is allegedly infringed or is authorised to act on behalf of the rights holder; |
Amendment 113
Proposal for a regulation
Article 14 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The individual or entity submitting the notice may optionally provide the information set out in point (c), which shall not be disclosed to the content provider except in cases of alleged infringements of intellectual property rights referred to in point (ca). |
Amendment 114
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
deleted |
Amendment 115
Proposal for a regulation
Article 14 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Upon receipt of the notice, the service provider shall notify the information provider, using available contact details, of the elements referred to in paragraph 2 and give them the opportunity to reply before taking a decision. |
Amendment 116
Proposal for a regulation
Article 14 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. Notified information shall remain accessible until a decision is taken in respect thereof. |
Amendment 117
Proposal for a regulation
Article 14 – paragraph 4 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
4c. The provider shall ensure that decisions on notices are taken by qualified staff to whom adequate initial and ongoing training on the applicable law and international human rights standards as well as appropriate working conditions shall be provided. |
Amendment 118
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without undue delay, notify the submitting individual or entity as well as the information provider of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
Amendment 119
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and non-discriminatory manner. Where they use automated means for that processing, they shall include information on such use in the notification referred to in paragraph 4. That shall include meaningful information about the procedure followed, the technology used, and the criteria and reasoning supporting the decision. |
Amendment 120
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. The mechanism referred to in paragraph 1 shall be provided free of charge. Where notices are manifestly unfounded or excessive, in particular because of their repetitive character, the provider of hosting services may refuse to act on the request. |
Amendment 121
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides to remove or disable access to, or restrict proposals by recommender systems of, specific items of information provided by the recipients of the service, irrespective of the means used for removing or disabling access to, or restricting proposals of, that information, it shall inform the recipient and the notifier, to the extent they provided contact details, at the latest at the time of the removal or disabling of access, or the restricting of proposals, of the decision and provide a clear and specific statement of reasons for that decision. |
Amendment 122
Proposal for a regulation
Article 15 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; |
(a) whether the decision entails either the removal of, or the disabling of access to, or the restricting of proposals by recommender systems of, the information and, where relevant, the territorial scope of the disabling of access or the restricting of proposals; |
Amendment 123
Proposal for a regulation
Article 15 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means; |
(c) where applicable, information on the means used in taking the decision, and in any case where the decision was taken in respect of content detected or identified using automated means; |
Amendment 124
Proposal for a regulation
Article 15 – paragraph 2 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. |
(f) clear and user-friendly information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. |
Amendment 125
Proposal for a regulation
Article 15 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The obligations pursuant to this Article shall not apply:
|
|
(a) to manifestly illegal content if the recipient has repeatedly provided manifestly illegal content in the past; or |
|
(b) where the removal or disabling of access, as referred to in paragraph 1 of this Article, is based on an order in accordance with Article 8 and the competent authority that issued that order decides that it is necessary and proportionate that there be no disclosure for reasons of public security, such as the prevention, investigation, detection or prosecution of serious criminal offences, paragraphs 1 to 4 of this Article shall be suspended for as long as necessary, but not exceeding six weeks from that decision, and the hosting service provider shall not disclose any information. That competent authority may extend that period by a further six weeks, where such non-disclosure continues to be justified. |
|
|
Amendment 126
Proposal for a regulation
Article 15 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. Paragraphs 2 and 4 shall not apply to providers of hosting services that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
Amendment 127
Proposal for a regulation
Article 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15a |
|
Content moderation |
|
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of information for content moderation, except where:
|
(a) automated content moderation decisions to remove or disable access to, or restrict proposals by recommender systems of, specific items are limited to information which is identical to information previously classified by qualified staff or a judicial authority as manifestly illegal irrespective of its context, the identity and the intention of the recipient providing it; |
|
(b) the technology used is in itself sufficiently reliable in that it limits to the maximum extent possible the rate of errors where information is wrongly assumed to be identical to information previously classified as illegal content; |
|
(c) the technology used does not prevent the accessibility and proposals by recommender systems of information which is not illegal content; and |
|
(d) automated content moderation decisions to remove or disable access to, or restrict proposals by recommender systems of, specific items are reviewed expeditiously by qualified staff and, in the absence of expeditious human confirmation, are no longer effective. |
|
Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff. |
|
2. Paragraph 1 shall not apply to moderating information which has most likely been uploaded by automated means. |
|
3. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 128
Proposal for a regulation
Article 16 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, unless they have more than 4,5 million users in the Union. |
Amendment 129
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
1. Online platforms shall provide recipients of the service and qualified entities as defined in Article 3, point (4), of Directive (EU) 2020/18281a, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform: |
|
__________________ |
|
1a Directive (EU) 2020/1828 of the European Parliament and of the Council of 25 November 2020 on representative actions for the protection of the collective interests of consumers and repealing Directive 2009/22/EC (OJ L 409, 4.12.2020, p. 1). |
Amendment 130
Proposal for a regulation
Article 17 – paragraph 1 – point -a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(-a) decisions taken not to act after having received a notice pursuant to Article 14; |
Amendment 131
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove or disable access to, or restrict proposals by recommender systems of, the information; |
Amendment 132
Proposal for a regulation
Article 17 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) decisions to suspend or terminate the recipients’ account. |
(c) decisions to suspend or terminate the recipients’ account; |
Amendment 133
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) any other decisions that adversely affect the recipient’s access to significant features of the platform’s regular services, including monetisation of information. |
Amendment 134
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall make the rules of procedure of their internal complaint-handling system publicly accessible. When a recipient of the service wants to lodge a complaint, the online platform shall present those rules to him or her in a clear and user-friendly manner. |
Amendment 135
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner. Where a complaint against a decision referred to in paragraph 1, points (a) to (ca), contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not manifestly illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
Amendment 136
Proposal for a regulation
Article 17 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Upon receipt of a complaint against a decision pursuant to point (-a) of paragraph 1, the online platform shall notify the information provider of the complaint, using available contact details, and give the information provider the opportunity to reply before taking a decision. |
Amendment 137
Proposal for a regulation
Article 17 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. In the case of a complaint against a decision pursuant to point (-a) of paragraph 1, this shall apply, mutatis mutandis, to information providers who have provided contact details. Where the decision referred to in paragraph 1 is sustained by the internal complaint-handling system, a detailed explanation of how it is in line with the platform’s terms and conditions or applicable law shall be provided. |
Amendment 138
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Recipients of the service addressed by the decisions referred to in Article 17(1) and qualified entities as defined in Article 3, point (4), of Directive (EU) 2020/1828, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Amendment 139
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. |
The first subparagraph is without prejudice to the right of the recipient concerned to seek redress against the decision of the online platform before a court in accordance with the applicable law, as well as the right of the online platform concerned to seek redress against the decision of the out-of-court dispute settlement body before a court in accordance with the applicable law. |
Amendment 140
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms; |
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State and any other private body; |
Amendment 141
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) it includes legal experts; |
Amendment 142
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute; |
(b) it has the necessary expertise and qualification on issues concerning one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, therefore allowing the body to contribute effectively and adequately to the settlement of a dispute; |
Amendment 143
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure that are clearly visible and easily accessible to the public. |
Amendment 144
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Out-of-court dispute settlement procedures shall preferably be free of charge for the recipient of the service. In the event that costs are applied, the procedure shall be accessible, attractive and inexpensive for recipients of the service. To that end, costs should not exceed a nominal fee. |
Amendment 145
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 3
|
|
Text proposed by the Commission |
Amendment |
Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the services and the online platform concerned before engaging in the dispute settlement. |
Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, publicly available, and shall make them known to the recipient of the services and the online platform concerned before engaging in the dispute settlement. |
Amendment 146
Proposal for a regulation
Article 18 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. This Article is without prejudice to the provisions laid down in Article 43 concerning the ability of recipients of the service to file complaints with the Digital Services Coordinator of their country of residence or, in the case of very large online platforms, the Commission. |
Amendment 147
Proposal for a regulation
Article 18 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 18a |
|
Judicial redress |
|
Member States shall ensure that their judicial authorities, at the request of a recipient of the service who is subject to the decision of an online platform and in accordance with the relevant national law, are entitled to review the legality of such a decision and, where appropriate, to issue interlocutory injunctions, where the decision: |
|
(a) amounts to removing or disabling access to or restricting proposals by recommender systems of information provided by that recipient; |
|
(b) amounts to suspending or terminating the provision of the service, in whole or in part, to that recipient; |
|
(c) amounts to suspending or terminating the recipient’s account; or |
|
(d) adversely affects the recipient’s access to significant features of the online platform’s regular services, including to monetisation of information. |
Amendment 148
Proposal for a regulation
Article 19 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
Amendment 149
Proposal for a regulation
Article 19 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; |
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content within a designated area; |
Amendment 150
Proposal for a regulation
Article 19 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger and their designated area of expertise in accordance with paragraph 2. |
Amendment 151
Proposal for a regulation
Article 19 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated or incorrect notices or notices regarding legal information through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
Amendment 152
Proposal for a regulation
Article 19 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
Amendment 153
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, or for which they have received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned. |
Amendment 154
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
Amendment 155
Proposal for a regulation
Article 20 – paragraph 3 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the intention of the recipient, individual, entity or complainant. |
(d) where identifiable, the intention of the recipient, individual, entity or complainant. |
Amendment 156
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Online platforms shall set out, in a clear and user-friendly manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
Amendment 157
Proposal for a regulation
Article 21 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. |
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life of persons is imminent, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide the information that gave rise to the suspicion. |
Amendment 158
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and may inform Europol. |
Amendment 159
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. |
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to be imminent, or the Member State where a suspected offender resides or is located, or the Member State where a victim of the suspected offence resides or is located. |
Amendment 160
Proposal for a regulation
Article 22 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50; |
(b) a copy of the identification document of the trader on which the name, any information concerning the address contained in the document, the issuing authority and the date of validity are visible, or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50; |
__________________ |
__________________ |
50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC |
50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC |
Amendment 161
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. The information referred to in paragraph 1, point (b), shall be deleted as soon as it has been compared to the information referred to in point (a) of that paragraph. |
Amendment 162
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing clearly between suspensions enacted after the receipt of multiple orders to act, for the provision of manifestly illegal content, for the submission of manifestly unfounded notices and for the submission of manifestly unfounded complaints; |
Amendment 163
Proposal for a regulation
Article 23 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied. |
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human review, as well as meaningful information about the procedure followed, the criteria and reasoning applied, and the logic involved in the automated decision-making. |
Amendment 164
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed is an advertisement, including through prominent and harmonised marking; |
Amendment 165
Proposal for a regulation
Article 24 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed and, if different, the natural or legal person who finances the advertisement; |
Amendment 166
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) clear, meaningful and uniform information about the parameters used to target or determine the recipient to whom the advertisement is displayed. |
Amendment 167
Proposal for a regulation
Article 24 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1, point (a). That implementing act shall be adopted in accordance with the advisory procedure referred to in Article 70 of this Regulation. |
Amendment 168
Proposal for a regulation
Article 24 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Online platforms shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform competent public authorities, upon their request. |
Amendment 169
Proposal for a regulation
Article 24 – paragraph 1 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
Online platforms that display advertising on their online interfaces shall give easy access to competent public authorities, and to NGOs and researchers that act in the public interest, upon their request, to information related to direct and indirect payments or any other remuneration received to display the corresponding advertisement on their online interfaces. |
Amendment 170
Proposal for a regulation
Article 26 – title
|
|
Text proposed by the Commission |
Amendment |
Risk assessment |
Impact assessment |
Amendment 171
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platforms shall effectively and diligently identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter and always before launching new services, the probability and severity of any adverse impact of the design, functioning and use made of their services in the Union, in particular on fundamental rights, including any systemic impact at the level of a Member State. This impact assessment shall be specific to their services and shall include the following systemic risks: |
Amendment 172
Proposal for a regulation
Article 26 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the dissemination of illegal content through their services; |
(a) the dissemination of illegal content through their services, where the content is manifestly illegal or where orders pursuant to Article 8 have been received; |
Amendment 173
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of fundamental rights, in particular the rights to respect for private and family life, to the protection of personal data and to freedom of expression and information, the prohibition of discrimination and the rights of the child as well as to the freedom of the press, as enshrined in the Charter; |
Amendment 174
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use, without prejudice to Article 2a, or automated exploitation of the service, or undisclosed paid influence, with an actual or foreseeable negative effect on fundamental rights. |
Amendment 175
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting impact assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting, targeting, and displaying advertisement, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions. |
Amendment 176
Proposal for a regulation
Article 26 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Very large online platforms shall communicate the outcome of the impact assessment and supporting documents to the Board of Digital Services Coordinators and the Digital Services Coordinator of their Member State of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format. |
Amendment 177
Proposal for a regulation
Article 27 – title
|
|
Text proposed by the Commission |
Amendment |
Mitigation of risks |
Specific measures to mitigate adverse impacts |
Amendment 178
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platforms shall put in place transparent, appropriate, proportionate and effective mitigation measures tailored to address the specific adverse impact identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable: |
Amendment 179
Proposal for a regulation
Article 27 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions; |
(a) adapting content moderation or recommender systems and online interfaces, their decision-making processes, the features or functioning of their services, or their terms and conditions; |
Amendment 180
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) appropriate technical and operational measures or capacities, such as appropriate staffing or technical means to expeditiously remove or disable access to illegal content which the platform is aware of or has received an order to act upon; |
Amendment 181
Proposal for a regulation
Article 27 – paragraph 1 – point a b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ab) easily accessible and user-friendly mechanisms for users to report or flag allegedly illegal content, and mechanisms for user moderation; |
Amendment 182
Proposal for a regulation
Article 27 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide; |
(b) targeted measures aimed at limiting or discontinuing the display of advertisements in association with the service they provide for specific content; |
Amendment 183
Proposal for a regulation
Article 27 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk; |
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection and resolution of adverse impacts; |
Amendment 184
Proposal for a regulation
Article 27 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively. |
deleted |
Amendment 185
Proposal for a regulation
Article 27 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The decision as to the choice of measures shall remain with the very large online platform. |
Amendment 186
Proposal for a regulation
Article 27 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in paragraph 1 of this Article, it shall provide a written explanation to the independent auditors setting out the reasons why those measures were not put in place so as to enable the preparation of the audit report pursuant to Article 28(3). |
Amendment 187
Proposal for a regulation
Article 27 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
Amendment 188
Proposal for a regulation
Article 27 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) best practices for very large online platforms to mitigate the systemic risks identified. |
(b) best practices for very large online platforms to mitigate the adverse impacts identified. |
Amendment 189
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific impacts, in particular to present best practices and propose possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. Before adopting those recommendations the Commission shall organise public consultations. |
Amendment 190
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the adverse impacts referred to in Article 26, and the necessity, proportionality and effectiveness of the impact mitigation measures referred to in Article 27. |
Amendment 191
Proposal for a regulation
Article 28 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the obligations set out in Chapter III; |
deleted |
Amendment 192
Proposal for a regulation
Article 28 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37. |
deleted |
Amendment 193
Proposal for a regulation
Article 28 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Very large online platforms shall ensure auditors have access to all relevant information to perform their duties. |
Amendment 194
Proposal for a regulation
Article 28 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) are independent from the very large online platform concerned; |
(a) are independent from, and do not have conflicts of interest with, the very large online platform concerned and other very large online platforms; |
Amendment 195
Proposal for a regulation
Article 28 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards. |
(c) have proven objectivity and professional ethics, based in particular on adherence to relevant codes of practice or appropriate standards. |
Amendment 196
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) a description of specific elements where the auditor could not reach a conclusion, and an explanation of why these elements could not be conclusively audited; |
Amendment 197
Proposal for a regulation
Article 28 – paragraph 3 – point d b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(db) a description of the third parties consulted as part of the audit; |
Amendment 198
Proposal for a regulation
Article 28 – paragraph 3 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) an audit opinion on whether the very large online platform subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative; |
(e) an audit opinion on whether the very large online platform subject to the audit meaningfully complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative; |
Amendment 199
Proposal for a regulation
Article 28 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified. |
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them. They shall, within one month from receiving those recommendations, adopt an audit implementation report. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified. |
Amendment 200
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, meaningful information about the logic involved and the main parameters used in their recommender systems, and they shall provide clear and user-friendly options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
Amendment 201
Proposal for a regulation
Article 29 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in chronological order only. |
Amendment 202
Proposal for a regulation
Article 30 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed and related payments received, where that information is available; |
Amendment 203
Proposal for a regulation
Article 30 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; |
(d) whether the advertisement was intended to exclude or be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose or, if applicable, the selected contexts in which the advertisement was placed; |
Amendment 204
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access, and use that data for those purposes. |
Amendment 205
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment, three Digital Services Coordinators of destination, or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research in the public interest. |
Amendment 206
Proposal for a regulation
Article 31 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. This shall include personal data only where it is lawfully accessible to the public. |
Amendment 207
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
|
|
Text proposed by the Commission |
Amendment |
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons: |
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following three reasons: |
Amendment 208
Proposal for a regulation
Article 31 – paragraph 6 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. |
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information; |
Amendment 209
Proposal for a regulation
Article 31 – paragraph 6 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) insofar as personal data is concerned, giving access to the data would violate applicable Union or Member State data protection law. |
Amendment 210
Proposal for a regulation
Article 31 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Upon completion of their research envisaged in paragraph 2, the vetted researchers shall make their findings publicly available, taking into account the rights and interests of the recipients of the service concerned. |
Amendment 211
Proposal for a regulation
Article 32 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. |
2. Very large online platforms shall only designate persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3 as compliance officers. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. |
Amendment 212
Proposal for a regulation
Article 33 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) a report setting out the results of the risk assessment pursuant to Article 26; |
(a) a report setting out the results of the impact assessment pursuant to Article 26; |
Amendment 213
Proposal for a regulation
Article 33 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the related risk mitigation measures identified and implemented pursuant to Article 27; |
(b) the specific mitigation measures identified and implemented pursuant to Article 27; |
Amendment 214
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of privacy and personal data. |
Amendment 215
Proposal for a regulation
Article 35 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific impact mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
Amendment 216
Proposal for a regulation
Article 35 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. |
deleted |
Amendment 217
Proposal for a regulation
Article 35 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 2, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. |
Amendment 218
Proposal for a regulation
Article 36 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30. |
1. The Commission may facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30. |
Amendment 219
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
Amendment 220
Proposal for a regulation
Article 36 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. |
deleted |
Amendment 221
Proposal for a regulation
Article 37 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
Amendment 222
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures: |
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures: |
Amendment 223
Proposal for a regulation
Article 37 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols. |
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. |
Amendment 224
Proposal for a regulation
Article 37 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 37a |
|
Procedure for drawing up codes of conduct and crisis protocols |
|
1. Before initiating or facilitating the negotiation or the revision of codes of conduct or crisis protocols, the Commission shall: |
|
(a) consider the appropriateness of proposing legislation instead; |
|
(b) publish the elements of the code or protocol which it aims to propose or advocate; |
|
(c) invite the European Parliament, the Council, the European Union Agency for Fundamental Rights (FRA), the European Data Protection Supervisor and the public to express their opinion and publish their opinions; |
|
(d) conduct a Fundamental Rights Impact Assessment and publish the findings. |
|
2. The Commission shall subsequently publish the elements of the envisaged code or protocol which it intends to propose or advocate. It shall not propose or advocate elements which the European Parliament or the Council object to or which have not been subject to the process set out in paragraph 1. |
|
3. The Commission shall allow representatives of civil society organisations which advocate for the interests of the recipients of relevant services, the European Parliament, the Council and FRA to observe the negotiations and to have access to all documents pertaining to them. The Commission shall offer compensation to civil society participants. |
|
4. The Commission shall publish the codes of conduct and crisis protocols and to whom they apply and shall keep that information updated. |
Amendment 225
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. |
Amendment 226
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. |
Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union, the Commission shall confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 of this Regulation before that Digital Services Coordinator can be designated. |
Amendment 227
Proposal for a regulation
Article 39 – paragraph -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
-1. Member States shall ensure that the Digital Services Coordinators are legally distinct from and functionally independent of their respective governments and of any other public or private body. |
Amendment 228
Proposal for a regulation
Article 41 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period; |
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, with the exception of information that is protected by professional secrecy requirements or immunities and privileges in accordance with the applicable law; |
Amendment 229
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding; |
(a) the power to accept the lawful commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding; |
Amendment 230
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: |
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of hosting services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: |
Amendment 231
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place. |
(b) where the Digital Services Coordinator considers that the provider has not complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving an imminent threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access to that infringing content of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place. |
Amendment 232
Proposal for a regulation
Article 42 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover of the provider concerned. |
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual world-wide income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual world-wide income or turnover of the provider concerned. |
Amendment 233
Proposal for a regulation
Article 42 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average world-wide daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. |
Amendment 234
Proposal for a regulation
Article 43 – title
|
|
Text proposed by the Commission |
Amendment |
Right to lodge a complaint |
Right to lodge a complaint and right to an effective judicial remedy |
Amendment 235
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority, and inform the person who lodged the complaint thereof. |
Amendment 236
Proposal for a regulation
Article 43 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Pursuant to paragraph 1, the Digital Services Coordinator of establishment, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established on how the complaint has been handled. |
Amendment 237
Proposal for a regulation
Article 43 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Without prejudice to any other administrative or non-judicial remedy, a recipient shall have the right to an effective judicial remedy where the competent Digital Services Coordinator does not handle a complaint or does not inform the recipient within three months on the progress or outcome of the complaint lodged pursuant to paragraph 1. |
Amendment 238
Proposal for a regulation
Article 44 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned; |
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial authority, or an administrative authority pursuant to Article 8(1) or Article 9(4b), of the Member State of the Digital Services Coordinator concerned; |
Amendment 239
Proposal for a regulation
Article 45 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1. |
Amendment 240
Proposal for a regulation
Article 45 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Within two months of receiving information about measures as referred to in paragraph 7, the Commission shall conclude whether the assessment or the measures taken pursuant to that paragraph are incompatible with this Regulation. Where the Commission concludes that the assessment or the measures taken pursuant to paragraph 7 are incompatible with this Regulation, it shall take a final decision on this matter by means of an implementing act. That implementing act shall be adopted in accordance with the examination procedure referred to in Article 70(3). |
Amendment 241
Proposal for a regulation
Article 47 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) Contributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation; |
(a) Contributing to the consistent application of this Regulation across the Union and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation; |
Amendment 242
Proposal for a regulation
Article 48 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. |
5. The Board may invite experts and observers to attend its meetings, and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. |
Amendment 243
Proposal for a regulation
Article 48 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Board shall adopt its rules of procedure, following the consent of the Commission. |
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members. |
Amendment 244
Proposal for a regulation
Article 49 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation; |
(d) advise the Commission to take the measures referred to in Article 51 and adopt opinions on draft Commission measures and on other issues concerning very large online platforms in accordance with this Regulation; |
Amendment 245
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) issue opinions, recommendations or advice on matters related to Article 34. |
Amendment 246
Proposal for a regulation
Article 50 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35. |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. |
Amendment 247
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, or upon request of at least three of the Digital Services Coordinators of destination, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
Amendment 248
Proposal for a regulation
Article 52 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, with the exception of information covered by professional secrecy requirements or by immunities and privileges in accordance with the applicable law. |
Amendment 249
Proposal for a regulation
Article 56 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action. |
1. If, during proceedings under this Section, the very large online platform concerned offers lawful commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action. |
Amendment 250
Proposal for a regulation
Article 56 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) where the very large online platform concerned acts contrary to its commitments; or |
(b) where the very large online platform concerned acts contrary to its lawful commitments; or |
Amendment 251
Proposal for a regulation
Article 57 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation and the Charter by the very large online platform concerned, including the operation of any algorithm in the provision of the services of that platform. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. |
Amendment 252
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently: |
In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total world-wide turnover in the preceding financial year where it finds that that platform, intentionally or negligently: |
Amendment 253
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: |
The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total world-wide turnover in the preceding financial year, where they intentionally or negligently: |
Amendment 254
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average world-wide daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: |
Amendment 255
Proposal for a regulation
Article 67 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission shall adopt implementing acts laying down the practical and operational arrangements for the functioning of the information sharing system and its interoperability with other relevant systems. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. |
3. The Commission shall adopt implementing acts laying down the practical and operational arrangements for the functioning of the information sharing system and its interoperability with other relevant systems. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70(2). |
Amendment 256
Proposal for a regulation
Article 70 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where reference is made to this Article, Article 4 of Regulation (EU) No 182/2011 shall apply. |
2. Where reference is made to this paragraph, Article 4 of Regulation (EU) No 182/2011 shall apply. |
Amendment 257
Proposal for a regulation
Article 70 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Where reference is made to this paragraph, Article 5 of Regulation (EU) No 182/2011 shall apply. |
ANNEX: LIST OF ENTITIES OR PERSONS
FROM WHOM THE RAPPORTEUR HAS RECEIVED INPUT
The following list is drawn up on a purely voluntary basis under the exclusive responsibility of the rapporteur. The rapporteur has received input from the following entities or persons in the preparation of the opinion, until the adoption thereof in committee:
1. 5Rights Foundation
2. Access Now
3. Adevinta
4. Adigital
5. Advertising Information Group (AIG)
6. AirBnb Germany
7. Allied for Startups
8. Amazon
9. Amnesty International
10. APCO Worldwide
11. ARD and ZDF
12. Article 19
13. Association of Commercial Television in Europe (ACT)
14. Association of European Radios (AER)
15. Association of Television and Radio Sales Houses (EGTA)
16. Automattic, Jodel, Seznam, Twitter and Vimeo
17. Avaaz
18. AWO
19. Axel Springer
20. BEUC: The European Consumer Organisation
21. Bitkom
22. Bouygues Europe
23. Bundesverband Digitalpublisher und Zeitungsverleger (BDZV)
24. Bundesvereinigung Deutscher Apothekerverbände (ABDA)
25. Center for Democracy and Technology (CDT)
26. CENTR
27. Civil Liberties Union for Europe (Liberties)
28. Classifieds Marketplaces Europe (CME)
29. Cloud Infrastructure Services Providers in Europe (CISPE)
30. Cloudflare
31. Coalition for App Fairness (CAF)
32. Computer & Communications Industry Association (CCIA)
33. Deutscher Anwaltverein (DAV)
34. Deutscher Gewerkschaftsbund (DGB)
35. Digital Online Tech Europe (DOT)
36. Dropbox
37. DuckDuckGo
38. E-Commerce Europe (ECOM)
39. Electronic Frontier Foundation (EFF)
40. Etsy
41. EU DisinfoLab
42. Eurocities
43. EuroISPA
44. Europabeauftragter der deutschen Landesmedienanstalten (DLM)
45. Europe’s Videogaming Industry (ISFE)
46. European Association of E-Pharmacies (EAEP)
47. European Brands Association (AIM)
48. European Broadcasting Union (EBU)
49. European Cities
50. European Council of the Liberal Professions (CEPLIS)
51. European Digital Rights (EDRi)
52. European Disability Forum
53. European Federation of Journalists (EFJ)
54. European Games Developer Federation (EGDF)
55. European Gaming and Betting Association (EGBA)
56. European Holiday Home Association (EHHA)
57. European Internet Services Providers Association (EuroISPA)
58. European Magazine Media Association (EMMA)
59. European Media
60. European Newspaper Publishers' Association (ENPA)
61. European Policy Centre
62. European Regulators Group for Audiovisual Media Services (ERGA)
63. European Tech Alliance (EUTA)
64. European Telecommunications Network Operators' Association (ETNO)
65. Federation of European Data and Marketing (FEDMA)
66. Federation of Small Businesses (FSB)
67. Fondation Descartes
68. Gesellschaft für Freiheitsrechte (GFF)
69. Glassdoor
70. Global Witness
71. Google
72. GSM Association (GSMA)
73. Hate Aid
74. IAB Europe
75. Imaging Consumables Coalition of Europe, Middle East and Africa (ICCE)
76. Information Technology Industry Council (ITI)
77. International Video Federation (IVF)
78. Internet Commission
79. Internet Society
80. Magazine Media
81. Match Group
82. Microsoft
83. Missing Children Europe
84. Mozilla
85. News Media Europe
86. Orange
87. Panoptykon
88. Pinterest
89. Political Intelligence
90. Rakuten Group
91. Reddit
92. Reporters without Borders
93. Seznam.cz, Lilo, Google, Verizon Media and Microsoft
94. Shopify
95. Snap
96. Society of Audiovisual Authors (SAA)
97. Spitzenorganisation der Filmwirtschaft (SPIO)
98. Swedish Trade Association
99. Telefonica
100. Together Against Counterfeiting (TAC) Alliance
101. Tutanota
102. Twitch
103. Twitter
104. Verband der öffentlichen Wirtschaft und Gemeinwirtschaft Österreichs (VÖWG)
105. Verband Deutscher Zeitschriftenverleger e. V. (VDZ)
106. Verbraucherzentrale Bundesverband (vzbv)
107. Vodafone
108. Wikimedia Foundation
109. World Federation of Advertisers (WFA)
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
LIBE 8.2.2021 |
|||
Associated committees - date announced in plenary |
20.5.2021 |
|||
Rapporteur for the opinion Date appointed |
Patrick Breyer 22.4.2021 |
|||
Discussed in committee |
12.4.2021 |
3.6.2021 |
21.6.2021 |
14.7.2021 |
Date adopted |
14.7.2021 |
|
|
|
Result of final vote |
+: 37, –: 24, 0: 0 |
||
Members present for the final vote |
Magdalena Adamowicz, Katarina Barley, Pernando Barrena Arza, Pietro Bartolo, Nicolas Bay, Vladimír Bilčík, Vasile Blaga, Ioan-Rareş Bogdan, Patrick Breyer, Saskia Bricmont, Jorge Buxadé Villalba, Damien Carême, Caterina Chinnici, Clare Daly, Anna Júlia Donáth, Lena Düpont, Cornelia Ernst, Laura Ferrara, Nicolaus Fest, Jean-Paul Garraud, Maria Grapini, Sylvie Guillaume, Evin Incir, Sophia in ‘t Veld, Patryk Jaki, Marina Kaljurand, Fabienne Keller, Peter Kofod, Łukasz Kohut, Moritz Körner, Alice Kuhnke, Jeroen Lenaers, Juan Fernando López Aguilar, Lukas Mandl, Nuno Melo, Nadine Morano, Javier Moreno Sánchez, Maite Pagazaurtundúa, Nicola Procaccini, Emil Radev, Paulo Rangel, Ralf Seekatz, Michal Šimečka, Birgit Sippel, Sara Skyttedal, Martin Sonneborn, Tineke Strik, Ramona Strugariu, Annalisa Tardino, Dragoş Tudorache, Tom Vandendriessche, Bettina Vollath, Jadwiga Wiśniewska, Elena Yoncheva, Javier Zarzalejos |
|||
Substitutes present for the final vote |
Bartosz Arłukowicz, Damian Boeselager, Isabel Santos, Yana Toom, Miguel Urbán Crespo, Isabel Wiseler-Lima |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
37 |
+ |
ID |
Peter Kofod |
NI |
Laura Ferrara, Martin Sonneborn |
PPE |
Bartosz Arłukowicz |
Renew |
Anna Júlia Donáth, Sophia in 't Veld, Fabienne Keller, Moritz Körner, Maite Pagazaurtundúa, Michal Šimečka, Ramona Strugariu, Yana Toom, Dragoş Tudorache |
S&D |
Katarina Barley, Pietro Bartolo, Caterina Chinnici, Maria Grapini, Sylvie Guillaume, Evin Incir, Marina Kaljurand, Łukasz Kohut, Juan Fernando López Aguilar, Javier Moreno Sánchez, Isabel Santos, Birgit Sippel, Bettina Vollath, Elena Yoncheva |
The Left |
Pernando Barrena Arza, Clare Daly, Cornelia Ernst, Miguel Urbán Crespo |
Verts/ALE |
Damian Boeselager, Patrick Breyer, Saskia Bricmont, Damien Carême, Alice Kuhnke, Tineke Strik |
24 |
- |
ECR |
Jorge Buxadé Villalba, Patryk Jaki, Nicola Procaccini, Jadwiga Wiśniewska |
ID |
Nicolas Bay, Nicolaus Fest, Jean-Paul Garraud, Annalisa Tardino, Tom Vandendriessche |
PPE |
Magdalena Adamowicz, Vladimír Bilčík, Vasile Blaga, Ioan-Rareş Bogdan, Lena Düpont, Jeroen Lenaers, Lukas Mandl, Nuno Melo, Nadine Morano, Emil Radev, Paulo Rangel, Ralf Seekatz, Sara Skyttedal, Isabel Wiseler-Lima, Javier Zarzalejos |
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON ECONOMIC AND MONETARY AFFAIRS (26.10.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9‑0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Mikuláš Peksa
AMENDMENTS
The Committee on Economic and Monetary Affairs calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 1
|
|
Text proposed by the Commission |
Amendment |
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25 , new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole. |
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25 , new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks, not least cybersecurity risks, and challenges, both for individual users and for society and the economy as a whole. |
_________________ |
_________________ |
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). |
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). |
Amendment 2
Proposal for a regulation
Recital 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(1a) The digitalisation of European society and its economy often leaves policy makers, corporations and citizens struggling to catch up. Furthermore, the accumulation of data regularly creates an uneven competitive playing field on the market, since data is used as a tool to determine who enters and who exits the market. |
Amendment 3
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice without lock-in effects. |
Amendment 4
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
Amendment 5
Proposal for a regulation
Recital 4
|
|
Text proposed by the Commission |
Amendment |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
(4) Therefore, in order to safeguard and improve the functioning of the internal market and ensure that citizens’ fundamental rights are respected, a targeted set of uniform, effective, risk-based and proportionate mandatory rules should be established at Union level. This Regulation provides the right conditions and competitive settings for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, fostering interoperability and ensuring that new entrants can penetrate the market. By using requirements that are technology neutral, innovation and the competitiveness of Union companies should not be hampered but instead be stimulated. |
Amendment 6
Proposal for a regulation
Recital 5
|
|
Text proposed by the Commission |
Amendment |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and their responsibility to uphold fundamental rights. |
__________________ |
__________________ |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
Amendment 7
Proposal for a regulation
Recital 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(5a) Given the cross-border nature of the services concerned, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of access to, and choice of, those services by all consumers and other recipients of services, including by persons with disabilities, is protected throughout the Union. Lack of harmonised accessibility requirements for digital services and platforms would also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their online interfaces, should be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. That aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals. |
Amendment 8
Proposal for a regulation
Recital 8
|
|
Text proposed by the Commission |
Amendment |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. |
_________________ |
_________________ |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). |
Amendment 9
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined to cover information relating to illegal content, products, services and activities following the Member State of origin principle. The illegal nature of such content, products or services is defined by relevant Union law or national law in accordance with Union law. That concept should, for example, be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
Amendment 10
Proposal for a regulation
Recital 12 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(12a) Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including the content which represents an expression of polemic or controversial views in the course of public debate, should not be considered as illegal content. Similarly, material, such as an eye-witness video of a potential crime, should not be considered as illegal merely because it depicts an illegal act. An assessment shall determine the true purpose of that dissemination and whether the material is disseminated to the public for those purposes. |
Amendment 11
Proposal for a regulation
Recital 13
|
|
Text proposed by the Commission |
Amendment |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, comment sections, readers' forums or editorial communities of newspapers and editorial platforms could constitute such a feature, where it is clear that they are ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
Amendment 12
Proposal for a regulation
Recital 14
|
|
Text proposed by the Commission |
Amendment |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation, as they are not considered to be disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. |
__________________ |
__________________ |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36 |
Amendment 13
Proposal for a regulation
Recital 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(15a) The general collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy in the digital age. In line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the possibility to access, use and pay for information society services anonymously wherever technically possible. Similarly, users should have a right not to be subject to tracking when using information society services. To that end, the processing of personal data concerning the use of services should be limited to the extent strictly necessary to provide the service and to bill the users. |
Amendment 14
Proposal for a regulation
Recital 18
|
|
Text proposed by the Commission |
Amendment |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
(18) The exemptions from liability established in this Regulation should not apply where the provider of intermediary services has knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
Amendment 15
Proposal for a regulation
Recital 22
|
|
Text proposed by the Commission |
Amendment |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously and in good faith to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, without prejudice to Article 6, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
Amendment 16
Proposal for a regulation
Recital 27
|
|
Text proposed by the Commission |
Amendment |
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
(27) New technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem that is harder for policymakers to manage and for new entrants to penetrate. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services, Virtual Private Networks (VPN) or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. |
Amendment 17
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, nor should they use automated tools for content moderation. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Nothing in this Regulation should prevent providers from enacting end-to-end encryption of their services. |
Amendment 18
Proposal for a regulation
Recital 31
|
|
Text proposed by the Commission |
Amendment |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. |
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. |
Amendment 19
Proposal for a regulation
Recital 31 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(31a) The Commission should ensure the proper enforcement of this Regulation at Union and Member State level, in order to avoid potential inequalities, differences of approach and unfair competition within or from outside the Union. |
Amendment 20
Proposal for a regulation
Recital 35
|
|
Text proposed by the Commission |
Amendment |
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online. |
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should be obliged to comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices, safeguarding the competitive nature of the sector by ensuring that new entrants can penetrate the market, and protecting fundamental rights online. |
Amendment 21
Proposal for a regulation
Recital 36
|
|
Text proposed by the Commission |
Amendment |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. This contact point could be the same contact point that has been created in accordance with other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location. |
Amendment 22
Proposal for a regulation
Recital 38
|
|
Text proposed by the Commission |
Amendment |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. |
(38) Whilst the freedom of contract of providers of intermediary services should be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service, the avoidance of unfair or arbitrary outcomes and the protection of fundamental values such as freedom and pluralism of the media. |
Amendment 23
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in a standardised and machine-readable format and in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
__________________ |
__________________ |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 24
Proposal for a regulation
Recital 43
|
|
Text proposed by the Commission |
Amendment |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. In this regard, the Commission and Digital Services Coordinators should have the possibility to work together on information and guidelines for the voluntary implementation of this Regulation by micro or small enterprises. Furthermore, the Commission and Digital Services Coordinators are also encouraged to do so for medium enterprises, which, while not benefitting from the liability exemptions in Section 3, may sometimes lack the legal resources necessary to ensure proper understanding of and compliance with this Regulation. |
_________________ |
_________________ |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 25
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 26
Proposal for a regulation
Recital 48
|
|
Text proposed by the Commission |
Amendment |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a criminal offence. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. |
_________________ |
|
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). |
|
Amendment 27
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
Amendment 28
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of some of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System.45 The online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for that information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
_________________ |
_________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
Amendment 29
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. The advertisement-led model has generated deep changes in the way information is presented and has created new data collection patterns and business models that are not always positive. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that data collection is kept to a minimum, that the maximisation of revenue from advertising does not limit the quality of the service and that the recipients of the service have extensive individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
Amendment 30
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address challenges to fundamental rights, there being no alternative and less restrictive measures that would effectively achieve the same result. |
Amendment 31
Proposal for a regulation
Recital 54
|
|
Text proposed by the Commission |
Amendment |
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
(54) Very large online platforms may cause societal and economic risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative socioeconomic impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary, taking into account the evolution of the Union’s population. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. |
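Purely as an illustrative aid, the designation rule described in this recital reduces to a simple calculation. The sketch below (Python, with a hypothetical population figure and function names that appear nowhere in the Regulation) shows how the 45 million threshold follows from the 10% ratio and how it would move with the Union's population.

```python
# Non-normative sketch of the "very large online platform" threshold in
# Recital 54: 45 million recipients, i.e. roughly 10 % of the Union
# population. The population figure and all names are illustrative only.

UNION_POPULATION = 450_000_000   # hypothetical, for illustration
THRESHOLD_RATIO = 0.10           # 10 % of the Union population

def operational_threshold(population: int = UNION_POPULATION) -> int:
    """Threshold that a delegated act would update as population evolves."""
    return round(population * THRESHOLD_RATIO)

def is_very_large_online_platform(active_recipients: int) -> bool:
    """Designation test: recipients exceed the operational threshold."""
    return active_recipients > operational_threshold()

assert operational_threshold() == 45_000_000
print(is_very_large_online_platform(50_000_000))  # True
print(is_very_large_online_platform(30_000_000))  # False
```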
Amendment 32
Proposal for a regulation
Recital 55
|
|
Text proposed by the Commission |
Amendment |
(55) In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base to be able to timely identify the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation. |
(55) In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market, economic actors and consumers. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base to be able to timely identify the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation. |
Amendment 33
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement at both Union and national level, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate and transparent mitigating measures to redress, in particular, filter bubbles and filtering effects. |
Amendment 34
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant societal and economic systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 35
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. Such reinforcement could include the expansion of, and allocation of resources to, content moderation in languages other than English. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, the competitive aspect of the economy, security of trade, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
Amendment 36
Proposal for a regulation
Recital 60 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(60a) Auditors of digital services, whether independent or not, need to have specific technological and operational competences and expertise in the sector. They also need to be knowledgeable in the relevant social, economic and human rights issues, among others. Whether as SMEs or multinationals, extensions of existing accountancy and auditing, legal, and ICT consultancy or similar companies cannot be automatically assumed to have the required know-how to qualify as auditors. Member States and the Commission should therefore develop protocols – following consultation with all actors involved – by which to assess and accredit auditors of digital services, preferably according to clear rules based on Union practice, and thereby to establish registers of accredited auditors on a national and on a Union level. |
Amendment 37
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients are appropriately informed of the use of recommender systems, and that recipients can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. Very large online platforms should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
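The duties described here, disclosing the main parameters of a recommender system and offering an alternative not based on profiling, can be pictured with a minimal sketch. The following Python fragment is an assumption-laden illustration, not an implementation prescribed by the text; all names and weights are hypothetical.

```python
# Hypothetical recommender that (i) publishes its main parameters and
# (ii) offers a ranking option not based on profiling (Recital 62).
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    timestamp: float          # seconds since epoch
    topical_relevance: float  # 0..1, match with the query or topic
    profile_affinity: float   # 0..1, derived from the recipient's profile

# Plain-language disclosure of the main parameters, per ranking option.
MAIN_PARAMETERS = {
    "profiled": "60 % topical relevance, 40 % affinity with your profile",
    "non_profiled": "reverse chronological order; no personal profile used",
}

def rank(items: list[Item], use_profiling: bool) -> list[Item]:
    if use_profiling:
        score = lambda i: 0.6 * i.topical_relevance + 0.4 * i.profile_affinity
    else:
        score = lambda i: i.timestamp  # alternative option without profiling
    return sorted(items, key=score, reverse=True)

items = [Item("a", 100.0, 0.9, 0.1), Item("b", 200.0, 0.2, 0.9)]
print(MAIN_PARAMETERS["non_profiled"])
print([i.item_id for i in rank(items, use_profiling=False)])  # ['b', 'a']
```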
Amendment 38
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by very large online platforms pose particular risks at both the economic and the political level, and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. In particular, the accumulation of personal data by online platforms is converted into massive commercial assets, often used to give an advantage to certain economic players. Therefore, very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
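To make the repository requirement concrete, a single entry could carry the elements the recital lists: the advertisement's content, the advertiser, and delivery data including targeting. The sketch below shows one possible shape, with hypothetical field names; the Regulation does not prescribe a schema.

```python
# One possible record shape for a public advertisement repository
# (Recital 63). Field names are assumptions, not a mandated schema.
from dataclasses import dataclass, asdict
import json

@dataclass
class AdRepositoryEntry:
    ad_id: str
    content: str                # the creative as it was displayed
    advertiser: str             # on whose behalf the advertisement ran
    first_shown: str            # ISO 8601 dates of the delivery period
    last_shown: str
    targeting_parameters: dict  # main targeting criteria, where used
    recipients_reached: int     # aggregate delivery figure

entry = AdRepositoryEntry(
    ad_id="2021-000001",
    content="Example creative text",
    advertiser="Example Ltd",
    first_shown="2021-01-01",
    last_shown="2021-01-31",
    targeting_parameters={"age_range": "18-35", "interests": ["sports"]},
    recipients_reached=120_000,
)
print(json.dumps(asdict(entry), indent=2))  # publicly accessible record
```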
Amendment 39
Proposal for a regulation
Recital 65
|
|
Text proposed by the Commission |
Amendment |
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, very large online platforms should appoint compliance officers, which should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation. |
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society and the economy, very large online platforms should appoint compliance officers, which should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation. |
Amendment 40
Proposal for a regulation
Recital 65 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(65a) Interoperability requirements for very large online platforms are desirable as they can create new opportunities for the development of innovative services, overcome the lock-in effect of closed platforms and ensure competition and user choice. Very large online platforms should provide an application programming interface through which third-party platforms and their recipients can interoperate with the ancillary services and, where possible, the main functionalities and recipients of the core services offered by the platform. The interoperability requirements do not prevent platforms from offering non-core additional features to their recipients. |
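What such an application programming interface might cover can be sketched abstractly. The interface below is a speculative outline under the assumptions of this recital (ancillary services plus, where possible, main functionalities); every method name is invented for illustration.

```python
# Speculative outline of the interoperability interface Recital 65a
# envisages. All method names are hypothetical; the class only defines
# a surface that a very large online platform might expose.
from abc import ABC, abstractmethod

class InteroperabilityAPI(ABC):
    """Surface a very large online platform might offer to third parties."""

    @abstractmethod
    def fetch_public_posts(self, account: str, since: str) -> list[dict]:
        """Read public content of a consenting recipient (core functionality)."""

    @abstractmethod
    def deliver_message(self, sender: str, recipient: str, body: str) -> None:
        """Deliver a message across platforms (ancillary service)."""

    @abstractmethod
    def export_recipient_data(self, account: str) -> bytes:
        """Machine-readable export, easing switching between platforms."""
```

Non-core additional features would sit outside this surface, consistent with the recital's final sentence.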
Amendment 41
Proposal for a regulation
Recital 66
|
|
Text proposed by the Commission |
Amendment |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, the interoperability of content hosting platforms, or the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. |
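For the notice-submission example the recital gives, a standardised payload sent over an application programming interface might look as follows. This is a sketch only: the endpoint, field names and transport are assumptions, since no such standard is defined in the text.

```python
# Hypothetical standardised notice submitted through an API (Recital 66).
# Endpoint and field names are assumptions, not an agreed industry standard.
import json
from typing import Optional
from urllib import request

def submit_notice(api_base: str, content_url: str, reason: str,
                  explanation: str, contact: Optional[str] = None) -> int:
    notice = {
        "content_url": content_url,    # exact location of the content
        "alleged_illegality": reason,  # e.g. "counterfeit goods"
        "explanation": explanation,    # why the notifier considers it illegal
        "contact": contact,            # optional contact details
    }
    req = request.Request(
        f"{api_base}/notices",
        data=json.dumps(notice).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # returns the HTTP status code
        return resp.status

# Usage (requires a live endpoint):
# submit_notice("https://platform.example/api",
#               "https://platform.example/item/1",
#               "counterfeit goods", "Uses a protected trademark")
```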
Amendment 42
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society, the economy and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which from a microeconomic perspective are particularly harmful for vulnerable recipients of the service, such as children, but which could also hamper the competitive aspect of the market. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
Amendment 43
Proposal for a regulation
Recital 71
|
|
Text proposed by the Commission |
Amendment |
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
(71) In case of extraordinary circumstances affecting public security, the economy of one or more Member States, or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
Amendment 44
Proposal for a regulation
Recital 71 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(71a) In order to ensure that the systemic role of very large online platforms does not endanger the internal market by unfairly excluding innovative new entrants, including SMEs, entrepreneurs and start-ups, additional rules are needed to allow recipients of a service to switch or connect and interoperate between online platforms or internet ecosystems. Therefore, interoperability obligations should require very large online platforms to share appropriate tools, data, expertise, and resources. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of technical interfaces (application programming interfaces), that would allow recipients of the service or other market participants to benefit from the key functionalities of very large online platforms to exchange information. |
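From the recipient's side, the switching and connecting this recital describes amounts to exchanging information through a common format. The sketch below illustrates that idea with an invented exchange format and hypothetical hooks; it does not correspond to any standard the Commission has adopted.

```python
# Illustration of recipient-side switching via a common exchange format
# (Recital 71a). The format and both hooks are invented for this sketch.
import json

def port_posts(source_export: bytes, target_import) -> int:
    """Translate one platform's export into a shared format and feed it
    to another platform's import hook."""
    posts = json.loads(source_export.decode("utf-8"))
    shared = [{"author": p["author"], "body": p["text"], "date": p["date"]}
              for p in posts]
    for post in shared:
        target_import(post)  # e.g. a call into the target platform's API
    return len(shared)

received: list[dict] = []
export = json.dumps(
    [{"author": "alice", "text": "hello", "date": "2021-12-20"}]
).encode("utf-8")
print(port_posts(export, received.append))  # 1 post ported
```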
Amendment 45
Proposal for a regulation
Recital 77
|
|
Text proposed by the Commission |
Amendment |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers, human resources and financial means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Furthermore, the Digital Services Coordinator of each Member State should establish a structured working relationship with the national competition authorities as well as the financial regulatory authorities working on their territory. |
Amendment 46
Proposal for a regulation
Recital 87
|
|
Text proposed by the Commission |
Amendment |
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, the Commission to intervene and exercise its investigatory and enforcement powers under this Regulation. |
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, assistance from the Commission or otherwise ask the Commission to intervene and exercise its investigatory and enforcement powers under this Regulation. |
Amendment 47
Proposal for a regulation
Recital 93 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(93a) However, the digital services sector is a fast-moving one in which the Union cannot afford regulation that lags behind technological and operational innovation. Governance structures should remain fit for purpose, flexible and transparent. While ensuring accountability on the part of players in the sector, those structures should themselves remain accountable. Regulatory structures in which any one institution is granted powers such that it can seemingly operate as prosecution, jury and judge could easily create problems of checks and balances, thereby stimulating more litigation; they could also be less flexible in dealing with innovation. Therefore, the Board should, during the first five years after the entry into force of this Regulation, carry out a continuous assessment of the governance structures related to this Regulation and, where appropriate, make recommendations for their improvement, their streamlining, and the consolidation of effective checks and balances mechanisms. |
Amendment 48
Proposal for a regulation
Recital 94
|
|
Text proposed by the Commission |
Amendment |
(94) Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal harms, while such failures may also be particularly complex to identify and address. |
(94) Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal and economic harms, while such failures may also be particularly complex to identify and address. |
Amendment 49
Proposal for a regulation
Recital 97
|
|
Text proposed by the Commission |
Amendment |
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
(97) The Commission should, on the basis of this Regulation and other relevant Union law, decide whether or not to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. |
Amendment 50
Proposal for a regulation
Recital 99
|
|
Text proposed by the Commission |
Amendment |
(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. |
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. |
Amendment 51
Proposal for a regulation
Recital 100
|
|
Text proposed by the Commission |
Amendment |
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. |
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for systemic non-compliance with the relevant obligations and breach of the procedural rules, subject to appropriate limitation periods. Systemic non-compliance is a pattern of online harm that, when the individual harms are added up, constitutes an aggregation of systemic harm to active recipients of the service across three or more Member States. |
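The definition added here contains a concrete quantitative test: harms aggregate into systemic non-compliance once active recipients in three or more Member States are affected. Read literally, the test could be expressed as below; the sketch is an illustrative reading, not an enforcement tool.

```python
# Illustrative reading of "systemic non-compliance" as defined in the
# amended Recital 100: aggregated harm to active recipients across
# three or more Member States.

def is_systemic(harm_reports: list[str]) -> bool:
    """harm_reports holds the Member State in which each individual
    harm to an active recipient of the service occurred."""
    return len(set(harm_reports)) >= 3

print(is_systemic(["DE", "FR", "NL", "DE"]))  # True: three Member States
print(is_systemic(["DE", "DE", "FR"]))        # False: only two
```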
Amendment 52
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes: |
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes: |
Amendment 53
Proposal for a regulation
Article 1 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) contribute to the proper functioning of the internal market for intermediary services; |
(a) contribute to the proper functioning of the internal market for intermediary services and affected economic actors and encourage competition; |
Amendment 54
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out uniform rules for a safe, accessible, including for persons with disabilities, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected; |
Amendment 55
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) achieve a high level of consumer protection in the Digital Single Market. |
Amendment 56
Proposal for a regulation
Article 1 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. |
3. This Regulation shall apply to intermediary services directed at and provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. |
Amendment 57
Proposal for a regulation
Article 1 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. This Regulation shall respect the fundamental rights recognised by the Charter and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation may only be interpreted and applied in accordance with those fundamental rights, including the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved shall aim to achieve, in situations where the relevant fundamental rights conflict, a fair and proportionate balance between the rights concerned. |
Amendment 58
Proposal for a regulation
Article 1 – paragraph 5 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) Directive (EU) 2019/882; |
Amendment 59
Proposal for a regulation
Article 1 – paragraph 5 – point b b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(bb) Directive (EU) 2019/770 - digital content; |
Amendment 60
Proposal for a regulation
Article 1 – paragraph 5 – point b c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(bc) COM/2018/819 - Directive on distance sales of goods; |
Amendment 61
Proposal for a regulation
Article 1 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. The Commission shall by ... [one year after the date of adoption of this Regulation] publish guidelines with regard to the relations between this Regulation and the legislative acts listed in Article 1(5). Those guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, specifying which act prevails where actions, in line with this Regulation, fulfil the obligations of another legislative act, and which regulatory authority is competent. |
Amendment 62
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a significant number of users in one or more Member States; or |
— a significant number of average monthly active recipients in one or more Member States; or |
Amendment 63
Proposal for a regulation
Article 2 – paragraph 1 – point g
|
|
Text proposed by the Commission |
Amendment |
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
(g) ‘illegal content’ means any information, which, in itself or through the sale of products or provision of services, is not in compliance with Union law or the law of the Member State of origin, irrespective of the precise subject matter or nature of that law; |
Amendment 64
Proposal for a regulation
Article 2 – paragraph 1 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature or functionality of another service or of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation. |
Amendment 65
Proposal for a regulation
Article 2 – paragraph 1 – point n
|
|
Text proposed by the Commission |
Amendment |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; |
(n) ‘advertisement’ means information designed to directly or indirectly promote or rank information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface or parts thereof against direct or indirect remuneration specifically for promoting that information, product or service; |
Amendment 66
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed, as well as ranking and prioritisation techniques; |
Amendment 67
Proposal for a regulation
Article 2 – paragraph 1 – point p
|
|
Text proposed by the Commission |
Amendment |
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
(p) ‘content moderation’ means the activities, either through automated or manual means, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, monetisation and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; |
Amendment 68
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘online marketplace’ means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader which allows consumers to conclude distance contracts with other traders or consumers, according to Directive (EU) 2019/2161; |
Amendment 69
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qb) ‘trusted flagger’ means an economically and politically neutral entity representing collective interests which is dedicated to detecting, identifying and notifying illegal content and has relevant expertise and competence; |
Amendment 70
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qc) ‘persons with disabilities’ means persons within the meaning of Article 3(1) of Directive (EU) 2019/882; |
Amendment 71
Proposal for a regulation
Article 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 2a |
|
Protection of consumer rights in a data based economy |
|
1. Where technically possible and in accordance with Union law, a provider of an information society service acting as a provider of an intermediary service shall enable the use of, and payment for, that service without collecting the personal data of the recipient. |
|
2. A provider of an information society service acting as a provider of an intermediary service shall process personal data concerning the use of the service by a recipient only to the extent strictly necessary to enable the recipient to use the service or to charge the recipient for the use of the service. An operator of an online platform shall only be allowed to process personal data concerning the use of the service by a recipient for the sole purpose of operating a recommender system where the recipient has given his or her explicit consent, as defined in Article 4(11) of Regulation (EU) 2016/679. Member States shall not require a provider of information society services to retain personal data concerning the use of the service by all recipients. |
|
3. A provider of an information society service shall have the right to provide and support end-to-end encryption services. |
|
4. User profiling carried out by information society service providers shall only be conducted on the basis of the data provided with the user’s clear consent, in line with Regulation (EU) 2016/679. Information society service providers are explicitly prohibited from carrying out profiling of third persons who are not users of the service. |
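Taken together, paragraphs 2 and 4 of this Article gate any profiling-based processing on explicit consent within the meaning of Regulation (EU) 2016/679. A minimal sketch of that gate, with invented names, might read:

```python
# Minimal sketch of the consent gate in Article 2a(2) and (4): usage data
# may feed a recommender system only with the recipient's explicit
# consent (Article 4(11) of Regulation (EU) 2016/679). Names are invented.
from dataclasses import dataclass

@dataclass
class Recipient:
    recipient_id: str
    explicit_consent: bool  # freely given, specific, informed, unambiguous

def recommender_input(recipient: Recipient, usage_data: dict) -> dict:
    """Return only the data that may lawfully feed the recommender."""
    if recipient.explicit_consent:
        return usage_data  # consent given: usage data may be processed
    return {}              # no consent: no profiling input at all

print(recommender_input(Recipient("u1", False), {"watched": ["x"]}))  # {}
```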
Amendment 72
Proposal for a regulation
Article 4 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
(e) the provider acts expeditiously and in good faith to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. |
Amendment 73
Proposal for a regulation
Article 5 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or |
(a) does not have actual knowledge of illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal content is apparent; or |
Amendment 74
Proposal for a regulation
Article 5 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content. |
(b) upon obtaining such knowledge or awareness, acts expeditiously and in good faith to remove or to disable access to the illegal content. |
Amendment 75
Proposal for a regulation
Article 5 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders. It is important that hosting services adopt the highest standards of transparency to highlight, in a way that would lead an average and reasonably well-informed consumer to understand, that the information comes from a third party and is not offered by the hosting service. |
Amendment 76
Proposal for a regulation
Article 6 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as human oversight, documentation, traceability or any additional measure needed to ensure that own-initiative investigations are accurate, fair, non-discriminatory and transparent. |
Amendment 77
Proposal for a regulation
Article 7 – title
|
|
Text proposed by the Commission |
Amendment |
No general monitoring or active fact-finding obligations |
No general monitoring or active fact-finding or automated content moderation obligations |
Amendment 78
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. No provision of this Regulation shall be understood as mandating, requiring or recommending the use of automated decision-making, or the monitoring of the behaviour of a large number of natural persons. When using automated content moderation tools, intermediary services should always ensure human oversight of each decision to remove, disable, restrict or modify in any way the information content. |
Amendment 79
Proposal for a regulation
Article 7 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
This Regulation shall not prevent providers from offering end-to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability. |
Amendment 80
Proposal for a regulation
Article 8 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, execute that order and inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
Amendment 81
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider of the service and to the recipient of the service who provided the content; |
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, which may be sought in the Member State of establishment of the provider of the service and/or in the Member State of establishment of the recipient of the service who provided the content; |
Amendment 82
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
- the order is transmitted via secure channels established between the Digital Services Coordinator of establishment and the providers of intermediary services; |
Amendment 83
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
- the order shall clarify the neutrality and non-discriminatory approach of the decision; |
Amendment 84
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
- precise identification elements of the recipients of the service concerned; |
Amendment 85
Proposal for a regulation
Article 9 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the order is transmitted via secure channels established between the Digital Services Coordinator of establishment and the providers of intermediary services. |
Amendment 86
Proposal for a regulation
Article 10 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. |
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact, in a clear and user-friendly manner. |
Amendment 87
Proposal for a regulation
Article 10 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision. |
Amendment 88
Proposal for a regulation
Article 10 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Providers of intermediary services shall specify in the information referred to in paragraph 2, the official language or languages of the Union, which can be used to communicate with their points of contact and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established. |
3. Providers of intermediary services shall specify in the information referred to in paragraph 2, the official language or languages of the Union, which can be used to communicate with their points of contact and which shall include the official languages of the Member State in which the provider of intermediary services has its main establishment or offers its activities or where its legal representative resides or is established. |
Amendment 89
Proposal for a regulation
Article 10 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Any request to providers of intermediary services, according to this Regulation, shall be transmitted through the Digital Services Coordinator in the Member State of establishment, who is responsible for collecting requests and information from all relevant sources. |
Amendment 90
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. Such restrictions shall in no way serve to provide selected economic actors with competitive advantages. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. The use of algorithmic decision-making processes shall be notified to users whenever such processes are applied. |
Amendment 91
Proposal for a regulation
Article 12 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Providers of intermediary services shall list the restrictions in relation to the use of their service for the dissemination of content deemed illegal under Union or Member State law in a clear and user-friendly manner separately from the general conditions for the use of their service so as to make the user aware of what is deemed illegal and what is subject to the terms and conditions for the use of the service. |
Amendment 92
Proposal for a regulation
Article 12 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. The information mentioned in paragraphs 1 and 1a shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible and machine-readable format. Providers of intermediary services except those that qualify as micro or small enterprises within the meaning of the Annex to the Commission Recommendation 2003/361/EC shall make publicly available a summary of the terms and conditions, setting out the most important points in concise, clear and unambiguous language. |
Amendment 93
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a diligent, transparent, non-discriminatory, coherent, predictable, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 94
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights as enshrined in the Charter and in international law. |
Amendment 95
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Terms and conditions that do not comply with this Article shall not be binding on recipients in accordance with Directive 93/13/EEC. |
Amendment 96
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. All changes in terms and conditions shall be fully in accordance with this Article. Intermediary service providers shall inform the users of all changes in terms and conditions at least one month before their implementation. |
Amendment 97
Proposal for a regulation
Article 12 – paragraph 2 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
2d. To preserve and strengthen the internal market and the transparency of the services provided, the provider shall, as much as possible, use similar terms and conditions across the whole internal market, with divergences being clearly marked and justified. |
Amendment 98
Proposal for a regulation
Article 12 – paragraph 2 e (new)
|
|
Text proposed by the Commission |
Amendment |
|
2e. The very large online platforms shall consult the Digital Services Coordinator on their terms of service and take into account the recommendations that the Digital Services Coordinator may have. |
Amendment 99
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish in a standardised and machine-readable format, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
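By way of illustration only, and not as part of the legislative text: a report published "in a standardised and machine-readable format" under Article 13(1) as amended implies a common data schema. The following minimal Python sketch shows one possible shape for a single reporting record; all field names and figures are assumptions, not prescribed by the Regulation.

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class ModerationReportEntry:
        """Hypothetical single record of an Article 13 transparency report."""
        period: str                    # reporting period covered
        orders_received: int           # orders under Articles 8 and 9
        notices_received: int          # notices submitted under Article 14
        own_initiative_actions: int    # voluntary measures referred to in Article 6
        complaints_received: int       # complaints under Article 17
        median_action_time_hours: float

    entry = ModerationReportEntry(
        period="2021",
        orders_received=120,
        notices_received=45000,
        own_initiative_actions=3800,
        complaints_received=900,
        median_action_time_hours=18.5,
    )
    print(json.dumps(asdict(entry)))  # machine-readable serialisation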
Amendment 100
Proposal for a regulation
Article 13 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; |
(c) the content moderation engaged in through the provider's voluntary own-initiative investigations as referred to in Article 6, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as the measures taken to qualify content moderators and the safeguards to ensure that non-infringing content is not affected; |
Amendment 101
Proposal for a regulation
Article 13 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
2. Points (b), (c) and (d) of paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to the Commission Recommendation 2003/361/EC. |
Amendment 102
Proposal for a regulation
Article 14 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
1. Providers of hosting services shall put mechanisms in place to allow any individual or non-governmental entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means and may include a clearly identifiable banner or single reporting button, allowing the users of those services to notify the providers of hosting services in a quick and easy manner. |
Amendment 103
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) where possible, a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
Amendment 104
Proposal for a regulation
Article 14 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; |
(c) where possible, the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; |
Amendment 105
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) where an alleged infringement of an intellectual property right is notified, evidence that the entity submitting the notice is the holder of the intellectual property right that is allegedly infringed or is authorised to act on behalf of the holder of that right; |
Amendment 106
Proposal for a regulation
Article 14 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete. |
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete as well as the relationship, economic or otherwise, if any, the individual or entity has with the notified entity. |
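By way of illustration only: points (b) to (d) of Article 14(2), as amended above, together with point (a) of the Commission proposal (the explanation of the alleged illegality), define the data elements of a valid notice. A minimal Python sketch of such a notice follows; the field names are assumptions, and the optional fields mirror the "where possible" qualifiers introduced by Amendments 103 and 104.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Notice:
        """Hypothetical representation of an Article 14(2) notice."""
        explanation: str                                  # point (a): why the content is considered illegal
        urls: List[str] = field(default_factory=list)     # point (b): exact URL(s), where possible
        notifier_name: Optional[str] = None               # point (c): where possible; omitted for
        notifier_email: Optional[str] = None              #   offences under Directive 2011/93/EU
        ip_rights_evidence: Optional[str] = None          # point (ca): right holder or authorised agent
        good_faith_statement: bool = False                # point (d): accuracy and completeness
        relationship_with_notified: Optional[str] = None  # point (d): economic or other relationship

    def gives_rise_to_knowledge(notice: Notice, illegality_unequivocal: bool) -> bool:
        # Paragraph 3 as amended: actual knowledge arises solely for the item
        # concerned, and only where the illegality is unequivocally identifiable.
        return notice.good_faith_statement and bool(notice.explanation) and illegality_unequivocal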
Amendment 107
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 solely in respect of the specific item of information concerned, when the provider of hosting services can unequivocally identify the illegal nature of the content. |
Amendment 108
Proposal for a regulation
Article 14 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Upon receipt of the notice of alleged copyright infringement, the service provider shall notify the information providers, using available contact details, of the elements referred to in paragraph 2 and give them the opportunity to reply, within a minimum of 5 working days, before taking a decision and, if applicable, before disabling access to the referred content. |
Amendment 109
Proposal for a regulation
Article 14 – paragraph 4 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
4b. The provider shall ensure that decisions on notices are taken by qualified staff provided with appropriate working conditions, including professional support, qualified psychological assistance and legal advice. |
Amendment 110
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without undue delay, notify that individual or entity who submitted the notice and the information provider of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
Amendment 111
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, fair and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. That notification shall include information about the procedure followed, the technology used and the criteria and reasoning supporting the decision, as well as the logic involved in the automated decision-making. |
Amendment 112
Proposal for a regulation
Article 15 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; |
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope and the duration of the disabling of access; |
Amendment 113
Proposal for a regulation
Article 15 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground; |
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground including explanations in relation to the arguments submitted under Article 14(2)(a), where relevant; |
Amendment 114
Proposal for a regulation
Article 15 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; |
(e) where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider or incompatibility with fundamental rights, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; |
Amendment 115
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Where a provider of hosting services decides not to remove or disable access to specific items of information provided by the recipients of the service, detected through the mechanisms established in Article 14, it shall inform the user who notified the online platform of the content and, where needed, the recipient of the decision without undue delay. The notification of such a decision can be done through automated means. |
Amendment 116
Proposal for a regulation
Article 15 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the redress possibilities referred to in point (f) of paragraph 2. |
3. The information provided by the providers of hosting services in accordance with this Article shall be accessible, including for persons with disabilities, clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the redress possibilities referred to in point (f) of paragraph 2. |
Amendment 117
Proposal for a regulation
Article 15 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data. |
4. Very large online platforms shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible, including for persons with disabilities, machine-readable and reusable database managed and published by the Commission. That information shall not contain personal data. |
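By way of illustration only: a database that is machine-readable, reusable and free of personal data, as required by Article 15(4) as amended, could hold records along the following lines. This is a sketch with assumed field names; the actual schema would be for the Commission to define.

    import json
    from dataclasses import dataclass, asdict
    from typing import Optional

    @dataclass
    class StatementOfReasons:
        """Hypothetical record for the public database; personal data is
        assumed to have been stripped before publication."""
        decision_type: str                 # removal or disabling of access (paragraph 2, point (a))
        territorial_scope: str             # where relevant
        duration: Optional[str]            # duration of the disabling, added by Amendment 112
        automated_means_used: bool         # paragraph 2, point (c)
        legal_ground: Optional[str]        # paragraph 2, point (d)
        contractual_ground: Optional[str]  # paragraph 2, point (e)
        explanation: str                   # reasoning, without personal data

    record = StatementOfReasons(
        decision_type="disabling of access",
        territorial_scope="Union-wide",
        duration="30 days",
        automated_means_used=False,
        legal_ground=None,
        contractual_ground="terms and conditions, prohibition of spam",
        explanation="Bulk unsolicited commercial content.",
    )
    print(json.dumps(asdict(record)))  # machine-readable and reusable output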
Amendment 118
Proposal for a regulation
Article 15 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Micro and small enterprises within the meaning of the Annex to the Commission Recommendation 2003/361/EC shall be exempt from the obligations set out in points (b), (c) and (f) of paragraph 2 of this Article. |
Amendment 119
Proposal for a regulation
Article 16 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises within the meaning of the Annex to the Commission Recommendation 2003/361/EC. |
Amendment 120
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
1. Online platforms shall provide to all recipients of the service and qualified entities as defined in Article 3, point (4) of Directive (EU) 2020/1828, for a period of at least six months following the decision referred to in this paragraph, the access to an effective and user-friendly internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge. Complaints can be lodged against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
Amendment 121
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove, disable, restrict or in any other way modify access to the information; |
Amendment 122
Proposal for a regulation
Article 17 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Complaints can also be lodged against decisions made by the online platform not to remove, not to disable, not to suspend and not to terminate access to accounts. |
Amendment 123
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure for their internal complaint handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities. |
Amendment 124
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, transparent and non-arbitrary manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. If requested by the complainant, the online platform shall also inform the public of the reversal of the decision. Without prejudice to horizontal laws, where the decision referred to in paragraph 1 is manifestly wrong and infringes the fundamental rights of the recipient of the service, the very large online platform shall provide financial compensation. When determining the amount of the financial compensation the very large online platform shall also take into account whether the decision referred to in paragraph 1 prevented the recipient of the service from benefitting from the use of the platform. |
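By way of illustration only, the review step of Article 17(3) as amended can be read as a simple decision rule: reverse where the complaint shows the content to be neither illegal nor incompatible with the terms, or the conduct not to warrant suspension, and announce the reversal publicly on request. A Python sketch under those assumptions (all names hypothetical):

    from dataclasses import dataclass

    @dataclass
    class Complaint:
        """Hypothetical Article 17 complaint."""
        shows_content_not_illegal: bool
        shows_content_compatible_with_terms: bool
        shows_conduct_not_warranting_suspension: bool
        requests_public_notice_of_reversal: bool

    def review(c: Complaint) -> str:
        # Article 17(3): reverse the decision without undue delay where the
        # complaint contains sufficient grounds; Amendment 124 adds the public
        # notice of the reversal at the complainant's request.
        if ((c.shows_content_not_illegal and c.shows_content_compatible_with_terms)
                or c.shows_conduct_not_warranting_suspension):
            return ("reversed (publicly announced)"
                    if c.requests_public_notice_of_reversal else "reversed")
        return "upheld"

    print(review(Complaint(True, True, False, True)))  # -> reversed (publicly announced)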
Amendment 125
Proposal for a regulation
Article 17 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means and are reviewed by qualified staff provided with appropriate working conditions, including professional support, qualified psychological assistance and legal advice. |
Amendment 126
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Recipients of the service addressed by the decisions referred to in Article 17(1) and qualified entities as defined in Article 3, point (4) of Directive (EU) 2020/1828, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected by the recipient with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Amendment 127
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the dispute settlement is easily accessible through electronic communication technology; |
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology; |
Amendment 128
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union; |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and at least in the language of the recipient to whom the decision referred to in Article 17 is addressed; |
Amendment 129
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) the dispute settlement takes place in accordance with clear, transparent and fair rules of procedure. |
Amendment 130
Proposal for a regulation
Article 18 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive. |
6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive, and also does not affect the recipient’s right to settle disputes in court. |
Amendment 131
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) it is operationally independent from government and public authorities and it does not have a conflict of interest related to the submission of these notices. |
Amendment 132
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(cb) it publishes, at least once a year, a clear and easily comprehensible report on notices submitted in accordance with Article 14 during the relevant period covered by the report. The report shall contain: |
|
- a summary of notices categorised by the identity of the provider of hosting services; |
|
- the type of content notified; |
|
- the specific legal provisions allegedly breached by the content notified; |
|
- the action taken by the provider; |
|
- any potential conflicts of interest and sources of funding; and |
|
- an explanation of the procedures in place to ensure the trusted flagger maintains its independence. |
Amendment 133
Proposal for a regulation
Article 19 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators and the Board shall communicate the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
Amendment 134
Proposal for a regulation
Article 19 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. |
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database in an easily accessible, including for persons with disabilities, and machine-readable format and keep the database updated. |
Amendment 135
Proposal for a regulation
Article 19 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
Amendment 136
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall suspend, only for a reasonably short period of time and after having issued a prior warning and provided a comprehensive explanation, the provision of their services to recipients of the service that frequently provide illegal content. |
Amendment 137
Proposal for a regulation
Article 20 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
(a) the absolute numbers of items of illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
Amendment 138
Proposal for a regulation
Article 20 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The provider shall ensure that the assessment is carried out by qualified staff provided with appropriate working conditions, including professional support, qualified psychological assistance and legal advice. |
Amendment 139
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Online platforms shall set out, in a clear, accessible, including for persons with disabilities, and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
Amendment 140
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. |
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or its legal representative and also transmit the information to Europol for appropriate follow-up. |
Amendment 141
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: |
1. Where an online platform allows consumers to conclude distance contracts with traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the trader has provided the following information to the online platform: |
Amendment 142
Proposal for a regulation
Article 22 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; |
(b) a passport or a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; |
_________________ |
_________________ |
50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC |
50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC |
Amendment 143
Proposal for a regulation
Article 22 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; |
(d) to the extent the contract relates to products that are subject to the Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council, the name, address, telephone number and electronic mail address of the economic operator established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020 of the European Parliament and the Council51; |
_________________ |
_________________ |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
Amendment 144
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Further to point (f), traders from within the Union and from third countries shall have the option to voluntarily upload the relevant documents certifying that their goods meet the consumer protection standards of the Union. Online platforms that facilitate the sale of harmonised consumer goods by a seller in a third country to a consumer in the Union, shall make reasonable efforts to verify that the product bears the required conformity mark (CE mark) and that it has other relevant documents (e.g. EU declaration of conformity). Those provisions are without prejudice to Article 6 of Directive 2011/83/EU, Article 7 of Directive 2005/29/EC and Article 4 of Regulation (EU) 2019/1020. |
Amendment 145
Proposal for a regulation
Article 22 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. Provided that the online platform has made reasonable efforts to assess the information in points (a), (d) and (e), the online platform shall not be held liable for inaccurate information provided by the trader. |
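By way of illustration only: the "reasonable efforts" required by Article 22(2) amount in practice to checking the trader's self-declared details against an official register. The Python sketch below uses a stubbed lookup standing in for "any freely accessible official online database or online interface"; no real register API is implied.

    from typing import Callable, Dict

    def verify_trader(trader: Dict[str, str],
                      official_lookup: Callable[[str], Dict[str, str]]) -> bool:
        """Hypothetical plausibility check on the information referred to in
        points (a), (d) and (e) of Article 22(1)."""
        registered = official_lookup(trader.get("trade_register_number", ""))
        return (registered.get("name") == trader.get("name")
                and registered.get("address") == trader.get("address"))

    # Usage with a stubbed register; a real check would query an official source.
    stub_register = {"HRB-12345": {"name": "Example GmbH",
                                   "address": "Musterstrasse 1, Berlin"}}
    trader = {"name": "Example GmbH", "address": "Musterstrasse 1, Berlin",
              "trade_register_number": "HRB-12345"}
    print(verify_trader(trader, lambda n: stub_register.get(n, {})))  # True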
Amendment 146
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Where the online platform obtains indications, through its reasonable efforts under paragraph 2 or through Member States' consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Amendment 147
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship, including the period for redress, with the trader concerned. They shall subsequently delete the information. |
Amendment 148
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
Amendment 149
Proposal for a regulation
Article 23 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. Online platforms shall clearly state how and for what purpose they collect data from users of the service and how, to whom and for what purpose they further disseminate the data collected. |
Amendment 150
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
Online platforms that directly or indirectly display advertising on their online interfaces or parts thereof shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual consumer, in a clear, concise but meaningful, uniform and unambiguous manner and in real time: |
Amendment 151
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed is an advertisement and whether the advertisement is a result of an automated mechanism, such as an advertising exchange mechanism; |
Amendment 152
Proposal for a regulation
Article 24 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly finances the advertisement; |
Amendment 153
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) whether the advertising is based on any form of algorithmic targeting; |
Amendment 154
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) meaningful information about the parameters used to target and display the advertisement, which allows the consumer to determine why and how the advertisement is shown to him or her. That information shall also include an explanation on how to change those parameters; |
Amendment 155
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the remuneration that is given by the advertiser. |
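By way of illustration only: points (a) to (ca) of Article 24, as amended, together describe per-advertisement disclosure metadata to be shown to the recipient in real time. A Python sketch with assumed field names:

    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class AdDisclosure:
        """Hypothetical per-advertisement disclosure under Article 24 as amended."""
        is_advertisement: bool                  # point (a)
        via_automated_exchange: bool            # point (a): advertising exchange mechanism
        on_behalf_of: str                       # point (b): on whose behalf it is displayed
        financed_by: str                        # point (b): who directly or indirectly finances it
        algorithmically_targeted: bool          # point (ba)
        targeting_parameters: Dict[str, str]    # point (c): why and how the ad is shown
        how_to_change_parameters: str           # point (c), second sentence
        advertiser_remuneration: Optional[str]  # point (ca)

    disclosure = AdDisclosure(
        is_advertisement=True,
        via_automated_exchange=True,
        on_behalf_of="Example Brand",
        financed_by="Example Brand Holding",
        algorithmically_targeted=True,
        targeting_parameters={"interest": "cycling", "region": "BE"},
        how_to_change_parameters="Settings > Ad preferences",
        advertiser_remuneration=None,
    )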
Amendment 156
Proposal for a regulation
Article 24 – paragraph 1 – subpargraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
When the online platform is subleasing part of its online presentation to a third party, the platform shall ensure that all the transparency requirements set out in this Article are fulfilled. |
Amendment 157
Proposal for a regulation
Article 24 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Online platforms shall take steps to phase out collecting or processing personal data, as defined in Article 4, point (1), of Regulation (EU) 2016/679, for the purpose of targeting recipients for non-commercial and political advertising, in favour of contextual advertising. The same shall apply to targeting people based on sensitive data, or to targeting minors. This Article is without prejudice to Regulation (EU) .../.... on greater transparency in political advertising. |
Amendment 158
Proposal for a regulation
Article 24 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
For the purpose of targeting the recipients to whom advertisements for commercial purposes are displayed, online platforms shall offer users the possibility to easily opt out from micro-targeted tracking and advertisements that are based on their behavioural data or other profiling techniques, within the meaning of Article 4(4) of Regulation (EU) 2016/679. Personal data used for online advertising shall be used in accordance with the conditions for consent laid down in Article 7 of Regulation (EU) 2016/679. |
Amendment 159
Proposal for a regulation
Article 25 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. |
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or if they exercise a dominant position in a specific market sector as defined in relevant Union law. |
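By way of illustration only: the amended scope test of Article 25(1) combines a quantitative threshold with a dominance criterion. A Python sketch (the exact counting methodology is left to the delegated acts referred to in paragraph 3):

    VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the Union

    def is_very_large_online_platform(avg_monthly_active_recipients: int,
                                      dominant_in_market_sector: bool = False) -> bool:
        """Hypothetical reading of the Article 25(1) scope test as amended."""
        return (avg_monthly_active_recipients >= VLOP_THRESHOLD
                or dominant_in_market_sector)

    print(is_very_large_online_platform(50_000_000))        # True: above the threshold
    print(is_very_large_online_platform(10_000_000, True))  # True: dominance criterion
    print(is_very_large_online_platform(10_000_000))        # False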
Amendment 160
Proposal for a regulation
Article 26 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the dissemination of illegal content through their services; |
(a) details on the dissemination of illegal content through their services and impacted jurisdictions; |
Amendment 161
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in the Charter, including when these negative effects are caused by algorithmic bias; |
Amendment 162
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) malfunctioning or intentional manipulation of their service, including by means of automated exploitation of the service, with an actual or foreseeable negative effect on fundamental rights; |
Amendment 163
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) impact on the economy and the competitiveness of each Member State or the relevant Union market. |
Amendment 164
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisement. |
Amendment 165
Proposal for a regulation
Article 26 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. To ensure a high level of public control and transparency, those yearly risk assessments should be made as transparent as possible, by means of open access data, without prejudice to Directive (EU) 2016/943 (trade secrets). |
Amendment 166
Proposal for a regulation
Article 26 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. The outcome of the risk assessment and the supporting documents shall be communicated to the Board and to the Digital Services Coordinator of establishment. A summary version of the risk assessment shall be made publicly available in an easily accessible format, including for persons with disabilities. |
Amendment 167
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(aa) appropriate staffing to deal with notices and complaints, including where automatic systems are used; |
Amendment 168
Proposal for a regulation
Article 27 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk; |
(c) reinforcing the internal processes, not solely based on automated systems, or supervision of any of their activities in particular as regards detection of systemic risk; |
Amendment 169
Proposal for a regulation
Article 27 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) targeted measures aimed at reducing electricity and water consumption, heat production and CO2 emissions related to the provision of the service and to the technical infrastructure. |
Amendment 170
Proposal for a regulation
Article 27 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33; |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33 and taking note of their real or likely economic and competitive consequences, if any; |
Amendment 171
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Those reports shall be disseminated to the general public, free of charge, and, with due regard to business secrets, shall include standardised, open data describing the systemic risks, especially risks to fundamental rights and socioeconomic risks. |
Amendment 172
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those recommendations the Commission shall organise public consultations. |
Amendment 173
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: |
Amendment 174
Proposal for a regulation
Article 28 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Audits should be performed on at least: |
|
(a) the clarity, coherence and predictable enforcement of terms of service with particular regard to the applicable fundamental rights as enshrined in the Charter; |
|
(b) the completeness, methodology and consistency of the transparency reporting obligations as set out in Articles 13, 23, 24 and 30 as well as on respect for the highest possible transparency reporting standards; |
|
(c) the accuracy, predictability and clarity of the provider's follow-up, for recipients of the service and notice providers, to notices of illegal content and violations of terms of service, and the accuracy of the classification of removed information (illegal content or violation of terms and conditions); |
|
(d) internal and third-party complaint handling mechanisms; |
|
(e) interaction with trusted flaggers and independent assessment of accuracy, response times, efficiency and whether there are indications of abuse; |
|
(f) diligence with regard to verification of the traceability of traders; |
|
(g) the effectiveness of and compliance with codes of conduct; |
|
(h) data sufficiency, aiming to reduce data generation, in general, and traffic, wherever possible, including, in particular, the reduction of associated electricity consumption and resources from data centres, as referred to in Article 27; |
|
(i) readiness to participate in the crisis protocols referred to in Article 37. |
|
Audits on the subjects referred to in points (a) to (g) may be combined where the organisation performing the audits has subject-specific expertise in the relevant area. |
Amendment 175
Proposal for a regulation
Article 28 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle. |
Amendment 176
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which: |
2. Audits performed pursuant to the paragraphs above shall be performed by organisations which: |
Amendment 177
Proposal for a regulation
Article 28 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) are independent from the very large online platform concerned; |
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months; |
Amendment 178
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) have not provided an audit to the same very large online platform for more than three consecutive years. |
Amendment 179
Proposal for a regulation
Article 28 – paragraph 3 – point f
|
|
Text proposed by the Commission |
Amendment |
(f) where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. |
(f) where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance and risk-based remediation timelines, prioritising the rectification of issues that have the potential to cause the most harm to users of the service; |
Amendment 180
Proposal for a regulation
Article 28 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The audits should be submitted to the Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission immediately after their completion. Audit findings that do not include sensitive information shall be made public. The Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission may provide a public comment on the audits. |
Amendment 181
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available. Very large online platforms shall include at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679, and shall keep a log of all significant changes implemented to the recommender system. |
Amendment 182
Proposal for a regulation
Article 30 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly financed the advertisement; |
Amendment 183
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the pricing methods used to determine the size of financial compensation that will be received by the platform for the dissemination of each advertisement; |
Amendment 184
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. |
Amendment 185
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article. |
Amendment 186
Proposal for a regulation
Article 31 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Upon reasoned request, very large online platforms shall provide access to data, in particular aggregated and anonymised data, to vetted researchers, who meet the requirements set out in paragraph 4, for the purposes of scientific and academic research. Very large online platforms may deny access to the data if such access would compromise trade secrets or the security of the service. Such denial shall be duly justified. |
Amendment 187
Proposal for a regulation
Article 31 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. That shall include personal data only where it is lawfully accessible to the public. |
Amendment 188
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of knowledge in the fields related to the investigations, and shall commit to preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 189
Proposal for a regulation
Article 31 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. The delegated acts should also lay out the technical conditions needed to ensure the confidentiality and security of the information handled by the vetted researchers once they acquire access to the data. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
Amendment 190
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
|
|
Text proposed by the Commission |
Amendment |
6. Within 15 days following receipt of a request as referred to in paragraph 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons: |
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission or the vetted researchers, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons: |
Amendment 191
Proposal for a regulation
Article 31 – paragraph 7 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request. |
The Digital Services Coordinator of establishment or the Commission or the vetted researchers shall decide upon the request for amendment within 15 days and communicate the decision to the very large online platform and, where relevant, the amended request and the new time period to comply with the request. |
Amendment 192
Proposal for a regulation
Article 31 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Research conducted under that regime should always be built on open access principles, without prejudice to copyright rules, and use standardised data sets to ensure a high level of transparency and accountability with regard to the proper use of provided data. |
Amendment 193
Proposal for a regulation
Article 31 – paragraph 7 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
7b. Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing personal data and without prejudice to Directive 2016/943 (trade secrets). |
Amendment 194
Proposal for a regulation
Article 33 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every three months, in a standardised, machine-readable and accessible format, including for persons with disabilities. |
Amendment 195
Proposal for a regulation
Article 33 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 33a |
|
Interoperability |
|
1. Very large online platforms shall make at least ancillary services and, where possible, core functionalities of their services interoperable with other online platforms to enable cross-platform communication to the extent technically possible. That obligation shall not limit, hinder or delay their ability to solve security issues and shall be in compliance with all their responsibilities, especially regarding fundamental rights, protection of privacy and data, intellectual property rights, security and safety. |
|
2. Very large online platforms shall publicly document all application programming interfaces they make available and update them continuously. |
|
3. Very large online platforms shall take steps towards enabling third parties to audit their recommender systems and make operational recommendations on how to better prevent the dissemination of illegal content. Such audits shall have the utmost regard to security and privacy of users. Access to third party recommender systems shall be temporarily limited in cases of demonstrable abuse by the third party provider or when justified by an immediate requirement to address technical problems such as serious security vulnerability. |
|
4. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraphs 1 and 2, taking into account not only the individual cases of different very large online providers, but also the diversity and complexity of the market as a whole. |
Amendment 196
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) interoperability of the core functions of very large online platforms pursuant to Article 33a. |
Amendment 197
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
Amendment 198
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the different types of data that can be used. |
Amendment 199
Proposal for a regulation
Article 37 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security, the economy or public health. The Commission shall be responsible for drafting, implementing and scrutinising the crisis protocols and shall annually report on them to the European Parliament. |
Amendment 200
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures: |
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, especially those exercising a dominant position, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures: |
Amendment 201
Proposal for a regulation
Article 37 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. All crisis protocols are to be subjected to scrutiny by the appropriate committees of the European Parliament. |
Amendment 202
Proposal for a regulation
Article 37 – paragraph 5 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
5b. Readiness to participate in already existing crisis protocols should be assessed in the risk assessment outlined in Article 26. |
Amendment 203
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Member States shall designate the status of Digital Services Coordinator based on the following criteria: |
|
(a) the authority has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; |
|
(b) it represents collective interests and is independent from any online platform; |
|
(c) it has the capacity to carry out its activities in a timely, diligent and objective manner. |
Amendment 204
Proposal for a regulation
Article 39 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law. |
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law. Digital Services Coordinators shall draw up a report and publish it in the information sharing system pursuant to Article 67 and present it to the European Parliament. |
Amendment 205
Proposal for a regulation
Article 44 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board. |
1. Digital Services Coordinators shall draw up annual reports on their activities under this Regulation. They shall make the annual reports available to the public in a standardised, machine-readable and accessible format, including for persons with disabilities, and shall communicate them to the Commission and to the Board. |
Amendment 206
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) an assessment of the interpretation of the Country of Origin principle in the supervisory and enforcement activities of the Digital Services Coordinators, especially with regard to Article 45. |
Amendment 207
Proposal for a regulation
Article 44 – paragraph 2 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
These reports shall give due consideration to highly sensitive information and business secrets. |
Amendment 208
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where a Digital Services Coordinator has reasons to suspect that a provider of an intermediary service, not under the jurisdiction of the Member State concerned, infringed this Regulation, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. |
Where a Digital Services Coordinator has reasons to suspect that a provider of an intermediary service, not under the jurisdiction of the Member State concerned, infringed this Regulation, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. The Digital Services Coordinator of establishment shall acknowledge the receipt of the request and confirm that it will assess the matter and take the necessary investigatory and enforcement measures within 10 working days. |
|
Where the Digital Services Coordinator of establishment initiates proceedings, it shall share with the requesting Digital Services Coordinator all the information gathered during the proceedings related to the case. |
Amendment 209
Proposal for a regulation
Article 45 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The Digital Services Coordinator of establishment shall make available to any Digital Services Coordinator in the territory where the service provider operates the data collected for the purpose of the supervision of that provider and which relates to the territory of that Digital Services Coordinator. |
Amendment 210
Proposal for a regulation
Article 45 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. |
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. |
Amendment 211
Proposal for a regulation
Article 45 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. |
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation, or where appropriate the reasons why it considers that the case should not be investigated. |
Amendment 212
Proposal for a regulation
Article 47 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established. |
1. An independent advisory and coordination group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established. |
Amendment 213
Proposal for a regulation
Article 47 – paragraph 2 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) facilitating communication between multiple Digital Services Coordinators and creating a safe space for open information exchange. |
Amendment 214
Proposal for a regulation
Article 48 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
3. The Board shall be chaired by a president elected from among its members. The chair of the Board shall not be allowed to lead any national regulatory office in a Member State at the same time. The mandate of the chair shall be limited to a maximum of three years, renewable once. The chair of the Board shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
Amendment 215
Proposal for a regulation
Article 48 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Board shall adopt its rules of procedure, following the consent of the Commission. |
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members and shall organise its own operational arrangements. |
Amendment 216
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where there are reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. |
Amendment 217
Proposal for a regulation
Article 51 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 51 a |
|
Requirements for the Commission |
|
1. The Commission shall perform its tasks under this Regulation in an impartial, transparent and timely manner. The Commission shall ensure that its units given responsibility for this Regulation have the adequate technical, financial and human resources to carry out their tasks. |
|
2. When carrying out its tasks and exercising its powers in accordance with this Regulation, the Commission shall act with complete independence. It shall remain free from any direct or indirect external influence and shall neither seek nor take instructions from any other public authority or any private party. |
Amendment 218
Proposal for a regulation
Article 52 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Upon request, the Commission shall transmit the information obtained to the Digital Services Coordinator of establishment and to the Board. |
Amendment 219
Proposal for a regulation
Article 55 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. A decision under paragraph 1 shall apply for a specified period of time and may be renewed in so far as this is necessary and appropriate. |
2. A decision under paragraph 1 shall apply for a specified period of time and may be renewed in so far as this is necessary and appropriate. When adopting such a decision, the Commission shall immediately inform the Board and the Digital Services Coordinator of establishment. |
Amendment 220
Proposal for a regulation
Article 57 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. |
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms, without prejudice to Directive (EU) 2016/943 on trade secrets. |
Amendment 221
Proposal for a regulation
Article 58 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings. |
2. Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the very large online platform concerned, to the Board and to the Digital Services Coordinator of establishment. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings. |
Amendment 222
Proposal for a regulation
Article 58 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. |
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision and inform the Board and the Digital Services Coordinator of establishment. |
Amendment 223
Proposal for a regulation
Article 67 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Relevant committees of the European Parliament shall be given access to that information sharing system to provide democratic oversight. |
Amendment 224
Proposal for a regulation
Article 73 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. |
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. |
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
ECON 8.2.2021 |
|||
Rapporteur for the opinion Date appointed |
Mikuláš Peksa 10.5.2021 |
|||
Discussed in committee |
1.9.2021 |
|
|
|
Date adopted |
26.10.2021 |
|
|
|
Result of final vote |
+: 52, –: 5, 0: 3 |
||
Members present for the final vote |
Gerolf Annemans, Gunnar Beck, Marek Belka, Isabel Benjumea Benjumea, Lars Patrick Berg, Stefan Berger, Gilles Boyer, Engin Eroglu, Markus Ferber, Jonás Fernández, Raffaele Fitto, Frances Fitzgerald, Luis Garicano, Sven Giegold, Valentino Grant, Claude Gruffat, José Gusmão, Enikő Győri, Eero Heinäluoma, Michiel Hoogeveen, Danuta Maria Hübner, Stasys Jakeliūnas, France Jamet, Othmar Karas, Billy Kelleher, Ondřej Kovařík, Georgios Kyrtsos, Aurore Lalucq, Aušra Maldeikienė, Pedro Marques, Costas Mavrides, Jörg Meuthen, Csaba Molnár, Siegfried Mureşan, Caroline Nagtegaal, Luděk Niedermayer, Lefteris Nikolaou-Alavanos, Piernicola Pedicini, Lídia Pereira, Kira Marie Peter-Hansen, Sirpa Pietikäinen, Dragoş Pîslaru, Evelyn Regner, Antonio Maria Rinaldi, Alfred Sant, Martin Schirdewan, Joachim Schuster, Ralf Seekatz, Pedro Silva Pereira, Paul Tang, Irene Tinagli, Ernest Urtasun, Inese Vaidere, Johan Van Overtveldt, Stéphanie Yon-Courtin, Marco Zanni, Roberts Zīle |
|||
Substitutes present for the final vote |
Janusz Lewandowski, Mikuláš Peksa, Mick Wallace |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
52 |
+ |
ECR |
Lars Patrick Berg, Raffaele Fitto, Michiel Hoogeveen, Johan Van Overtveldt, Roberts Zīle |
ID |
Valentino Grant, Antonio Maria Rinaldi, Marco Zanni |
NI |
Enikő Győri |
PPE |
Isabel Benjumea Benjumea, Stefan Berger, Markus Ferber, Frances Fitzgerald, Danuta Maria Hübner, Othmar Karas, Georgios Kyrtsos, Janusz Lewandowski, Aušra Maldeikienė, Siegfried Mureşan, Luděk Niedermayer, Lídia Pereira, Sirpa Pietikäinen, Ralf Seekatz, Inese Vaidere |
Renew |
Gilles Boyer, Engin Eroglu, Luis Garicano, Billy Kelleher, Ondřej Kovařík, Caroline Nagtegaal, Dragoş Pîslaru, Stéphanie Yon-Courtin |
S&D |
Marek Belka, Jonás Fernández, Eero Heinäluoma, Aurore Lalucq, Pedro Marques, Costas Mavrides, Csaba Molnár, Evelyn Regner, Alfred Sant, Joachim Schuster, Pedro Silva Pereira, Paul Tang, Irene Tinagli |
Verts/ALE |
Sven Giegold, Claude Gruffat, Stasys Jakeliūnas, Piernicola Pedicini, Mikuláš Peksa, Kira Marie Peter-Hansen, Ernest Urtasun |
5 |
- |
ID |
Gunnar Beck, Jörg Meuthen |
NI |
Lefteris Nikolaou-Alavanos |
The Left |
José Gusmão, Mick Wallace |
3 |
0 |
ID |
Gerolf Annemans, France Jamet |
The Left |
Martin Schirdewan |
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON TRANSPORT AND TOURISM (30.9.2021)
for the Committee on Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Roman Haider
SHORT JUSTIFICATION
Background
The basic framework regulating the provision of digital services in the internal market is defined in the E-Commerce Directive dating from 2000. The goal of that directive is to allow borderless access to digital services across the EU and to harmonise the core aspects of such services. Since then, the nature, scale and importance of digital services for the economy and society have changed dramatically. Business models that emerged with large online platforms have transformed the landscape of digital services in the EU. These services are now used by a majority of EU citizens on a daily basis and are based on multi-sided business models. In the accommodation sector, short-term rental online platforms have reshaped the market in recent years.
Over the years, an important area of legal uncertainty for digital service providers has been the scope of the definition of information society services in the EU acquis and, in particular, concerning the collaborative economy online platforms in the transport and accommodation sector. The line between the online services, offered at a distance, and the underlying services, usually offered offline, has not always been clear. The consequences of the separation of these services are significant given that online services may fall within the scope of the E-Commerce Directive while the underlying services fall within sector-specific rules or horizontal EU legal acts. Operators have often mentioned such legal uncertainty as a source of concern for their growth.
The relevant provisions of the E-Commerce Directive have been interpreted by the Court of Justice of the EU in recent years: UberPop (C-434/15) was found not to be an information society service, whereas Airbnb (C-390/18) and Star Taxi App (C-62/19) were.
The Commission proposal follows the interpretation by the Court of Justice on the scope of the definition of information society services. According to the new proposal, the definition of information society service does not apply if the intermediation service forms an integral part of an overall service whose main component constitutes a service coming under another legal qualification (transport).
The Rapporteur’s position
Digital services have become an important backbone of the digital economy and have contributed profoundly to social and economic change in the EU. At the same time, the use of those services has also become the source of new risks and challenges. According to the Rapporteur, the Covid-19 pandemic has inevitably made the need for reform even more urgent, as business transactions of all kinds are increasingly moving online. Moreover, economic recovery in many crisis-hit economic sectors is dependent on online business - including tourism, which has particularly suffered from the effects of the pandemic.
The Rapporteur points out that the key principles set out in the E-Commerce Directive are still valid today and stresses that further regulation of this area must not be a downgrade for platforms or users.
The provision of illegal services, such as non-compliant accommodation services on short-term rental platforms, has expanded significantly in recent years. Authorities lack the information and technical capability needed to inspect technically complex digital services. This concerns both the supervision of the digital service and, in the case of online platforms in particular, the increasing challenges of supervising the underlying services they intermediate, such as accommodation or transport services. Member States, and ultimately national courts, have also applied the existing rules in a divergent manner.
In view of the above-mentioned shortcomings, the Commission proposal provides, in the view of the Rapporteur, a horizontal legal framework to ensure legal clarity and reduce legal fragmentation for the cross-border provision of digital services. However, the Rapporteur considers that the Commission proposal lacks ambition when tackling the problem of illegal short-term rentals. While the powers for national and local authorities to act against illegal content (illegal short-term rentals) and to provide information are a step in the right direction, the DSA should clarify, in the view of the Rapporteur, platforms’ responsibility to ensure compliance with registration schemes and other applicable rules in place across the EU, for instance when short-term rentals lack a registration number or have exceeded applicable night caps.
The Rapporteur considers that accommodation providers should remain responsible for complying with all applicable regulations, public authorities should remain responsible for enforcing those regulations, and platforms should be responsible for ensuring that only accommodations with a validated registration number can be rented out. In this regard, the Rapporteur considers that a code of conduct for short-term rentals could clarify Member States’ ability to take action, as well as the scope of the duty of care incumbent upon platforms.
According to the Rapporteur, only free market access can lead to economic recovery, and such access should not necessarily be seen as a disadvantage for SMEs vis-a-vis very large online platforms (VLOPs). In this context, the Rapporteur considers that the Regulation should not impose red tape on SMEs that could restrict their market access or even make it impossible.
The Rapporteur highlights the following considerations:
- The overriding principle of the proposed Regulation must be the protection of users, which must take precedence over the protection of platforms;
- the Rapporteur sees the definition of advertising as too broad, which could lead to ambiguities and restrictions in the accommodation sector, as advertising is very specific in many sectors;
- the DSA must respect the principle of free competition and cannot be an instrument that hampers market access to market players because of their size, market share or business concept;
- the Rapporteur stresses that this Regulation does not affect the right of the Member States to regulate the services provided to users by the platforms.
AMENDMENTS
The Committee on Transport and Tourism calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
deleted |
Amendment 2
Proposal for a regulation
Recital 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(2a) Complex regulatory requirements both at Union and Member State level have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially small and medium-sized companies. |
Amendment 3
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible, transparent and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment, for ensuring legal certainty and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
Amendment 4
Proposal for a regulation
Recital 4
|
|
Text proposed by the Commission |
Amendment |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and create and maintain a safer environment and legal certainty for platforms, users and public authorities, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. |
Amendment 5
Proposal for a regulation
Recital 5
|
|
Text proposed by the Commission |
Amendment |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. In this regard, the importance and the particularities of the transport and tourism online platform market, which require a sector-specific approach and special attention, should be stressed. |
__________________ |
__________________ |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
Amendment 6
Proposal for a regulation
Recital 6
|
|
Text proposed by the Commission |
Amendment |
(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union. |
(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport of persons and goods, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union. |
Amendment 7
Proposal for a regulation
Recital 7
|
|
Text proposed by the Commission |
Amendment |
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in the Union, as evidenced by a substantial connection to the Union. |
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in the Union, as evidenced by a substantial connection to the Union. Considering that the digital economy, particularly platforms, can have a significant impact on long-established regulated business models in many strategic sectors such as transportation and hospitality, the Commission should foster a level-playing field between online platforms and traditional enterprises operating in the transport and tourism sectors. |
Amendment 8
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities and should be based on the general idea that what is illegal offline should also be illegal online. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images or videos, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law, offering services that require a licence or approval from a competent national authority without having the appropriate credentials and the listing of illegal short-term holiday rentals. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
Amendment 9
Proposal for a regulation
Recital 23
|
|
Text proposed by the Commission |
Amendment |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
(23) In order to ensure the effective and reliable protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present relevant and accurate information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
Amendment 10
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should target illegal practices and aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
Amendment 11
Proposal for a regulation
Recital 37
|
|
Text proposed by the Commission |
Amendment |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. |
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Nothing in this Regulation prohibits the providers of intermediary services from establishing collective representation or obtaining the services of a legal representative by other means, including contractual ones, provided that the legal representative can fulfil the role assigned to it in this Regulation. Providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and that have been unsuccessful in obtaining the services of a legal representative after reasonable effort, should be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation. |
Amendment 12
Proposal for a regulation
Recital 43
|
|
Text proposed by the Commission |
Amendment |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
(43) To avoid disproportionate burdens, the additional obligations and administrative requirements imposed on online platforms under this Regulation should not apply to micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. This Regulation should highlight the importance of collaborative economy platforms in the transport and tourism sectors, on which services are provided by both individuals and professionals, and avoid imposing disproportionate information obligations and unnecessary administrative burden on peer-to-peer providers of services. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro-, small and medium-sized enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
__________________ |
__________________ |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 13
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, depending on the severity of the illegal activity, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
_________________ |
_________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
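By way of illustration only (the Regulation prescribes no data model), the priority treatment of trusted-flagger notices described in recital 46, as amended to depend on the severity of the illegal activity, could be pictured as a two-level priority queue: trusted-flagger notices first, ordered by severity within each group. The notice fields, the numeric severity scale and the use of Python's heapq module are assumptions of this sketch, not anything the text requires.

```python
import heapq
from dataclasses import dataclass, field

# Illustrative severity scale; the Regulation prescribes no such numbers.
SEVERITY = {"terrorist content": 0, "child sexual abuse material": 0,
            "illegal hate speech": 1, "counterfeit products": 2}

@dataclass(order=True)
class Notice:
    sort_key: tuple = field(init=False)
    submitter: str = field(compare=False)
    trusted_flagger: bool = field(compare=False)
    category: str = field(compare=False)

    def __post_init__(self) -> None:
        # Trusted-flagger notices are treated with priority; within each
        # group, more severe categories are processed first.
        self.sort_key = (0 if self.trusted_flagger else 1,
                         SEVERITY.get(self.category, 3))

queue: list[Notice] = []
for notice in (Notice("individual", False, "counterfeit products"),
               Notice("INHOPE hotline", True, "child sexual abuse material"),
               Notice("rights-holder organisation", True, "counterfeit products")):
    heapq.heappush(queue, notice)

while queue:
    n = heapq.heappop(queue)
    print(f"processing notice from {n.submitter}: {n.category}")
```

Note that, consistent with the recital, every notice is still processed; priority only affects the order, not whether non-trusted notices are handled.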
Amendment 14
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders and the products and services they offer are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products or services. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should verify such information and store it in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
Amendment 15
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45. As an example, freely accessible online databases of short-term holiday rentals compliant with national and local requirements could be set up and maintained by relevant authorities. Online platforms listing such properties could check whether short-term holiday rentals can be legally rented out. Additional verification means for platforms could include requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purposes of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. |
_________________ |
_________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
Justification
The DSA can play a major role in tackling the sale of illegal short-term rentals (STRs).
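By way of illustration only, the "reasonable efforts" verification described in the amended recital 50 could take the shape sketched below: a plausibility check on a registration number followed by a lookup in a freely accessible rentals database. The registry URL, the JSON response format and the registration-number pattern are all invented for this sketch; the databases actually referred to in the recital (national trade registers, VIES, possible short-term rental registries) each have their own interfaces.

```python
import json
import re
import urllib.request

# Hypothetical registry endpoint, invented for this sketch; real databases
# referred to in the recital expose their own, different interfaces.
REGISTRY_URL = "https://registry.example.eu/str/{number}"

def looks_like_registration_number(number: str) -> bool:
    """Cheap plausibility check before any network call (made-up format)."""
    return re.fullmatch(r"[A-Z]{2}-\d{6}", number) is not None

def rental_is_registered(number: str) -> bool:
    """'Reasonable effort' lookup: consult the (hypothetical) public
    short-term rental database and report whether the listing may be
    lawfully rented out. A failed lookup is a reason for further checks,
    not proof of illegality, in keeping with the recital."""
    if not looks_like_registration_number(number):
        return False
    with urllib.request.urlopen(REGISTRY_URL.format(number=number)) as resp:
        record = json.load(resp)
    return record.get("status") == "compliant"
```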
Amendment 16
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
Justification
This obligation shall not place too heavy a burden on small and medium-sized enterprises, which rely heavily on targeted advertising as their main way to attract potential customers.
Amendment 17
Proposal for a regulation
Recital 54 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(54a) In this context, it is important to note that platforms in specific sectors, such as short-term rental accommodation platforms, even if classified as very large online platforms, may pose only a low risk to the protection of users' fundamental rights. |
Amendment 18
Proposal for a regulation
Recital 55
|
|
Text proposed by the Commission |
Amendment |
(55) In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base to be able to timely identify the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation. |
(55) In view of the network effects characterising the platform, gig and on-demand economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator could request more frequent reporting from the platform on the user base to be able to timely identify the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation, provided there is a legitimate reason. |
Amendment 19
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade and transport and tourism services. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures. |
Amendment 20
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts, civil society organisations and consumer protection associations. |
Amendment 21
Proposal for a regulation
Recital 61
|
|
Text proposed by the Commission |
Amendment |
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
(61) The audit report should be substantiated, so as to give a meaningful, factual and objective account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. |
Amendment 22
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, giving the latter a choice regarding the purchase of services and products. |
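For illustration only, the requirement in recital 62 that recipients enjoy alternative options for the main parameters, including options not based on profiling, can be pictured as a recipient-selectable choice between ranking functions, one of which ignores the recipient's profile entirely. The item fields, scoring rules and option names are assumptions of this sketch, not anything the Regulation specifies.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Item:
    title: str
    published: datetime
    engagement: float   # platform-wide popularity signal
    affinity: float     # profiling-based signal for this recipient

def rank_personalised(items: list[Item]) -> list[Item]:
    # Default main parameters: ordering influenced by the recipient's profile.
    return sorted(items, key=lambda i: i.affinity * i.engagement, reverse=True)

def rank_most_recent(items: list[Item]) -> list[Item]:
    # Alternative option not based on profiling, as the recital requires.
    return sorted(items, key=lambda i: i.published, reverse=True)

# The recipient's choice of main parameters selects the ranking function.
RECOMMENDER_OPTIONS = {
    "personalised": rank_personalised,
    "most recent (no profiling)": rank_most_recent,
}

def build_feed(items: list[Item], chosen_option: str) -> list[Item]:
    return RECOMMENDER_OPTIONS[chosen_option](items)
```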
Amendment 23
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision and enforcement on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation, fundamental rights and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. Nevertheless, these obligations should not apply to transaction-based platforms, such as online travel agents (OTAs), since they differ from advertising-based platforms and they already provide the requested information, unless the remuneration is related to a specific placement. |
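A minimal sketch of the advertisement repository described in recital 63, assuming nothing beyond what the recital itself lists: each entry ties the advertisement's content to related data on the advertiser and on the delivery of the advertisement, including the main targeting parameters where targeted advertising is concerned. All field names are invented for the example.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)
class AdRepositoryEntry:
    # Fields mirror the recital: the content of the advertisement,
    # related data on the advertiser, and data on its delivery.
    content: str
    advertiser: str
    shown_from: date
    shown_until: date
    # Main targeting parameters, populated where targeted advertising
    # is concerned; the key/value scheme is invented for this sketch.
    targeting_parameters: dict[str, str] = field(default_factory=dict)

repository: list[AdRepositoryEntry] = [
    AdRepositoryEntry(
        content="Sample advertisement text",
        advertiser="Example Brand SA",
        shown_from=date(2021, 1, 1),
        shown_until=date(2021, 1, 31),
        targeting_parameters={"language": "fr", "interest": "travel"},
    )
]
```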
Amendment 24
Proposal for a regulation
Recital 71
|
|
Text proposed by the Commission |
Amendment |
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation, where the need arises for rapid dissemination of reliable information or where contingency plans may be needed for specific sectors that could be seriously hit by those extraordinary circumstances. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be developed transparently, with due regard to users and their rights. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. |
Amendment 25
Proposal for a regulation
Recital 81
|
|
Text proposed by the Commission |
Amendment |
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. |
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations and parties with a legitimate interest should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. |
Amendment 26
Proposal for a regulation
Recital 106 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(106a) In order to enhance the relationship between stakeholders and local authorities in the short-term rental market and mobility services, this Regulation should aim to ensure legal certainty and clarity in this market by creating a governance framework formalising the cooperation between short-term rental and mobility platforms and national, regional and local authorities, aiming especially to share best practices and thus facilitate their daily business. |
Amendment 27
Proposal for a regulation
Article 1 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services; |
(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services, taking into account the sector-specific business model; |
Amendment 28
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) facilitate innovation, support digital transition, encourage economic growth and create a level playing field for digital services within the internal market. |
Amendment 29
Proposal for a regulation
Article 2 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession; |
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession; |
Amendment 30
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
|
|
Text proposed by the Commission |
Amendment |
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service; |
— a ‘hosting’ information society service that consists of the storage of digital information provided by, and at the request of, a recipient of the service, unless that activity is an ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service; |
Amendment 31
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ha) ‘short-term rental’ means a furnished accommodation for residential use that is repeatedly let for short periods against consideration, including on a non-professional basis, to a transient clientele which does not take up residence there, and that does not constitute the lessor's main residence. |
Amendment 32
Proposal for a regulation
Article 2 – paragraph 1 – point n
|
|
Text proposed by the Commission |
Amendment |
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; |
(n) ‘advertisement’ means information designed to directly or indirectly promote or rank information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information and that is not contingent on the completion of a given transaction; |
Amendment 33
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank or prioritise in its online interface or parts thereof specific information, products or services to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
Amendment 34
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘online marketplace’ means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader which allows consumers to conclude distance contracts with other traders or consumers. |
Amendment 35
Proposal for a regulation
Article 5 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content. |
(b) upon obtaining such knowledge or awareness, acts immediately to remove or to disable access to the illegal content. |
Amendment 36
Proposal for a regulation
Article 5 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider. |
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, predominant influence or control of the provider. |
Amendment 37
Proposal for a regulation
Article 5 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Providers of online marketplace services shall not be able to benefit from the liability exemption under Article 5(1), where such an online platform presents the specific item of information, product or service or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
Amendment 38
Proposal for a regulation
Article 6 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative investigation measures for detecting, identifying and removing, or disabling of access to, illegal content in order to comply with the requirements of Union law, including those set out in this Regulation. |
Amendment 39
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers, unless the information society service plays an active role in approving, modifying or editing the information issued by the recipient of the service. |
Amendment 40
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
|
|
Text proposed by the Commission |
Amendment |
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed; |
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of law infringed; |
Amendment 41
Proposal for a regulation
Article 9 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority of issuing the order of its receipt and the effect given to the order. |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable law, in conformity with Union law, inform without undue delay the authority of issuing the order of its receipt and the effect given to the order. |
Amendment 42
Proposal for a regulation
Article 10 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services shall provide the Digital Services Coordinator in the Member State where they are established with their contact details. |
Amendment 43
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous, straightforward and comprehensible language and shall be publicly available in an easily accessible format. |
Amendment 44
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and that have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation. |
Amendment 45
Proposal for a regulation
Article 13 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 46
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned, if the illegality of the specific item of information is indicated in a sufficiently precise and adequately substantiated manner, based on the assessment of the provider. |
Amendment 47
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without any delay that cannot be properly justified, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
Amendment 48
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall, where the information provided is sufficiently clear, process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, within a maximum of 30 days and in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
Amendment 49
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraphs 2, 4 and 5 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 50
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after such removing or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
Amendment 51
Proposal for a regulation
Article 15 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the facts and circumstances relied on in taking the decision, including where relevant whether the decision was taken pursuant to a notice submitted in accordance with Article 14; |
(b) the facts and circumstances relied on in taking the decision; |
Amendment 52
Proposal for a regulation
Article 15 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Paragraphs 2, 3 and 4 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, those paragraphs shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 53
Proposal for a regulation
Article 16 – title
|
|
Text proposed by the Commission |
Amendment |
Exclusion for micro and small enterprises |
Exclusion for micro, small and medium-sized enterprises |
Amendment 54
Proposal for a regulation
Article 16 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC. This Section shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof. |
Amendment 55
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove, restrict or disable access to the information; |
Amendment 56
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) any other decision that affects the availability, visibility or accessibility of the content and/or of the recipient's account or the recipient's access to relevant platform services and features. |
Amendment 57
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly, offered in the official national language and in English, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
Amendment 58
Proposal for a regulation
Article 17 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
3. Online platforms shall handle complaints submitted through their internal complaint-handling system within a maximum of 30 days and in a diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. |
Amendment 59
Proposal for a regulation
Article 17 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. |
4. Online platforms shall inform complainants without any delay that cannot be properly justified of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. |
Amendment 60
Proposal for a regulation
Article 19 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. |
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay, depending on the severity of the illegal activity. The obligation to prioritise notices submitted by trusted flaggers shall be without prejudice to other notices, when the trustworthiness, severity and urgency of these notices can be considered exceptional. |
Amendment 61
Proposal for a regulation
Article 19 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Notices submitted by local, regional and national authorities should be processed and decided upon with the same degree of priority and timeliness as notices provided by entities which have been awarded a trusted flagger status. |
Amendment 62
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall suspend, for a reasonable period of time, not exceeding 30 days, and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
Amendment 63
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Online platforms shall suspend, for a reasonable period of time, not exceeding 60 days, and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
Amendment 64
Proposal for a regulation
Article 20 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Platforms shall ensure that the personal data of consumers are treated pursuant to Regulation (EU) 2016/679. |
Amendment 65
Proposal for a regulation
Article 22 – title
|
|
Text proposed by the Commission |
Amendment |
Traceability of traders |
Traceability of traders, products and services |
Amendment 66
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: |
1. Where an online platform allows consumers to conclude distance contracts with professional traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained, and has made reasonable efforts to verify the completeness and reliability of, the following information: |
Amendment 67
Proposal for a regulation
Article 22 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the name, address, telephone number and electronic mail address of the trader; |
(a) the name, address, telephone number and electronic mail address of the trader and, where required under Union or national law, of the authorised representative of the trader; |
Amendment 68
Proposal for a regulation
Article 22 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the bank account details of the trader, where the trader is a natural person; |
(c) the bank account details of the trader; |
Amendment 69
Proposal for a regulation
Article 22 – paragraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) where the trader is registered in a trade register or similar public register, the trade register in which the trader is registered and its registration number or equivalent means of identification in that register; |
(e) where the trader is subject to registration obligations in a trade register or similar public register, the register in which the trader is registered and its registration number or equivalent means of identification in that register, in particular by automated means, without engaging in general monitoring in accordance with Article 7; |
Amendment 70
Proposal for a regulation
Article 22 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) information and documentation about products and services required by Union or national law or by relevant technical standards and specifications, including product safety requirements. |
Amendment 71
Proposal for a regulation
Article 22 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. The online marketplace provider shall also make its best efforts to allow the offering of products or services only by traders that comply with the applicable rules of Union and national law. |
Amendment 72
Proposal for a regulation
Article 22 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. |
2. The online platform shall, upon receiving that information, and prior to allowing traders to use its services, make reasonable efforts to verify whether the information referred to in paragraph 1 is reliable, complete and up-to-date through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources in a way that does not result in costly active fact-finding exercises. |
Amendment 73
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. |
Where the online platform obtains indications that any item of information referred to in paragraph 1 or a visual representation or description thereof obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in a swift manner to ensure that all information is reliable, accurate, up-to-date and complete, without delay or within the time period set by Union and national law. |
Amendment 74
Proposal for a regulation
Article 22 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information. |
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information within a reasonable time. |
Amendment 75
Proposal for a regulation
Article 22 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Articles 8 and 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. |
Amendment 76
Proposal for a regulation
Article 22 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. The online platform shall not subvert or impair consumers’ decision-making or choice via the structure, function or manner of operation of their online interface. |
Amendment 77
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: |
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise but meaningful, uniform and unambiguous manner and in real time: |
Amendment 78
Proposal for a regulation
Article 24 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) that the information displayed is an advertisement; |
(a) that the information displayed is an advertisement, including when it is the result of an automated mechanism; |
Amendment 79
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) information about the main parameters used to target and determine the recipient to whom the advertisement is displayed. |
Amendment 80
Proposal for a regulation
Article 24 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24 a |
|
Recommender systems for online platforms |
|
1. Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available. Online platforms shall ensure consumers are not profiled by default, unless consumers genuinely opt in, in line with the requirements established under Regulation (EU) 2016/679. |
|
2. Online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. However, the online platform shall not subvert or impair consumers’ autonomy, decision-making, or choice via the structure, function or manner of operation of their online interface or a part thereof. |
Amendment 81
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission. |
The Digital Services Coordinator of establishment shall regularly verify, at least every two months, whether an online platform would qualify as a very large online platform in line with paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission. |
Amendment 82
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication. |
The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list regularly updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication. |
Amendment 83
Proposal for a regulation
Article 25 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 25a |
|
The right to an account |
|
1. Very large online platforms shall be obliged to provide the ability for every user with legal intentions to create an account. The user shall be able to verify their account. |
Amendment 84
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. Very large online platforms that use recommender systems, or other systems to prioritise content, including reducing the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
Amendment 85
Proposal for a regulation
Article 29 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. However, the online platform shall not subvert or impair consumers’ autonomy, decision-making or choice via the structure, function or manner of operation of their online interface or a part thereof. |
Amendment 86
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests or conflicts of interest, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 87
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on fair competition and the protection of personal data. |
Amendment 88
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
2. The Commission shall aim to ensure that the codes of conduct pursue an effective and accurate transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on fair competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: |
Amendment 89
Proposal for a regulation
Article 36 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 36a |
|
Codes of conduct for the accommodation sector |
|
1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms, relevant service providers in the accommodation sector and relevant authorities to contribute to addressing illegal short-term rentals and to facilitate the enforcement of registration and authorisation schemes. |
Amendment 90
Proposal for a regulation
Article 37 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. |
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. The Commission is encouraged to develop, in cooperation with online platforms, national and European consumer organisations and civil society organisations and all related stakeholders, contingency plans for the tourism sector for future crises, including standards for force majeure-based cancellations, travel warnings and information flows. |
Amendment 91
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted. |
Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted. The Commission, after consulting the Member States, shall provide guidance to ensure a consistent approach on how national, local and regional authorities should relate to their Digital Services Coordinators. |
Amendment 92
Proposal for a regulation
Article 39 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have the necessary technical, financial and human resources to carry out their tasks. Such resources could include, but not be limited to, access to training and regular exchanges with the service provider to understand the specificities of their business model. |
Amendment 93
Proposal for a regulation
Article 39 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party. |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall not take instructions from any other public authority or any private party. Digital Services Coordinators shall be able to seek information from a public authority or private party where they deem it necessary to carry out their role and powers while maintaining their independence and neutrality. |
Justification
Any DSC should be able to seek information from a public authority or private party if it deems it necessary for matters outside of its competence and knowledge.
Amendment 94
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Recipients of the service, representative organisations and other parties with a legitimate interest shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority. |
Amendment 95
Proposal for a regulation
Article 48 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
3. The Board shall be chaired and guided by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. |
Justification
The governance of the DSA is a key determinant to ensure harmonization and consistent application of the rules, and the role of the Commission as the defender and enforcer of the (digital) Single Market within this must be properly recognized and emphasized.
Amendment 96
Proposal for a regulation
Article 48 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation. |
4. The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation. The Board shall respect and take into account in its decisions the guidance and analytical support of the Commission, as guardian of the Digital Single Market. |
Justification
The governance of the DSA is a key determinant to ensure harmonization and consistent application of the rules, and the role of the Commission as the defender and enforcer of the Digital Single Market within this must be properly recognized and emphasized.
Amendment 97
Proposal for a regulation
Article 49 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation; |
(c) as and when requested by a Digital Services Coordinator, issue non-legally binding opinions and recommendations, in discussion with all involved stakeholders, which serve as a way to remedy the problem and ensure consistent enforcement of this Regulation. |
Amendment 98
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) be responsible for ensuring that the conditions for country-of-origin derogation are interpreted strictly and narrowly to ensure consistent application of this Regulation. |
Amendment 99
Proposal for a regulation
Article 50 – paragraph 4 – introductory part
|
|
Text proposed by the Commission |
Amendment |
4. The Digital Services Coordinator of establishment shall communicate to the Commission, the Board and the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable: |
4. The competent Digital Services Coordinator shall communicate to the Commission, the Board and the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable: |
Amendment 100
Proposal for a regulation
Article 51 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The Commission shall carry out the tasks described under this Section in full independence from undue political or corporate interference. |
Amendment 101
Proposal for a regulation
Article 52 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. |
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a period of two months. |
Amendment 102
Proposal for a regulation
Article 52 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information. |
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect, false or misleading information. |
Amendment 103
Proposal for a regulation
Article 52 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading. |
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect, false or misleading. |
Amendment 104
Proposal for a regulation
Article 54 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. In order to carry out the tasks assigned to it under this Section, the Commission may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1). |
1. In order to carry out the tasks assigned to it under this Section, the competent authorities of the Member State may, at the request of the Commission, conduct on-site inspections within the meaning of Article 12 of Regulation (EU) No 139/2004 at the premises of the very large online platform concerned or other person referred to in Article 52(1). |
Amendment 105
Proposal for a regulation
Article 54 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. During on-site inspections the Commission and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). |
3. During on-site inspections the competent authorities of the Member States may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). |
Amendment 106
Proposal for a regulation
Article 54 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union. |
4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered at the request of the Commission. The request shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union. |
Amendment 107
Proposal for a regulation
Article 59 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) supply incorrect, incomplete or misleading information in response to a request pursuant to Article 52 or, when the information is requested by decision, fail to reply to the request within the set time period; |
(a) supply incorrect, incomplete, false or misleading information in response to a request pursuant to Article 52 or, when the information is requested by decision, fail to reply to the request within the set time period; |
Amendment 108
Proposal for a regulation
Article 69 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation]. |
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for five years from [date of expected adoption of the Regulation]. |
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
TRAN 8.2.2021 |
|||
Rapporteur for the opinion Date appointed |
Roman Haider 8.3.2021 |
|||
Date adopted |
27.9.2021 |
|
|
|
Result of final vote |
+: –: 0: |
42 7 0 |
||
Members present for the final vote |
Magdalena Adamowicz, Andris Ameriks, José Ramón Bauzá Díaz, Izaskun Bilbao Barandica, Paolo Borchia, Marco Campomenosi, Massimo Casanova, Ciarán Cuffe, Johan Danielsson, Karima Delli, Anna Deparnay-Grunenberg, Ismail Ertug, Gheorghe Falcă, Giuseppe Ferrandino, Mario Furore, Søren Gade, Isabel García Muñoz, Jens Gieseke, Elsi Katainen, Kateřina Konečná, Elena Kountoura, Julie Lechanteux, Peter Lundgren, Benoît Lutgen, Elżbieta Katarzyna Łukacijewska, Marian-Jean Marinescu, Tilly Metz, Cláudia Monteiro de Aguiar, Caroline Nagtegaal, Jan-Christoph Oetjen, Philippe Olivier, João Pimenta Lopes, Rovana Plumb, Tomasz Piotr Poręba, Dominique Riquet, Dorien Rookmaker, Massimiliano Salini, Vera Tax, Barbara Thaler, István Ujhelyi, Henna Virkkunen, Petar Vitanov, Elissavet Vozemberg-Vrionidi, Roberts Zīle, Kosma Złotowski |
|||
Substitutes present for the final vote |
Ignazio Corrao, Josianne Cutajar, Tomasz Frankowski, Markus Pieper |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
42 |
+ |
ECR |
Peter Lundgren, Tomasz Piotr Poręba, Roberts Zīle, Kosma Złotowski |
ID |
Paolo Borchia, Marco Campomenosi, Massimo Casanova, Julie Lechanteux, Philippe Olivier |
NI |
Mario Furore |
PPE |
Magdalena Adamowicz, Gheorghe Falcă, Tomasz Frankowski, Jens Gieseke, Elżbieta Katarzyna Łukacijewska, Benoît Lutgen, Marian-Jean Marinescu, Cláudia Monteiro de Aguiar, Markus Pieper, Massimiliano Salini, Barbara Thaler, Henna Virkkunen, Elissavet Vozemberg-Vrionidi |
Renew |
José Ramón Bauzá Díaz, Izaskun Bilbao Barandica, Søren Gade, Elsi Katainen, Caroline Nagtegaal, Jan-Christoph Oetjen, Dominique Riquet |
S&D |
Andris Ameriks, Josianne Cutajar, Johan Danielsson, Ismail Ertug, Giuseppe Ferrandino, Isabel García Muñoz, Rovana Plumb, Vera Tax, István Ujhelyi, Petar Vitanov |
The Left |
Kateřina Konečná, Elena Kountoura |
7 |
- |
NI |
Dorien Rookmaker |
The Left |
João Pimenta Lopes |
Verts/ALE |
Ignazio Corrao, Ciarán Cuffe, Karima Delli, Anna Deparnay-Grunenberg, Tilly Metz |
0 |
0 |
|
|
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON CULTURE AND EDUCATION (5.10.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Sabine Verheyen
SHORT JUSTIFICATION
On 15 December 2020, the European Commission published its legislative proposal for a Digital Services Act (DSA), as a significant step forward in regulating online content at Union level, especially in guaranteeing online safety and the protection of fundamental rights in the digital environment.
The proposed Regulation aims to establish a more accountable online environment by imposing obligations on platforms to act against illegal content, whilst empowering platforms’ users with enhanced transparency and traceability, and better reporting systems.
It sets, as a lex generalis, horizontal and harmonised standards for a wide range of online platforms. The proposal also aims to revise the liability regime of Directive 2000/31/EC (e-Commerce Directive), notably by replacing the ‘notice and take down’ mechanisms by ‘notice and action’ mechanisms. Such a regime, as well as the principle of ‘no general obligation on the providers to monitor’, has been the basis of Directive (EU) 2018/1808 (AVMSD) in regulating Video Sharing Platforms (VSPs).
However, this all-encompassing approach could bring about unintended consequences and interfere with sector-specific legislation. In the media and audiovisual sectors, this leads to overlaps with legislation at national and Union level as well as to legal uncertainties and discrepancies, whilst at the same time restricting Member States when taking regulatory measures to address cultural issues in relation to intermediary online service providers.
Overall, the Rapporteur welcomes the proposal. However, whilst supporting its main objectives, the Rapporteur would like to stress that the Regulation should enable users to continue accessing content that reflects media pluralism, cultural and linguistic diversity, as well as reliable news and information, with due respect to fundamental freedoms such as the freedom of expression.
In this context, the Rapporteur suggests a series of amendments in order to clarify the proposed provisions and improve the objectives they aim to achieve.
More specifically, the main points of the draft opinion are to:
(i) Ensure legal consistency with the AVMSD:
As the Regulation amends certain provisions of the e-Commerce Directive and proposes a series of articles (from Article 14 of the proposal) that overlap or partially cover Article 28b of the AVMSD, it is crucial to ensure that such a revision does not affect the regulation of VSPs as laid down in the revised AVMSD. The Rapporteur therefore considers it crucial that the Directive remains the key legal instrument for harmonising standards for audiovisual content online at Union level, by clarifying that the Regulation does not affect existing or future sectorial measures nor those that aim to promote cultural diversity, media freedom and pluralism.
(ii) Harmonise the existing rules on the removal of illegal content:
The Rapporteur welcomes the fact that the general principles of the liability regime set up in the e-Commerce Directive have been maintained and supports the proposed ‘notice-and-action’ mechanisms as a fundamental requirement for all platforms providing services in the Digital Single Market.
In that regard, clear definitions and effective procedures are of utmost importance. Moreover, in cases of illegal content, such as incitement to terrorism, hate speech, or child sexual abuse material as well as infringements of intellectual property rights, it is crucial to ensure that service providers take quick and effective measures within a short period of time, not only to remove illegal content from their services but also to ensure that such content remains inaccessible after being removed.
(iii) Editorial responsibility:
Media service providers are strictly regulated at both Union and national level, guided by professional editorial standards no matter how their content and services are consumed. It is thus essential to protect editorial independence in the media sector. In that context, the Rapporteur considers that commercial online platforms should not be allowed to exercise a supervisory function over legally distributed online content originating from service providers who exercise editorial responsibility and consistently adhere to Union and national law as well as journalistic and editorial principles. Media service providers should also remain solely responsible for the content and services they produce, as platforms cannot be held either responsible or liable for the content offered by media service providers on their platforms.
(iv) Enhance transparency
The Rapporteur considers that the Regulation should set high transparency standards on all online platforms regarding algorithmic decision-making processes and content recommendations. It is essential that users better understand how platforms’ recommender systems affect the visibility, accessibility and availability of content and services online, as algorithm-based content recommendations may have a serious impact on cultural diversity.
AMENDMENTS
The Committee on Culture and Education calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice and rights. |
Amendment 2
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’). These rights include, among others, the right to freedom of expression and information, freedom and pluralism of the media, the right to privacy and to the protection of personal data, the freedom to conduct a business, the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination. |
Amendment 3
Proposal for a regulation
Recital 9
|
|
Text proposed by the Commission |
Amendment |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
(9) Respecting the Union’s subsidiary competence to take cultural aspects into account in its action under Article 167(4) of the Treaty on the Functioning of the European Union, this Regulation should not affect Member States’ competences in their respective cultural policies, in particular as regards national measures addressed to intermediary service providers in order to protect the freedom of expression and information, media freedom and to foster media pluralism as well as cultural and linguistic diversity. This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply while not affecting the Member States’ competences to adopt and further develop laws, regulations and other measures in order to secure and promote the freedom of expression and information in the media, promoting press freedom in line with the Charter of Fundamental Rights as well as cultural and linguistic diversity. Where those acts leave Member States the possibility of adopting certain measures at national level, this possibility should remain unaffected by this Regulation, in particular their right to adopt stricter measures. In the event of a conflict between this Regulation, on the one hand, and Directive 2010/13/EU or the national transposition instruments adopted by Member States to comply with that Directive, on the other hand, the latter should prevail. |
__________________ |
__________________ |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
Amendment 4
Proposal for a regulation
Recital 11
|
|
Text proposed by the Commission |
Amendment |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected. |
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 of the European Parliament and of the Council1a, as transposed in national laws, which establish specific rules and procedures that should remain unaffected. As a whole, it is necessary for this Regulation to ensure legal certainty for platforms and safeguard the fundamental rights of users. No provision in this Regulation should lead to less favourable solutions for guaranteeing a high level of protection of copyright and related rights than the one prevailing, before or after its entry into force, in the Union’s and its Member States’ positive law relating to the protection of literary and artistic property. |
|
__________________ |
|
1a Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (OJ L 130, 17.5.2019, p. 92). |
Amendment 5
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) Currently, the definitions of ‘illegal content’ vary based on national law, and ambiguous definitions of this term in the Regulation would create an unpredictable regulatory environment for all digital service providers in Europe. Without a clear definition, digital service providers and intermediaries would be held to opaque and unreasonable standards. Confusion about what constitutes illegal content could lead service providers and intermediaries to wrongfully restrict some types of content, which would harm fundamental rights such as the freedom of expression and opinion. Therefore, in order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation, the concept of “illegal content” should underpin the general principle that what is illegal offline should also be illegal online, and what is legal offline should also be legal online. The concept should also be defined appropriately to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use or illegal dissemination of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
Amendment 6
Proposal for a regulation
Recital 13
|
|
Text proposed by the Commission |
Amendment |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, search engines, content-sharing platforms or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms for the purposes of this Regulation, where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. |
Amendment 7
Proposal for a regulation
Recital 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(15a) Ensuring that providers of intermediary services can offer effective end-to-end encryption of data is essential for trust in and the security of digital services in the Digital Single Market, as such encryption effectively prevents unauthorised third-party access. |
Amendment 8
Proposal for a regulation
Recital 18
|
|
Text proposed by the Commission |
Amendment |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. |
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including but not limited to the cases where the provider optimises, promotes or moderates content, beyond offering basic search and indexing functionalities that are absolutely necessary to navigate the content, or incites users to upload content, irrespective of whether the process is automated, or where the information has been developed under the editorial responsibility of that provider. |
Amendment 9
Proposal for a regulation
Recital 18 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(18a) Directive 2000/31/EC provides that the exemptions from liability cover only cases where the activity of the information society service provider is limited to the technical process of operating and giving access to a communication network over which information made available by third parties is transmitted or temporarily stored, for the sole purpose of making the transmission more efficient, and that this activity is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of, nor control over, the information which is transmitted or stored. This implies that all active services are excluded from the limited liability regime. In that context, those exemptions should also not be given to providers of intermediary services that do not comply with the due diligence obligations set out in this Regulation. |
Amendment 10
Proposal for a regulation
Recital 20
|
|
Text proposed by the Commission |
Amendment |
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
(20) A provider of intermediary services that engages with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. |
Amendment 11
Proposal for a regulation
Recital 22
|
|
Text proposed by the Commission |
Amendment |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. In order to ensure harmonised implementation of illegal content removal throughout the Union, the provider should, without delay, remove or disable access to said illegal content. In practice, such an order to remove illegal content could also effectively address the reappearance of this illegal content. If a hosting service provider is ordered by an administrative or judicial authority to prevent infringements, such an order should in principle be limited to a specific infringement and to specific parts of the service, but may be extended to all copies of that specific content, to efficiently ensure that the infringing content does not reappear, taking into account the potential harm the illegal content in question may create. The prevention of the reappearance of illegal content should under no circumstances give rise to a general monitoring obligation or an obligation for the provider to carry out investigations without a specific reason, and safeguards must be established so that stay-down measures never lead to any unavailability of legal content. It should be considered that a general monitoring obligation exists if a hosting service provider is obliged to screen an unspecified amount of information provided by a recipient of the service in order to prevent a specific infringement of the applicable law. The removal or disabling of access should be undertaken with due respect for all relevant principles enshrined in the Charter of Fundamental Rights, including the freedom of expression. The provider can obtain actual knowledge or awareness through its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. |
Amendment 12
Proposal for a regulation
Recital 23
|
|
Text proposed by the Commission |
Amendment |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. |
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average consumer. |
Amendment 13
Proposal for a regulation
Recital 25
|
|
Text proposed by the Commission |
Amendment |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
(25) In order to create legal certainty and not to discourage activities undertaken for the purpose of detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
Amendment 14
Proposal for a regulation
Recital 26
|
|
Text proposed by the Commission |
Amendment |
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. |
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall the generally important role played by those providers. In many cases, such providers may be the best placed to address the problem of illegal content and activities by removing or limiting access to such content or by bringing such activities to an end. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is appropriate to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. |
Amendment 15
Proposal for a regulation
Recital 28
|
|
Text proposed by the Commission |
Amendment |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. |
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation, concerning content which is identical to the content which was previously declared unlawful, or blocking access to that content, as well as to equivalent content which remains essentially unchanged compared with the content which gave rise to the finding of illegality. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content, or as impeding their ability to undertake proactive measures to identify and remove illegal content and to prevent its reappearance. |
Amendment 16
Proposal for a regulation
Recital 28 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(28a) Since editorial content providers hold editorial responsibility for the content and services they make available, a presumption of legality should exist in relation to the content provided by those providers who carry out their activities with respect for European values and fundamental rights. Such content and services should benefit from a specific regime that prevents multiple layers of control over that content and those services. That content and those services should be offered in accordance with professional and journalistic standards, as well as with legislation, and are already subject to systems of supervision and control, often enshrined in commonly accepted self-regulatory standards and codes. In addition, editorial content providers usually have in place complaints-handling mechanisms to resolve content-related disputes. Editorial responsibility means the exercise of effective control both over the selection of content and over its provision by means of its presentation, composition and organisation. Editorial responsibility does not necessarily imply any legal liability under national law for the content or the services provided. In any case, any provider of an audiovisual media service as defined in Article 1(1), point (a), of Directive 2010/13/EU and publishers of press publications as defined in Article 2, point (4), of Directive (EU) 2019/790 should be considered as editorial content providers for the purposes of this Regulation. Intermediary service providers should refrain from removing, suspending or disabling access to any such content or services, and should be exempt from liability for such content and services. Compliance by editorial content providers with these rules should be overseen by the respective independent regulatory authorities or bodies, or both, and by the respective European networks in which they are organised. |
Amendment 17
Proposal for a regulation
Recital 31 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(31a) It is imperative that the Commission ensure the proper enforcement of this Regulation at Union level, across Member States, in order to avoid potential inequalities, differences of approach and unfair competition within or from outside the Union. |
Amendment 18
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives, such as freedom of information, data security and the trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights to freedom of expression and protection against discrimination enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
Amendment 19
Proposal for a regulation
Recital 36 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(36a) Very large online platforms should provide for the possibility to communicate with their points of contact in each official language of the Member States where they provide services. Other providers of intermediary services should ensure that the choice of language does not impose a disproportionate burden on Member States' authorities and should make every effort to establish effective communication options. A possible language barrier should not be invoked as a reason to ignore or deny communication with a Member State's authorities and should not be used as an excuse for inaction. Where necessary, Member States' authorities and providers of intermediary services may reach a separate agreement on the language of communication. |
Amendment 20
Proposal for a regulation
Recital 38
|
|
Text proposed by the Commission |
Amendment |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. |
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of the rights of the recipients of the service and the avoidance of unfair or arbitrary outcomes. Terms and conditions should be summarised in a clear, accessible and easily comprehensible manner, while offering the possibility of opting out of optional clauses. Intermediary service providers should be prohibited from drawing up terms and conditions that go against Union and national law and that lead to the removal of, disabling of access to, or other kinds of interference with, content and services of editorial content providers. The freedom and pluralism of the media should be respected. To this end, Member States should ensure that editorial content providers have the possibility to contest decisions of online platforms or to seek judicial redress in accordance with the national law of the Member State concerned. |
Amendment 21
Proposal for a regulation
Recital 38 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(38b) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation. |
Amendment 22
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should draw up an annual report which they should make publicly available, in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions, a comprehensive anonymised statistical analysis of the measures taken, the misuse of services and manifestly unfounded notices or complaints under the mechanisms established under this Regulation and, where the platform is an online marketplace, information on their business users. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40 or to not-for-profit services with fewer than 100 000 monthly active users. |
_________________ |
_________________ |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 23
Proposal for a regulation
Recital 39 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(39a) Recipients of the service should be empowered to make autonomous decisions inter alia regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, it is possible for providers of intermediary services to exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose, by deceiving or nudging recipients of the service and subverting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark pattern’). Providers of intermediary services should be prohibited from using such dark patterns. This includes, but is not limited to, exploitative design choices to direct the user to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, repetitively requesting or pressuring the recipient to make a decision or hiding or obscuring certain options. |
Amendment 24
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Moreover, the notice and action mechanism should be supplemented by actions aimed at preventing the reappearance of content which has been identified as illegal or which is identical to content which had been identified and withdrawn as illegal. The application of this requirement should in no way lead to a general monitoring obligation. |
Amendment 25
Proposal for a regulation
Recital 42
|
|
Text proposed by the Commission |
Amendment |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. |
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of efficient, proportionate and accurate automated means accompanied by human oversight, the provider should inform the recipient of its decision, the reasons for its decision and the available effective redress possibilities to rapidly contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. |
Amendment 26
Proposal for a regulation
Recital 42 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(42a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in overblocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood as making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation, the provider should ensure human review and the protection of legal content. |
Amendment 27
Proposal for a regulation
Recital 43
|
|
Text proposed by the Commission |
Amendment |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. |
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations, which is to be encouraged. |
_________________ |
_________________ |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 28
Proposal for a regulation
Recital 44
|
|
Text proposed by the Commission |
Amendment |
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
(44) Recipients of the service, including persons with disabilities, should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. Such internal systems should also be available to individuals or entities that have submitted a notice. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned. |
Amendment 29
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, have significant legitimate interests and have demonstrated competence for the purposes of detecting, identifying and notifying illegal content, and that they work in a diligent and objective manner. Such entities can also be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, right holders’ representatives and organisations of industry could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions, including their competence and objectivity. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
__________________ |
__________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 30
Proposal for a regulation
Recital 46 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(46a) Trusted flaggers should also be able to submit complaints to the Digital Services Coordinators about those activities by online platforms that create a systemic risk. |
Amendment 31
Proposal for a regulation
Recital 47
|
|
Text proposed by the Commission |
Amendment |
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
(47) The misuse of services of online platforms by repeatedly providing or disseminating illegal content or by repeatedly submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law. |
Amendment 32
Proposal for a regulation
Recital 48 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(48a) Online transparency requirements for commercial entities are vital for ensuring accountability, trust and access to effective redress. To this end, Article 5 of Directive 2000/31/EC establishes general information requirements that service providers must fulfil towards service recipients and competent authorities. In addition, Article 6 of Regulation (EU) 2016/679 allows for the processing and disclosure of all information on domain name holders from the WHOIS database for the performance of tasks carried out in the public interest, and a number of Member States require their national country code top-level domain registries to make such information publicly accessible. However, the lack of effective enforcement of Article 5 of Directive 2000/31/EC and the often outdated and inaccurate information contained within the WHOIS database emphasise the need to put in place a clear obligation for providers of intermediary services to verify the identity of their business customers. The ‘know your business customer’ obligation should also prohibit providers of intermediary services from providing their services to unverified customers and oblige them to cease the provision of their services when the identification provided proves to be incomplete, inaccurate or fraudulent. |
Amendment 33
Proposal for a regulation
Recital 49
|
|
Text proposed by the Commission |
Amendment |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers and other users, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter the selling and dissemination of products and services in violation of the applicable rules, all providers of intermediary services, including hosting providers, domain name registrars, providers of content delivery networks, proxy and reverse proxy providers, online marketplaces, online payment service providers and online advertising service providers should ensure that their business customers are traceable. The business customer should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to business customers that promote messages on products or services on behalf of brands, based on underlying agreements. Providers of intermediary services should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed and verified, in accordance with the applicable law, including on the protection of personal data, by the providers of intermediary services, public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. |
Amendment 34
Proposal for a regulation
Recital 50
|
|
Text proposed by the Commission |
Amendment |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of intermediary services covered should make reasonable efforts to verify the reliability of the information provided by their business customers concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System, or by requesting their business customers concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of intermediary services covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers of intermediary services, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability and accuracy of the information towards consumers or other interested parties. Such providers of intermediary services should update the information they hold on a risk-sensitive basis, and at least once a year, and also design and organise their online interface in a way that enables their business customers to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 . |
__________________ |
__________________ |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers |
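As a rough illustration of the verification workflow in amended recital 50, the sketch below pairs a placeholder lookup against an official database such as the VAT Information Exchange System with a risk-sensitive refresh schedule capped at one year. `check_official_register` is a hypothetical stand-in, not a client for any real interface, and the cadence formula is an assumption.

```python
from datetime import date, timedelta

MAX_INTERVAL = timedelta(days=365)  # "at least once a year"

def check_official_register(country_code: str, registration_number: str) -> bool:
    # Placeholder for a query to a freely available official online database,
    # e.g. a national trade register or the VAT Information Exchange System.
    raise NotImplementedError

def next_verification_due(last_verified: date, risk_score: float) -> date:
    # Risk-sensitive cadence: the riskier the customer, the sooner the re-check,
    # but never later than one year after the previous verification.
    risk_score = min(max(risk_score, 0.0), 1.0)
    interval = MAX_INTERVAL * (1.0 - 0.9 * risk_score)
    return last_verified + min(interval, MAX_INTERVAL)
```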
Amendment 35
Proposal for a regulation
Recital 51
|
|
Text proposed by the Commission |
Amendment |
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union. |
(51) Very large online platforms are used in a way that strongly influences online safety, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to their own benefit with their advertising-driven business models, which can cause societal concerns. In the absence of effective regulation and enforcement, they can set up the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures. |
Amendment 36
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Advertising funding helps to ensure that European citizens can enjoy news and entertainment services for free or at a reduced rate. Without effective advertising, funding for all sorts of media would be greatly reduced, which may lead to more expensive TV-subscriptions, reduced newspapers and magazines’ plurality and independence, and some radio stations would lack the ability to provide news and entertainment throughout the day, to the detriment of media pluralism and cultural diversity. Advertising is a key source of growth for a number of audiovisual media service providers, press publishers and radio stations. The use of data, in full compliance with the obligations set out in the Regulation (EU) 2016/679 and Directive 2002/58/EC, is a way to improve the effectiveness of advertising. It is therefore important for this regulation to focus on delivering more advertising transparency while not negatively affecting the effectiveness of advertising for news and entertainment services. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling, and opt for less intrusive forms of advertising that do not require any tracking of user interaction with content. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for advertising. Additionally, online platforms should provide recipients of the service to whom they supply online advertising, when requested and to the extent possible, with information that allows recipients of the service to understand how data was processed, categories of data or criteria on the basis of which ads may appear, and data that was disclosed to advertisers or third parties, and refrain from using any aggregated or non-aggregated data, which may include anonymised and personal data without the explicit consent of the data subject. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
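The individualised disclosures this amendment lists for each advertisement (on whose behalf it is displayed, the main targeting parameters, whether profiling is involved, and a less intrusive non-tracking alternative) map naturally onto a small record. The field names below are assumptions made for illustration, not a schema set by the Regulation.

```python
from dataclasses import dataclass, field

@dataclass
class AdDisclosure:
    on_behalf_of: str                       # the natural or legal person behind the ad
    displayed_at: str                       # when the advertisement was displayed
    main_parameters: list[str] = field(default_factory=list)
    uses_profiling: bool = False
    explicit_consent_obtained: bool = False

def contextual_alternative(ad: AdDisclosure) -> AdDisclosure:
    # A less intrusive form of advertising that requires no tracking of
    # user interaction with content: targeting by page context only.
    return AdDisclosure(
        on_behalf_of=ad.on_behalf_of,
        displayed_at=ad.displayed_at,
        main_parameters=["page context only"],
        uses_profiling=False,
    )
```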
Amendment 37
Proposal for a regulation
Recital 53
|
|
Text proposed by the Commission |
Amendment |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. |
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, specifically regarding disinformation, online harassment, hate speech or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result. |
Amendment 38
Proposal for a regulation
Recital 56
|
|
Text proposed by the Commission |
Amendment |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures. |
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the impact on fundamental rights of the functioning and use of their service, as well as of potential misuses by the recipients of the service, and take appropriate mitigating measures, including by adapting algorithmic recommender systems and online interfaces, in particular as regards their potential for amplifying certain content, including disinformation. |
Amendment 39
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products or the illegal display of copyright protected content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may be incorporated in the basic programming of the algorithms used by the very large online platform or arise from the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms' terms and conditions including content moderation policies are enforced. Therefore, it is necessary to promote adequate changes in platforms' conduct, a more accountable information ecosystem, enhanced fact-checking capabilities and collective knowledge on disinformation, and the use of new technologies in order to improve the way information is produced and disseminated online. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 40
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the risks identified in the risk assessment. Very large online platforms should under such mitigating measures enhance or otherwise adapt the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they limit the dissemination of illegal content, for instance by building in systems to demote content identified as harmful, introducing artificial delays to limit virality, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources such as public interest information provided by public authorities or international organisations or content under the control of an editorial content provider and subject to specific standards, sector-specific regulation and oversight. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. Mitigation of risks which would lead to removal, disabling access to or otherwise interfering with content and services for which an editorial content provider holds editorial responsibility should not be considered reasonable or proportionate. |
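Two of the mitigating measures named in this amendment, demoting content identified as harmful and introducing artificial delays to limit virality, can be pictured as simple ranking adjustments. The toy functions below are sketches under assumed thresholds, not a description of how any platform actually implements these measures.

```python
def demoted_score(score: float, harm_probability: float) -> float:
    # Scale a ranking score down in proportion to an assessed harm probability.
    return score * (1.0 - harm_probability)

def virality_delay_seconds(shares_last_hour: int, threshold: int = 1000) -> float:
    # Introduce an artificial propagation delay once sharing passes a threshold,
    # growing with spread and capped at ten minutes (both values assumed).
    if shares_last_hour <= threshold:
        return 0.0
    return min(600.0, (shares_last_hour - threshold) / 10.0)
```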
Amendment 41
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, relevant regulatory authorities, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
Amendment 42
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. These recommender systems can also have an impact on consumers' media consumption and cultural practices and might lead to their being enclosed in a bubble without enabling them to discover other content. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
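A minimal sketch of the choice this recital requires: the main parameters of each recommender option are stated comprehensibly, and at least one option is not based on profiling of the recipient. The option names and parameters below are invented for illustration.

```python
# Main parameters per option, presented "in an easily comprehensible manner".
RECOMMENDER_OPTIONS = {
    "personalised": ["viewing history", "inferred interests", "engagement signals"],
    "chronological": ["time of posting only"],            # not based on profiling
    "most_popular": ["aggregate popularity, no personal data"],
}

def choose_recommender(settings: dict, option: str) -> dict:
    if option not in RECOMMENDER_OPTIONS:
        raise ValueError(f"unknown recommender option: {option}")
    settings["recommender"] = option
    settings["based_on_profiling"] = option == "personalised"
    return settings
```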
Amendment 43
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with negative impact on public health, public security, civil discourse, political participation and equality. Repositories should be searchable, easy to access and functional and should include the content of advertisements and related data on the advertiser and the delivery of the advertisement. |
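Read as a data requirement, the amended recital asks for a public repository that is searchable and that holds the advertisement itself, the advertiser, and delivery data. A minimal illustrative schema and lookup, using SQLite for self-containment, might look as follows; every column name is an assumption.

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS ad_repository (
    ad_id       TEXT PRIMARY KEY,
    content     TEXT NOT NULL,   -- the advertisement displayed
    advertiser  TEXT NOT NULL,   -- on whose behalf it was displayed
    first_shown TEXT NOT NULL,   -- delivery data
    last_shown  TEXT NOT NULL,
    impressions INTEGER NOT NULL
);
"""

def search_by_advertiser(conn: sqlite3.Connection, advertiser: str) -> list[tuple]:
    # "Searchable, easy to access and functional": one obvious query path.
    return conn.execute(
        "SELECT ad_id, content, first_shown, last_shown, impressions"
        " FROM ad_repository WHERE advertiser = ?",
        (advertiser,),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
```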
Amendment 44
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems by providing relevant source code and associated data that allow the detection of possible biases or threats to fundamental rights for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations of possible biases or threats to fundamental rights are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides frameworks for compelling access to data from very large online platforms to the Digital Services Coordinator and the Commission. All requirements for access to data under those frameworks should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets in line with Directive (EU) 2016/943 of the European Parliament and of the Council1a and the privacy of any other parties concerned, including the recipients of the service. |
|
__________________ |
|
1a Directive (EU) 2016/943 of the European Parliament and of the Council of 8 June 2016 on the protection of undisclosed know-how and business information (trade secrets) against their unlawful acquisition, use and disclosure (OJ L 157, 15.6.2016, p. 1). |
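The access framework this recital describes can be pictured as a scoped, structured request from the Digital Services Coordinator of establishment or the Commission, carrying the safeguards the text names. The payload below is an invented illustration, not a format defined by the Regulation.

```python
import json

def build_data_access_request(authority: str, platform: str, purpose: str,
                              data_categories: list[str]) -> str:
    # Assemble a proportionate, purpose-limited request (illustrative only).
    request = {
        "requesting_authority": authority,   # DSC of establishment or the Commission
        "addressee": platform,
        "purpose": purpose,                  # e.g. testing a recommender system
        "data_categories": data_categories,  # limited to what is necessary
        "safeguards": [
            "trade secrets protected in line with Directive (EU) 2016/943",
            "privacy of the recipients of the service",
        ],
    }
    return json.dumps(request, indent=2)
```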
Amendment 45
Proposal for a regulation
Recital 64 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(64a) Moderation and recommendation algorithms used by very large online platforms pose high risks and require closer and further regulatory supervision, because of the presence of algorithmic biases, which often leads to a massive dissemination of illegal content or threats to fundamental rights including freedom of expression. Taking into account the permanent evolution of these algorithms and the immediate risks they could generate when deployed, very large online platforms should ensure full and real-time disclosure of moderation and recommendation algorithms to the Digital Services Coordinator or the Commission. Such disclosure should include all the data regarding the creation and the setting of those algorithms, such as corresponding datasets. To facilitate the supervision of the Digital Services Coordinator or the Commission, this Regulation provides a framework of obligations for very large online platforms, including explainability of algorithms, accountability and close cooperation with the Digital Services Coordinator or the Commission. Where an algorithmic bias is detected, very large online platforms should correct it expeditiously, following requirements from the Digital Services Coordinator or the Commission. |
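One concrete way to read 'detecting an algorithmic bias' in moderation outcomes is as a disparity test, for instance comparing removal rates across two groups of content or users. The disparate-impact ratio below is a common illustrative metric with an assumed tolerance; the Regulation does not prescribe any particular test.

```python
def removal_rate(removed: int, total: int) -> float:
    return removed / total if total else 0.0

def disparate_impact_ratio(rate_a: float, rate_b: float) -> float:
    # Ratio of moderation rates between two groups; 1.0 means parity.
    if rate_b == 0.0:
        return float("inf") if rate_a > 0.0 else 1.0
    return rate_a / rate_b

def bias_detected(ratio: float, tolerance: float = 0.25) -> bool:
    # Flag for expeditious correction if the rates differ by more than
    # the assumed tolerance (25 % here, purely for illustration).
    return abs(ratio - 1.0) > tolerance
```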
Amendment 46
Proposal for a regulation
Recital 65 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(65a) Given the cross-border nature of the services at stake, Union action to harmonise accessibility requirements for very large online platforms across the internal market is necessary to avoid market fragmentation and to ensure that equal right to access and choice of those services for persons with disabilities is guaranteed. Lack of harmonised accessibility requirements for digital services can create barriers for the implementation of existing Union law on accessibility, as many of the services falling under that law will rely on intermediary services to reach end-users. Therefore, accessibility requirements for very large online platforms, including their user interfaces, needs to be consistent with existing Union law on accessibility, including Directives (EU) 2016/21021a and (EU) 2019/8821b of the European Parliament and of the Council. |
|
__________________ |
|
1a Directive (EU) 2016/2102 of the European Parliament and of the Council of 26 October 2016 on the accessibility of the websites and mobile applications of public sector bodies (OJ L 327, 2.12.2016, p. 1). |
|
1b Directive (EU) 2019/882 of the European Parliament and of the Council of 17 April 2019 on the accessibility requirements for products and services (OJ L 151, 7.6.2019, p. 70). |
Amendment 47
Proposal for a regulation
Recital 66
|
|
Text proposed by the Commission |
Amendment |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. |
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, interoperability of content hosting platforms or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. |
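The recital's example of standardised notice submission 'including through application programming interfaces' could reduce to a small, uniform payload. The fields below loosely follow the notice-and-action elements of the proposal but are otherwise assumptions; no real endpoint is implied.

```python
from dataclasses import asdict, dataclass

@dataclass
class IllegalContentNotice:
    explanation: str        # why the content is considered illegal
    exact_location: str     # e.g. one or more exact URLs
    notifier_name: str
    notifier_email: str
    good_faith_statement: bool

def to_api_payload(notice: IllegalContentNotice) -> dict:
    # A real client would POST this to the provider's notice endpoint.
    if not notice.good_faith_statement:
        raise ValueError("a notice must include a good-faith statement")
    return asdict(notice)
```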
Amendment 48
Proposal for a regulation
Recital 67
|
|
Text proposed by the Commission |
Amendment |
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
(67) The Commission and the Board should be able to request and coordinate the drawing-up of codes of conduct to contribute to the application of this Regulation. The implementation of codes of conduct should be measurable and subject to public oversight. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct. |
Amendment 49
Proposal for a regulation
Recital 68
|
|
Text proposed by the Commission |
Amendment |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. |
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal or harmful content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of risks on society, such as coordinated operations aimed at amplifying information, for instance through the use of bots, fake accounts and proxy services for the creation and propagation of fake or misleading information, sometimes with a purpose of obtaining economic or political gain, which are particularly harmful for vulnerable recipients of the service. Other areas for consideration could be to improve transparency regarding the origin of information and the way it is produced, sponsored, disseminated and targeted, to promote diversity of information through support of high quality journalism and relation between information creators and distributors, and to foster credibility of information by providing an indication of its trustworthiness, and improving traceability of information of influential information providers, whilst respecting confidentiality of journalistic sources. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal by an online platform of the Commission’s invitation to participate in the application of such a code of conduct must be taken into account when determining whether the online platform has infringed the obligations laid down by this Regulation. When codes of conduct are used as a risk mitigating measure, they should be binding for very large online platforms, subject to oversight by the Digital Services Coordinator. |
Amendment 50
Proposal for a regulation
Recital 73 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(73a) The designation of a Digital Services Coordinator in the Member States should be without prejudice to already existing enforcement mechanisms, such as in the Union or national law on electronic communication or media, and independent regulatory structures in these fields as defined by Union and national law. The competences of the Digital Services Coordinator should not interfere with those of the appointed authorities. The different European networks, in particular the European Regulators Group for Audiovisual Media Services (ERGA) and the Body of European Regulators for Electronic Communications (BEREC), should be responsible for ensuring coordination and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. For the effective implementation of this task, those networks should develop suitable procedures to be applied in cases concerning this Regulation. |
Amendment 51
Proposal for a regulation
Recital 76 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(76a) Consumers, consumer organisations and rights holders should be able to lodge any complaint related to compliance of a marketplace with this Regulation with the Digital Services Coordinator in the Member State where they are based. Complaints should provide a faithful overview of issues related to a particular intermediary service provider’s compliance. The Digital Services Coordinator should involve national competent authorities and inform the Member State where the intermediary service provider concerned is established if the issue requires cross-border cooperation. Complaints should be dealt with in a timely manner, no later than one month from the receipt of a complaint. |
Amendment 52
Proposal for a regulation
Recital 77
|
|
Text proposed by the Commission |
Amendment |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. |
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Member States should also consider providing specialised training, in cooperation with Union bodies, offices and agencies, for relevant national authorities, in particular administrative authorities, who are responsible for issuing orders to act against illegal content and provide information. |
Amendment 53
Proposal for a regulation
Recital 81
|
|
Text proposed by the Commission |
Amendment |
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. |
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations as well as parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. |
Amendment 54
Proposal for a regulation
Recital 91
|
|
Text proposed by the Commission |
Amendment |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be encouraged to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, intellectual property, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. |
Amendment 55
Proposal for a regulation
Recital 98 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(98a) In order to ensure the effective enforcement of this Regulation, the Commission should intervene where a common pattern of non-compliance with orders issued by national judicial or administrative authorities is identified by at least three Digital Services Coordinators or by the Board vis-à-vis the same online platform, irrespective of its size. A common pattern of non-compliance may be established, among others, in light of a manifest disregard or unjustified delays in executing mandatory orders issued by national judicial or administrative authorities concerning illegal content or requests of information, in accordance with Articles 8 and 9 of this Regulation. |
Amendment 56
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
Amendment 57
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
|
|
Text proposed by the Commission |
Amendment |
5. This Regulation is without prejudice to the rules laid down by the following: |
5. This Regulation shall not affect the rules laid down by the following: |
Amendment 58
Proposal for a regulation
Article 1 – paragraph 5 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) Union law on copyright and related rights; |
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 and the national transposition instruments adopted by Member States to comply with the Directive; |
Amendment 59
Proposal for a regulation
Article 1 – paragraph 5 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
This Regulation shall not affect the competences of Member States to adopt legislation addressed to providers of intermediary services, aimed at protecting or promoting freedom of expression and information, media freedom and pluralism and cultural and linguistic diversity, where the adoption of such legislation is deemed necessary so as to ensure, protect and promote the freedom of information and of the media or to foster the diversity of the media and diversity of opinion or cultural and linguistic diversity. |
Amendment 60
Proposal for a regulation
Article 1 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. Any contractual provisions between a provider of intermediary services and a trader, a business user or a recipient of the service which are contrary to this Regulation shall be invalid. This Regulation shall apply irrespective of the law applicable to contracts concluded between providers of intermediary services and a recipient of the service, a consumer, a trader or business user. |
Amendment 61
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) ‘business customer’ means: |
|
- legal entities, except any entity which qualifies as a large undertaking within the meaning of Article 3(4) of Directive 2013/34/EU of the European Parliament and the Council1a; |
|
- any natural person that purchases a type or amount of service indicative of, or that otherwise indicates, the intent to operate a business online, or contracts for the purchase of more than EUR 10 000 of services provided by the intermediary service provider in a one-year period; |
|
____________________ |
|
1a Directive 2013/34/EU of the European Parliament and of the Council of 26 June 2013 on the annual financial statements, consolidated financial statements and related reports of certain types of undertakings, amending Directive 2006/43/EC of the European Parliament and of the Council and repealing Council Directives 78/660/EEC and 83/349/EEC (OJ L 182, 29.6.2013, p. 19). |
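The natural-person limb of the 'business customer' definition in point (ea) turns on a quantitative threshold (more than EUR 10 000 of services purchased in a one-year period) or on other indications of an intent to operate a business online, while legal entities qualify unless they are large undertakings. The predicate below is a direct, illustrative transcription of that test.

```python
THRESHOLD_EUR = 10_000

def is_business_customer(is_legal_entity: bool,
                         is_large_undertaking: bool,
                         yearly_spend_eur: float,
                         indicates_business_intent: bool) -> bool:
    # Sketch of the 'business customer' test in point (ea); illustrative only.
    if is_legal_entity:
        return not is_large_undertaking   # large undertakings are carved out
    return yearly_spend_eur > THRESHOLD_EUR or indicates_business_intent
```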
Amendment 62
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
|
|
Text proposed by the Commission |
Amendment |
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service; |
— a ‘hosting’ service that consists of the storage or the permission of storage of information provided by, and at the request of, a recipient of the service; |
Amendment 63
Proposal for a regulation
Article 2 – paragraph 1 – point g
|
|
Text proposed by the Commission |
Amendment |
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
(g) ‘illegal content’ means any information made available which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
Amendment 64
Proposal for a regulation
Article 2 – paragraph 1 – point g b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(gb) ‘personal data’ means personal data as defined in Article 4 point (1) of Regulation (EU) 2016/679; |
Amendment 65
Proposal for a regulation
Article 2 – paragraph 1 – point h
|
|
Text proposed by the Commission |
Amendment |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other principal service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. |
Amendment 66
Proposal for a regulation
Article 2 – paragraph 1 – point o
|
|
Text proposed by the Commission |
Amendment |
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
(o) ‘recommender system’ means a fully or partially automated system, designed as a separate tool from the principal service offered and used by an online platform to suggest, rank, prioritise, select and display in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; |
Amendment 67
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘editorial content provider’ means the natural or legal person who has editorial responsibility for the content and services they offer, determines the manner in which the content and services are organised, is subject to sector-specific regulation, including self-regulatory standards in the media and press sectors, and has put in place complaints-handling mechanisms to resolve content-related disputes; |
Amendment 68
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qb) ‘persons with disabilities’ means persons with disabilities as defined in Article 3 point (1) of Directive (EU) 2019/882; |
Amendment 69
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service on condition that the provider: |
1. Where an information society service is provided that consists of the storage or the permission of storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service on condition that the provider: |
Amendment 70
Proposal for a regulation
Article 5 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content. |
Where the illegal activity or the illegal content pertains to the broadcast of a live sports or entertainment event, the condition under point (b) of the first subparagraph shall be considered to be fulfilled if the provider acts immediately or as fast as possible, and in any event within 30 minutes of obtaining knowledge or awareness of that illegal activity or illegal content. |
Amendment 71
Proposal for a regulation
Article 5 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Without prejudice to specific deadlines set out in Union or national law, providers of hosting services shall, upon obtaining actual knowledge or awareness of illegal content, remove or disable access to that content as soon as possible and in any event within 24 hours. Where the provider of hosting services cannot comply with this obligation on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority having issued an order pursuant to Article 8 or the recipient of the service having submitted a notice pursuant to Article 14, of those grounds. |
Amendment 72
Proposal for a regulation
Article 5 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. |
3. Paragraph 1 shall not apply with respect to liability, where hosting services, including online platforms, present a specific item of information or otherwise enable a specific transaction at issue in a way that would lead an average and reasonably well-informed recipient to believe that the information, or the product or service that is the object of the transaction, is provided either by the hosting service provider itself or by a recipient of the service who is acting under its authority or control. This is notably the case where online platforms present the information in a way that is not neutral as it specifically relates to the profile of the recipient of the service in order to maximise profit and the attention of the recipient of the service. This is also the case where an online platform organises or promotes the information, products or services in such a way that the platform decides, based on human intervention or algorithms, which information, products or services are accessed or found and how that access is achieved. |
|
Paragraph 1 of this Article shall not apply to hosting services in respect of editorially controlled advertisement content as defined in Article 2(n). |
|
Providers of intermediary services shall not be exempt from liability as referred to in Articles 3, 4 and 5, where their main purpose is to engage in or facilitate illegal activities. |
Amendment 73
Proposal for a regulation
Article 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 5a |
|
Providers of intermediary services shall be deemed ineligible for the exemptions from liability as referred to in Articles 3, 4 and 5 and liable to pay penalties in accordance with Article 42, where they do not comply with the due diligence obligations set out in this Regulation. |
Amendment 74
Proposal for a regulation
Article 6 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities undertaken for the specific purpose of detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. |
Amendment 75
Proposal for a regulation
Article 6 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Measures taken pursuant to the first subparagraph shall be effective, proportionate, specific, targeted and in accordance with the Charter. |
Amendment 76
Proposal for a regulation
Article 7 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. |
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. No provision of this Regulation shall be understood as prescribing, promoting or recommending the use of automated decision-making or the monitoring of the behaviour of a large number of natural persons, not even for statistical purposes. |
Amendment 77
Proposal for a regulation
Article 7 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Providers of intermediary services shall not be obliged to use automated tools for content moderation. |
Amendment 78
Proposal for a regulation
Article 7 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity. |
Amendment 79
Proposal for a regulation
Article 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 7a |
|
Prohibition of interference with content and services offered by editorial content providers |
|
Intermediary service providers shall not remove, disable access to or otherwise interfere with content and services made available by editorial content providers. |
|
Editorial content providers’ accounts shall not be suspended on the grounds of the legal content and services they offer. This Article shall not affect the possibility for an independent judicial or independent administrative authority in line with Directive 2010/13/EU to require the editorial content provider to terminate or prevent an infringement of applicable Union or national law. |
Amendment 80
Proposal for a regulation
Article 8 – title
|
|
Text proposed by the Commission |
Amendment |
Orders to act against illegal content |
Cross-border orders to act against illegal content |
Amendment 81
Proposal for a regulation
Article 8 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. |
1. Providers of intermediary services shall, upon the receipt of a cross-border order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union and national law, take the measures necessary to comply with the order and inform the authority issuing the order of its receipt and of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken. Provided that the necessary safeguards are in place, such orders could, in particular, consist of catalogue-wide and dynamic injunctions by courts or administrative authorities requiring the cross-border termination or prevention of any infringement. |
Amendment 82
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
— one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned; |
— additional information enabling the identification of the illegal content concerned; |
Amendment 83
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider of the service and to the recipient of the service who provided the content; |
— information about redress available to the provider of the service and to the recipient of the service who provided the content, including information about effective remedies; |
Amendment 84
Proposal for a regulation
Article 8 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. |
(c) the order is sent to the point of contact, appointed by the provider, in accordance with Article 10. |
Amendment 85
Proposal for a regulation
Article 9 – title
|
|
Text proposed by the Commission |
Amendment |
Orders to provide information |
Cross-border orders to provide information |
Amendment 86
Proposal for a regulation
Article 9 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. |
1. Providers of intermediary services shall, upon receipt of a cross-border order to provide information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. |
Amendment 87
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
|
|
Text proposed by the Commission |
Amendment |
— information about redress available to the provider and to the recipients of the service concerned; |
— information about the content of the order and redress available to the provider and to the recipients of the service concerned; |
Amendment 88
Proposal for a regulation
Article 9 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control; |
(b) the order only requires the provider to provide information enabling the identification of recipients of the service and which lies within its control; |
Amendment 89
Proposal for a regulation
Article 9 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10; |
(c) the order is sent to the point of contact appointed by that provider, in accordance with Article 10; |
Amendment 90
Proposal for a regulation
Article 10 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Providers of intermediary services shall specify in the information referred to in paragraph 2, the official language or languages of the Union, which can be used to communicate with their points of contact and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established. |
3. Providers of intermediary services shall specify in the information referred to in paragraph 2, the official language or languages of the Union, which can be used to communicate with their points of contact and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established. Very large online platforms shall provide for the possibility to communicate with their points of contact in each official language of the Member States where they provide services. |
Amendment 91
Proposal for a regulation
Article 11 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. |
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative at least in one of the Member States where the provider offers its services. The right of Member States to require very large online platforms to designate a legal representative in their countries remains unaffected. |
Amendment 92
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Terms and conditions of providers of intermediary services shall respect the principles of human rights as enshrined in the Charter and international law. Providers of intermediary services shall include and publish information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the language in which the service is offered. Providers of intermediary services shall inform the recipients of their services of changes to their terms and conditions in a timely manner. |
Amendment 93
Proposal for a regulation
Article 12 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in a clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such summary versions shall include the main elements of the information requirements, including the possibility of easily opting out of optional clauses as well as information on remedies and redress mechanisms available, such as the possibility to modify or influence the main parameters of recommender systems and advertisement options. |
Amendment 94
Proposal for a regulation
Article 12 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Very large online platforms shall ensure that their terms and conditions as well as their other policies, procedures, measures and tools used for the purpose of content moderation are applied and enforced in accordance with Article 26(2). |
Amendment 95
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, in compliance with procedural safeguards and with due regard to Union and national law and to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter. |
Amendment 96
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Terms and conditions, or specific provisions thereof, community standards or any other internal guidelines or tools implemented by an intermediary service provider shall be applied in compliance with Article 7a. Providers of intermediary services shall ensure that their terms and conditions as well as other policies, procedures, measures and tools used for the purpose of content moderation are applied and enforced in such a way as to prohibit any removal, suspension, disabling access to or any other interference with editorial content and services of an editorial content provider or their account in relation to the legal content offered by that editorial content provider. This Article shall not affect the possibility for an independent judicial or independent administrative authority in line with Directive 2010/13/EU to require the editorial content provider to terminate or prevent an infringement of applicable Union or national law. |
|
Intermediary service providers shall notify editorial content providers pursuant to Article 7a beforehand of any proposed changes to their terms and conditions and to their parameters or algorithms that might affect the organisation, presentation and display of content and services offered by the editorial content provider. The proposed changes shall not be implemented before the expiry of a notice period that is reasonable and proportionate to the nature and extent of the proposed changes and their impact on editorial content providers and the content and services they offer. That period shall begin on the date on which the online intermediary service provider notifies the editorial content providers of the proposed changes. |
|
The provision by an editorial content provider of new content and services using the intermediary services before the expiry of the notice period shall not be considered as a conclusive or affirmative action, given that such content is of particular importance for the exercise of fundamental rights, in particular the freedom of expression and information. |
|
Member States shall ensure that editorial content providers have the possibility to contest decisions of online platforms or to seek judicial redress in accordance with the national law of the Member State concerned. |
Amendment 97
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services shall be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant. |
Amendment 98
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. Terms and conditions that do not comply with this Article shall not be binding on recipients. |
Amendment 99
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible, detailed and accessible reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
Amendment 100
Proposal for a regulation
Article 13 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, separately for each Member State, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; |
Amendment 101
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by category, including the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
Amendment 102
Proposal for a regulation
Article 13 – paragraph 1 – point b a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ba) the number of fact-checkers, content moderators and trusted flaggers reporting for each Member State, accompanied by statistical analysis on the use made of automated means and the human oversight of such means; |
Amendment 103
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed, including decisions reversed based on redress possibilities. |
Amendment 104
Proposal for a regulation
Article 13 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Online marketplaces shall also publish, at least once a year, publicly available statistics on the proportion of content, goods or services offered by traders versus consumers and the location thereof. |
Amendment 105
Proposal for a regulation
Article 13 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. |
Amendment 106
Proposal for a regulation
Article 13 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 13a |
|
Traceability of business customers |
|
1. A provider of intermediary services shall ensure that business customers can only use its services to promote messages on or to offer products, content or services to consumers located in the Union if, prior to the use of its services, the provider of intermediary services has obtained the following information: |
|
(a) the name, address, telephone number and electronic mail address of the business customer; |
|
(b) a copy of the identification document of the business customer or any other electronic identification as defined in Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council1a; |
|
(c) the bank account details of the business customer, where the business customer is a natural person; |
|
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3, point 13 and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and of the Council1b or any relevant act of Union law; |
|
(e) where the business customer is registered in a corporate or trade register or similar public register, the register in which the business customer is registered and its registration number or equivalent means of identification in that register; |
|
(f) a self-certification by the business customer committing to only offer products or services that comply with the applicable rules of Union law. |
|
2. The provider of intermediary services shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any publicly accessible official online database or online interface made available by a Member State or the Union or through requests to the business customer to provide supporting documents from reliable and independent sources. |
|
3. Where the provider of intermediary services obtains indications, including through a notification by law enforcement agencies or other individuals with a legitimate interest, that any item of information referred to in paragraph 1 obtained from the business customer concerned is inaccurate, misleading, or incomplete, or otherwise invalid, that provider of intermediary services shall request the business customer to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. Where the business customer fails to correct or complete that information, the provider of intermediary services shall suspend the provision of its service to the business customer until the request is complied with. |
|
4. The provider of intermediary services shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for a period of two years following the termination of its contractual relationship with the business customer concerned. It shall subsequently delete the information. |
|
5. Providers of intermediary services shall apply the identification and verification measures set out in paragraphs 1 and 2 not only in relation to new business customers but they shall also update the information they hold on existing business customers on a risk-sensitive basis, and at least once a year, or when the relevant circumstances of a business customer change. |
|
6. Without prejudice to paragraph 2, the provider of intermediary services shall disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation, as well as pursuant to proceedings initiated under other relevant provisions of Union or national law. |
|
7. The provider of intermediary services shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. |
|
8. The provider of intermediary services shall design and organise its online interface in a way that enables business customers to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. |
|
9. The Digital Services Coordinator of establishment shall determine dissuasive financial penalties for non-compliance with this Article. |
|
____________________ |
|
1a Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73). |
|
1b Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1). |
Amendment 107
Proposal for a regulation
Article 13 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 13b |
|
Display of the identity of traders |
|
Intermediary service providers shall ensure that the identity, such as the trademark or logo or other characteristic traits, of the provider providing content, goods or services using the intermediary services is clearly visible alongside the content, goods or services offered. |
Amendment 108
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
|
|
Text proposed by the Commission |
Amendment |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify and assess the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: |
Amendment 109
Proposal for a regulation
Article 14 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; |
(b) sufficiently precise and adequately substantiated information to allow a diligent economic operator to reasonably identify the illegal content; |
Amendment 110
Proposal for a regulation
Article 14 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
3. Notices that are adequately precise and substantiated, and that include the elements referred to in paragraph 2, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. |
Amendment 111
Proposal for a regulation
Article 14 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. |
5. The provider shall also, without undue delay, notify that individual or entity whose content was removed or challenged of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that action, including the opportunity to reply, unless this would obstruct the prevention and prosecution of serious criminal offences. The provider shall ensure that the decision-making process is reviewed and that any final action or measure is taken by qualified staff. |
Amendment 112
Proposal for a regulation
Article 14 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner and in compliance with the obligation under Article 5(1)(b) to act expeditiously to remove or disable access to the illegal content. When a decision has been taken to remove or disable information, the providers of hosting services shall take all necessary measures to prevent the same or equivalent illegal content from reappearing on their service. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. |
Amendment 113
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. When the provider of hosting services decides to remove or disable illegal information provided by the recipient of the service, the provider shall also prevent the reappearance of that information. This obligation may also extend to specific information that is identical to the notified information or to equivalent information which remains essentially unchanged compared to the information previously notified and removed or to which access was disabled. The application of this requirement shall not lead to any general monitoring obligation. |
Amendment 114
Proposal for a regulation
Article 14 – paragraph 6 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
6b. This Article does not apply to the editorial content and services provided by a media service provider identified in accordance with Article 12. |
Amendment 115
Proposal for a regulation
Article 14 – paragraph 6 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
6c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. |
Amendment 116
Proposal for a regulation
Article 14 – paragraph 6 d (new)
|
|
Text proposed by the Commission |
Amendment |
|
6d. The provider of hosting services shall ensure that the processing of notices is undertaken by qualified individuals who are provided with adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as with appropriate working conditions, including, where relevant, professional support, qualified psychological assistance and legal advice. |
Amendment 117
Proposal for a regulation
Article 15 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient and notifier, immediately after the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. |
Amendment 118
Proposal for a regulation
Article 15 – paragraph 2 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means; |
(c) where applicable, information on the use made of automated means accompanying the decision, including where the decision was taken in respect of content detected or identified using automated means; |
Amendment 119
Proposal for a regulation
Article 15 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 15a |
|
Trusted flaggers |
|
1. Providers of hosting services shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are processed and decided upon with priority and without delay. |
|
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions: |
|
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; |
|
(b) it represents collective interests or it has a significant legitimate interest along with demonstrated expertise and proven experience in flagging illegal content with a high rate of accuracy, while being independent from any online hosting services provider or platform; |
|
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner. |
|
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
|
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. |
|
5. Where a provider of hosting services has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
|
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a hosting services provider pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
|
7. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6. |
Amendment 120
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: |
Amendment 121
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions to remove, restrict, demote, or disable access to or impose other sanctions against the information; |
Amendment 122
Proposal for a regulation
Article 17 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) decisions to suspend or terminate the recipients’ account. |
(c) decisions to suspend or terminate the recipients’ account; |
Amendment 123
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) decisions not to act upon the receipt of a notice. |
Amendment 124
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly and non-discriminatory and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities. |
Amendment 125
Proposal for a regulation
Article 17 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. |
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means but have adequate human oversight and are reviewed by qualified staff who shall receive adequate initial and ongoing training on the applicable legislation, including, where relevant, professional support, qualified psychological assistance and legal advice. |
Amendment 126
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Recipients of the service, as well as individuals or entities that have submitted a notice, addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. |
Amendment 127
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) the dispute settlement is easily accessible through electronic communication technology; |
(c) the dispute settlement is made easily accessible, including for persons with disabilities, through electronic communication technology; |
Amendment 128
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) the anonymity of the individuals involved in the settlement procedure can be guaranteed; |
Amendment 129
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union; |
(d) it ensures the settling of a dispute in a swift, efficient and cost-effective manner and in at least one official language of the Union or, at the request of the recipient, at least in English; |
Amendment 130
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure. |
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible; |
Amendment 131
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) it ensures that a preliminary decision is taken within a period of seven days following receipt of the complaint and that the outcome of the dispute settlement is made available within a period of 90 calendar days from the date on which the body has received the complete complaint file. |
Amendment 132
Proposal for a regulation
Article 19 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) it represents collective interests and is independent from any online platform; |
(b) it represents collective interests, ensures independent public interest representation and is independent from any online platform, political party or commercial interest; |
Amendment 133
Proposal for a regulation
Article 19 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. |
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. Digital Services Coordinators shall engage in a regular dialogue with platforms and rightholders to maintain the accuracy and efficacy of the trusted flagger system. |
Amendment 134
Proposal for a regulation
Article 19 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. |
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of wrongful, insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned and shall inform the Board and other Digital Services Coordinators, providing the necessary explanations and supporting documents. |
Amendment 135
Proposal for a regulation
Article 19 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. |
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. Before revoking that status, the Digital Services Coordinator shall also inform the Board and the other Digital Services Coordinators of its decision to revoke. |
Amendment 136
Proposal for a regulation
Article 20 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. |
1. Online platforms shall suspend, or otherwise restrict, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide or disseminate illegal content. In cases of repeated suspension, providers of hosting services shall terminate the provision of their services and, where technically possible, introduce mechanisms that prevent the re-registration of recipients of the service that frequently provide or disseminate illegal content. |
Amendment 137
Proposal for a regulation
Article 20 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. |
2. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaints-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that frequently or repeatedly submit notices or complaints or initiate dispute settlements that are unfounded. |
Amendment 138
Proposal for a regulation
Article 20 – paragraph 3 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; |
(a) the absolute numbers of items of illegal content or unfounded notices or complaints, submitted in the past year; |
Amendment 139
Proposal for a regulation
Article 20 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. |
4. Online platforms shall set out, in a clear and detailed manner, with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension or other restrictions of services imposed on recipients of the service. |
Amendment 140
Proposal for a regulation
Article 23 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; |
(b) the number of suspensions or other restrictions of services imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of unfounded notices and the submission of unfounded complaints, presented separately by the means through which the content was identified, namely out-of-court dispute settlement, the notice and action mechanism, or orders from a judicial or administrative authority; |
Amendment 141
Proposal for a regulation
Article 23 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied. |
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human oversight and decisions made. |
Amendment 142
Proposal for a regulation
Article 24 – paragraph -1 (new)
|
|
Text proposed by the Commission |
Amendment |
|
Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
Amendment 143
Proposal for a regulation
Article 24 – paragraph -1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
Amendment 144
Proposal for a regulation
Article 24 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed and the advertising agency or publishers managing the advertisement, including the criteria used by the ad-tech platform services, such as pricing mechanisms, advertising auctions and their weighting, the fees charged by ad exchanges, and the identity of the natural or legal person(s) responsible for the possible automated system; |
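To make the disclosure obligation in Amendment 144 concrete, the sketch below illustrates, purely as a hypothetical example and not as part of the legislative text, the kind of pricing mechanism the amendment would require ad-tech services to disclose: a simplified weighted second-price auction with an ad-exchange fee. All names, weights and fee values here are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str        # the natural or legal person on whose behalf the ad is shown
    amount: float          # bid per impression, e.g. in EUR
    quality_weight: float  # platform-assigned weighting (e.g. a relevance score)

def run_auction(bids: list[Bid], exchange_fee: float = 0.10) -> tuple[Bid, float]:
    """Simplified generalised second-price auction (assumes at least two bids).

    The winner is the highest *weighted* bid; the price paid is the runner-up's
    weighted bid divided by the winner's weight, plus the ad-exchange fee.
    The bid weighting and the fee are exactly the parameters whose disclosure
    the amendment would require.
    """
    ranked = sorted(bids, key=lambda b: b.amount * b.quality_weight, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    price = (runner_up.amount * runner_up.quality_weight) / winner.quality_weight
    return winner, price * (1 + exchange_fee)
```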
Amendment 145
Proposal for a regulation
Article 24 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. |
(c) meaningful information, presented in an easily comprehensible manner, about the parameters used to determine the recipient to whom the advertisement is displayed, including how the information is ranked and prioritised by algorithmic suggestion on the recipients’ online interfaces; |
Amendment 146
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) Providers of intermediary services shall, by default, not make the recipients of their services subject to targeted, micro-targeted and behavioural advertising unless the recipient of the service has explicitly given consent via an opt-in. |
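The default-off, explicit opt-in logic of point (ca) can be expressed in a few lines. The following is a minimal, hypothetical sketch (field and function names are not drawn from the Regulation): targeted advertising is served only when the recipient has actively consented; otherwise the service falls back to non-targeted, for example contextual, advertising.

```python
from dataclasses import dataclass

@dataclass
class Recipient:
    user_id: str
    # Default is False: no targeted, micro-targeted or behavioural
    # advertising without an explicit opt-in by the recipient.
    targeted_ads_consent: bool = False

def select_ad(recipient: Recipient, contextual_ads: list[str], targeted_ads: list[str]) -> str:
    """Serve a targeted ad only when the recipient has explicitly opted in."""
    if recipient.targeted_ads_consent and targeted_ads:
        return targeted_ads[0]
    return contextual_ads[0]  # non-targeted fallback
```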
Amendment 147
Proposal for a regulation
Article 24 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
With regard to the requirements set out in points (b) and (c), online advertising intermediaries must ensure the transmission of the information held by them to recipients of the service. |
Amendment 148
Proposal for a regulation
Article 24 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24a |
|
Additional due diligence requirements for online marketplaces |
|
Online marketplaces shall take reasonable precautions, such as regular spot checks on the products and services available on their platforms, in order to identify products or services that do not comply with Union or national law and shall take necessary measures to partially or fully suspend infringing traders. |
Amendment 149
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union and shall submit a report of that risk assessment to the national competent authority of the Member State in which their legal representative is established. This risk assessment shall be specific to their services and shall include the following systemic risks: |
Amendment 150
Proposal for a regulation
Article 26 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the dissemination of illegal content through their services; |
(a) the dissemination and amplification of illegal content through their services; |
Amendment 151
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of the fundamental rights, including to the respect for human dignity, private and family life, freedom of expression and information including the freedom and pluralism of the media, freedom of the arts and sciences, and the right to education, the prohibition of discrimination and the rights of the child, as enshrined in the Charter respectively; |
Amendment 152
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) any negative effects to the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
Amendment 153
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. Very large online platforms shall ensure that their terms and conditions, as well as other policies, procedures, measures and tools used for the purpose of content moderation, are applied and enforced in such a way as to prohibit any removal or suspension of, disabling of access to, or other interference with, content and services from the account of a recognised media service provider as defined in Article 1(1)(a) of Directive (EU) 2018/1808. |
Amendment 154
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific risks identified pursuant to Article 26. Such measures may include, where applicable: |
Amendment 155
Proposal for a regulation
Article 27 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide; |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide, limiting providers of disinformation and the monetisation of fake news, and limiting the reach of advertisements identified as posing a risk pursuant to Article 26; |
Amendment 156
Proposal for a regulation
Article 27 – paragraph 1 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) initiating or adjusting cooperation with media service providers; |
Amendment 157
Proposal for a regulation
Article 27 – paragraph 1 – subparagraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
The decision as to the choice of measures shall remain with the platform. |
Amendment 158
Proposal for a regulation
Article 27 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
(a) identification and assessment of the most prominent and recurrent risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; |
Amendment 159
Proposal for a regulation
Article 27 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
3. The Commission, in cooperation with the Digital Services Coordinators, shall issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. |
Amendment 160
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: |
1. Very large online platforms shall be subject, at their own expense and at least twice a year, to audits to assess compliance with the following: |
Amendment 161
Proposal for a regulation
Article 29 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. |
1. The parameters used in recommender systems shall be set up in such a way that they reduce any potential bias and that they are non-discriminatory and adaptable. Online platforms that use recommender systems shall set out separately, in their terms and conditions and on a designated, easily accessible webpage, in a manner which is clear, accessible and easily comprehensible for all, the information concerning the role and functioning of recommender systems and the main parameters used in their recommender systems, and they shall offer the recipients of the service control, through the available options, to modify or influence those parameters that they may have made available, including options which are not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. Online platforms shall ensure that the option activated by default for the recipient of the service is not based on profiling. |
|
In addition to the obligations set out in the first subparagraph of this paragraph, very large online platforms may offer to the recipients of the service the choice of using recommender systems from third party providers, where available. In these cases, third parties shall be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c). |
Amendment 162
Proposal for a regulation
Article 29 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily and clearly accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. When a user creates an account, the settings for recommender systems shall be the default ones, not based on profiling, and shall give the user, in an easily comprehensible manner, a choice to set the main parameters to be used in recommender systems. |
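The core mechanics of Amendments 161 and 162, a non-profiling default and user-selectable options, can be sketched compactly. The example below is illustrative only; the option names and parameter labels are assumptions, not text from the Regulation.

```python
from dataclasses import dataclass

@dataclass
class RecommenderOption:
    name: str
    main_parameters: list[str]  # to be disclosed in the terms and conditions
    uses_profiling: bool        # profiling within the meaning of Article 4(4) GDPR

OPTIONS = [
    RecommenderOption("chronological", ["recency"], uses_profiling=False),
    RecommenderOption("personalised", ["watch history", "engagement"], uses_profiling=True),
]

def default_option() -> RecommenderOption:
    """The option activated by default must not be based on profiling."""
    return next(o for o in OPTIONS if not o.uses_profiling)

def set_preferred_option(user_settings: dict, name: str) -> None:
    """Recipients may select and modify their preferred option at any time."""
    if name not in {o.name for o in OPTIONS}:
        raise ValueError(f"unknown recommender option: {name}")
    user_settings["recommender_option"] = name
```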
Amendment 163
Proposal for a regulation
Article 29 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Very large online platforms shall ensure that their online interface is designed in such a way that there is no danger of it misleading or manipulating the recipients of the service. |
Amendment 164
Proposal for a regulation
Article 30 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. |
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a searchable, easy to access and functional repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. |
Amendment 165
Proposal for a regulation
Article 30 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the natural or legal person on whose behalf the advertisement is displayed; |
(b) the natural or legal person on whose behalf the advertisement is displayed, and the natural or legal person who finances the advertisement; |
Amendment 166
Proposal for a regulation
Article 30 – paragraph 2 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; |
(d) whether the advertisement was intended to be displayed specifically to or concealed specifically from one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; |
Amendment 167
Proposal for a regulation
Article 30 – paragraph 2 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically. |
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically; |
Amendment 168
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ea) whether the advertisement has been labelled, moderated, or disabled. |
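Taken together, Amendments 164 to 168 describe a public, searchable advertisement repository exposed through an application programming interface. As a hypothetical illustration of what one repository entry might contain, and emphatically not as part of the legislative text, the sketch below maps the fields of Article 30(2) as amended onto a simple record; all field names are assumptions.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AdRepositoryEntry:
    ad_id: str
    on_behalf_of: str                  # point (b): person on whose behalf displayed
    financed_by: str                   # point (b): person financing the advertisement
    last_displayed: date               # entries kept until one year after this date
    targeted_groups: list[str] = field(default_factory=list)        # point (d)
    concealed_from_groups: list[str] = field(default_factory=list)  # point (d)
    main_targeting_parameters: list[str] = field(default_factory=list)
    total_recipients_reached: int = 0           # point (e): aggregates only
    reached_per_group: dict[str, int] = field(default_factory=dict)
    moderation_status: str = "none"    # point (ea): "labelled", "moderated" or "disabled"
    # Note: no personal data of recipients may appear anywhere in the entry.

def still_retained(entry: AdRepositoryEntry, today: date) -> bool:
    """Entries remain publicly available until one year after the last display."""
    return (today - entry.last_displayed).days <= 365
```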
Amendment 169
Proposal for a regulation
Article 31 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and without undue delay, full access to all available and relevant data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. |
Amendment 170
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. With regard to moderation and recommender systems, very large online platforms shall, upon request, provide the Digital Services Coordinator or the Commission, or both, access to algorithms by providing the relevant source code and associated data that allow the detection of possible biases. When disclosing these data, very large online platforms shall have a duty of explainability and ensure close cooperation with the Digital Services Coordinator or the Commission to make moderation and recommender systems fully understandable. When a bias is detected, very large online platforms shall correct it expeditiously following requirements from the Digital Services Coordinator or the Commission. Very large online platforms shall be able to demonstrate their compliance with the requirements set out in this Article at every step of the process. |
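Amendment 170 requires that the disclosed source code and data "allow the detection of possible biases" in moderation and recommender systems. One simple check a regulator could run on such data, offered here as a minimal, hypothetical sketch rather than a prescribed method, is a demographic parity gap over moderation decisions.

```python
def parity_gap(outcomes: dict[str, list[int]]) -> float:
    """outcomes maps a group label to a list of 0/1 moderation decisions
    (e.g. 1 = content removed). Returns the largest difference in removal
    rates between any two groups; a large gap flags a possible bias."""
    rates = {g: sum(v) / len(v) for g, v in outcomes.items() if v}
    return max(rates.values()) - min(rates.values())

# Example: removal rates of 0.30 vs 0.10 give a gap of 0.20.
gap = parity_gap({"group_a": [1, 1, 1, 0, 0, 0, 0, 0, 0, 0],
                  "group_b": [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]})
```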
Amendment 171
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
deleted |
Amendment 172
Proposal for a regulation
Article 31 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraph 1 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with the Digital Services Coordinator or the Commission can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
Amendment 173
Proposal for a regulation
Article 31 – paragraph 6
|
|
Text proposed by the Commission |
Amendment |
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for one of the following two reasons: |
deleted |
(a) it does not have access to the data; |
|
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. |
|
Amendment 174
Proposal for a regulation
Article 31 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. |
deleted |
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request. |
|
Amendment 175
Proposal for a regulation
Article 31 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 31a |
|
1. Upon request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 2 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
|
2. In order to be vetted, researchers shall be affiliated with academic institutions, media, civil society or international organisations representing the public interest, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
|
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. |
|
4. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. |
|
5. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested by vetted researchers for one of the following two reasons: |
|
(a) it does not have access to the data; |
|
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets, in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. |
|
6. Requests for amendment pursuant to point (b) of paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. |
|
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request. |
Amendment 176
Proposal for a regulation
Article 32 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall appoint one or more compliance officers responsible for monitoring their compliance with this Regulation. |
1. Very large online platforms shall appoint one or more compliance officers for every Member State, in its official language, responsible for monitoring their compliance with this Regulation. |
Amendment 177
Proposal for a regulation
Article 33 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. |
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. The reports shall include information disaggregated by Member State and provide information on the human and technical resources allocated for the purpose of content moderation for each official language of the Union. |
Amendment 178
Proposal for a regulation
Article 33 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 33a |
|
Accessibility requirements |
|
1. Very large online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882. |
|
2. Very large online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Intermediary service providers shall keep that information for as long as the service is in operation. |
|
3. Very large online platforms shall ensure that the information and measures provided pursuant to Articles 10, 12(1), 13(1), 14(1) and (5), 15(3) and (4), 17(1), (2) and (4), 23(2), 24, 29(1) and (2), 30(1), and 33(1) are made available in a manner that makes them easy to find and accessible to persons with disabilities. |
|
4. Very large online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. |
|
5. In the case of non-conformity, providers of intermediary services shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements and shall immediately inform the Digital Services Coordinator of establishment or other competent national authority of the Member States in which the service is established. |
|
6. Very large online platforms shall cooperate with the competent authority or Digital Services Coordinator, upon a reasoned request, and provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. |
|
7. Very large online platforms shall be presumed to be in conformity with the accessibility requirements of this Regulation when they are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal of the European Union. |
|
8. Very large online platforms which are in conformity with the technical specifications or parts thereof adopted for the Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements. |
|
9. Very large online platforms shall, at least once a year, report to the Digital Services Coordinators or other competent authorities on their obligation to ensure accessibility for persons with disabilities as required by this Regulation. |
|
10. In addition to the information included in Article 44(2), activity reports by the Digital Services Coordinators shall include measures taken pursuant to Article 10. |
Amendment 179
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall request and coordinate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
Amendment 180
Proposal for a regulation
Article 35 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
2. Where risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, the European Parliament and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
Amendment 181
Proposal for a regulation
Article 35 – paragraph 3
|
|
Text proposed by the Commission |
Amendment |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. |
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain verifiable key performance indicators to measure the achievement of those objectives, have independent monitoring and audit systems in place and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly and in good faith to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. |
Amendment 182
Proposal for a regulation
Article 35 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. |
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions and request that the organisations involved amend their codes of conduct accordingly. |
Amendment 183
Proposal for a regulation
Article 37 – paragraph 2 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) displaying prominent information on the crisis situation provided by Member States’ authorities or at Union level; |
(a) displaying prominent information on the crisis situation, provided by Member States’ authorities or at Union level, which is also accessible to persons with disabilities; |
Amendment 184
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(fa) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing accessible descriptions of those protocols; |
Amendment 185
Proposal for a regulation
Article 39 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. |
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. Member States shall ensure that their Digital Services Coordinators are legally distinct from the government and functionally independent of their respective governments and of any other public or private body. |
Amendment 186
Proposal for a regulation
Article 39 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party. |
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any public authority or any private party. |
Amendment 187
Proposal for a regulation
Article 43 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Recipients of the service, as well as other parties having a legitimate interest and meeting relevant criteria of expertise and independence from any intermediary service provider shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. |
Amendment 188
Proposal for a regulation
Article 45 – paragraph 7 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
7a. Member States shall introduce expedited procedures under which an order granted by a court or competent administrative authority in another Member State against a provider of intermediary services whose services are used to disseminate illegal content can be used as a basis for a court or administrative order in the Member State against similar providers of intermediary services whose services are used to disseminate the same illegal content. National Digital Services Coordinators shall make public the decisions by judicial or administrative authorities provided to them by other Digital Services Coordinators under Article 8 of this Regulation. |
Amendment 189
Proposal for a regulation
Article 46 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Where at least three Digital Services Coordinators or the Board identify a common pattern of non-compliance with orders issued under Articles 8 and 9 vis-à-vis the same provider, they may request the Commission to initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 of this Regulation, irrespective of the size of the online platform. Such a request shall contain information listed in Article 45(2)(a) and (c) and all relevant information related to orders adopted under Articles 8 or 9 and to non-compliance with them. |
Amendment 190
Proposal for a regulation
Article 48 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. |
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator, notably representatives of European regulatory networks of independent national regulatory authorities, bodies or both, shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. |
Amendment 191
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. |
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay. |
Amendment 192
Proposal for a regulation
Article 50 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35. |
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan shall include, where appropriate, participation in a code of conduct as provided for in Article 35. |
Amendment 193
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 2
|
|
Text proposed by the Commission |
Amendment |
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2). |
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it shall request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within two months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2). |
PROCEDURE – COMMITTEE ASKED FOR OPINION
Title |
Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
|||
References |
COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
|||
Committee responsible Date announced in plenary |
IMCO 8.2.2021 |
|
|
|
Opinion by Date announced in plenary |
CULT 8.2.2021 |
|||
Rapporteur for the opinion Date appointed |
Sabine Verheyen 20.1.2021 |
|||
Discussed in committee |
13.7.2021 |
|
|
|
Date adopted |
27.9.2021 |
|
|
|
Result of final vote |
+: 23, –: 2, 0: 4 |
||
Members present for the final vote |
Asim Ademov, Ilana Cicurel, Gilbert Collard, Gianantonio Da Re, Laurence Farreng, Tomasz Frankowski, Romeo Franz, Chiara Gemma, Alexis Georgoulis, Irena Joveva, Petra Kammerevert, Predrag Fred Matić, Dace Melbārde, Victor Negrescu, Niklas Nienaß, Peter Pollák, Marcos Ros Sempere, Domènec Ruiz Devesa, Monica Semedo, Andrey Slabakov, Massimiliano Smeriglio, Michaela Šojdrová, Sabine Verheyen, Maria Walsh, Theodoros Zagorakis, Milan Zver |
|||
Substitutes present for the final vote |
Marcel Kolaja, Elżbieta Kruk |
|||
Substitutes under Rule 209(7) present for the final vote |
Evelyne Gebhardt |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
23 |
+ |
ECR |
Elżbieta Kruk, Dace Melbārde, Andrey Slabakov |
NI |
Chiara Gemma |
PPE |
Asim Ademov, Tomasz Frankowski, Peter Pollák, Michaela Šojdrová, Sabine Verheyen, Maria Walsh, Theodoros Zagorakis, Milan Zver |
Renew |
Ilana Cicurel, Laurence Farreng, Monica Semedo |
S&D |
Evelyne Gebhardt, Petra Kammerevert, Predrag Fred Matić, Victor Negrescu, Marcos Ros Sempere, Domènec Ruiz Devesa, Massimiliano Smeriglio |
The Left |
Alexis Georgoulis |
2 |
- |
Verts/ALE |
Romeo Franz, Marcel Kolaja |
4 |
0 |
ID |
Gilbert Collard, Gianantonio Da Re |
Renew |
Irena Joveva |
Verts/ALE |
Niklas Nienaß |
Key to symbols:
+ : in favour
- : against
0 : abstention
OPINION OF THE COMMITTEE ON WOMEN'S RIGHTS AND GENDER EQUALITY (13.10.2021)
for the Committee on the Internal Market and Consumer Protection
on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC
(COM(2020)0825 – C9‑0418/2020 – 2020/0361(COD))
Rapporteur for opinion: Jadwiga Wiśniewska
SHORT JUSTIFICATION
The Internet is becoming an increasingly important feature of our daily lives. It helps us in many ways but also opens new ways of abusing its users, both by other users and by the online platforms which play a major role in bringing people together. The Covid-19 pandemic has only deepened both trends: the positive one of using the online environment to facilitate our work and daily lives, and the negative one of increasing online violence and of using the Internet to commit serious crimes, such as trafficking in human beings or child abuse.
Women are particularly affected by these negative trends, with very negative consequences at the personal (mental health), social (lack of full digital inclusion) and economic (untapped potential) levels. Women are often discouraged from making full use of digital solutions, which is particularly true for women in politics and other highly visible professions. Moreover, online tools are increasingly used to perpetrate serious crimes such as trafficking in human beings, where most of the victims are women or children.
The Commission’s proposal on the Single Market for Digital Services (the so-called Digital Services Act, DSA) already contains a number of useful solutions. It rightly distinguishes between very large online platforms, which have a huge impact on millions of people, and other service providers, placing more obligations on the former. The rapporteur for the opinion believes that the proposal falls short of taking into account some particular vulnerabilities of women and therefore proposes to put more emphasis on their situation, especially in the recitals of the proposal. For very large online platforms, the rapporteur proposes that they not only be obliged to disclose their algorithms to users, but also that they regularly review those algorithms with a view to minimising negative effects on users. Such negative effects may include deepening the problems users are already confronted with, such as depression or addictions. Very large online platforms should be obliged to avoid exposing users to content which may deepen those problems. Supervision of their actions should also be strengthened by Member States, taking into account their socio-cultural context and their respective laws.
Yet, the rapporteur also notes some concerns as regards the freedom of expression. She recognises that regulating the online environment always needs to be balanced against the important value of letting people express their views. While this freedom is not absolute and cannot be abused, a careful consideration of both values is necessary to arrive at good solutions. Therefore, she proposes only a few changes to the Commission proposal in order to avoid negative consequences for the freedom of expression.
AMENDMENTS
The Committee on Women's Rights and Gender Equality calls on the Committee on the Internal Market and Consumer Protection, as the committee responsible, to take into account the following amendments:
Amendment 1
Proposal for a regulation
Recital 2
|
|
Text proposed by the Commission |
Amendment |
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
(2) Until now, policy has relied on voluntary cooperation with a view to addressing these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. |
Amendment 2
Proposal for a regulation
Recital 3
|
|
Text proposed by the Commission |
Amendment |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. |
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to gender equality and non-discrimination. In order to exercise those rights, the online world needs to be a safe space, especially for women and girls, where everybody can move freely. Therefore, measures to protect from, and prevent, phenomena such as online violence, cyberstalking, harassment, hate speech and exploitation of women and girls are essential. |
Amendment 3
Proposal for a regulation
Recital 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(3a) Gender equality is one of the founding values of the Union (Article 2 and Article 3(3) of the Treaty on European Union (TEU)). Those values are also enshrined in Article 21 of the Charter of Fundamental Rights (the ‘Charter’). Article 8 of the Treaty on the Functioning of the European Union gives the Union the task of eliminating inequalities and promoting equality between women and men in all of its activities and policies. In order to protect women's rights and tackle gender-based online violence, the right to gender equality should be respected and the principle of gender mainstreaming should be applied in all policies of the Union, including the regulation of the functioning of the internal market and its digital services. |
Amendment 4
Proposal for a regulation
Recital 3 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(3b) Children, especially girls, have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The United Nations Committee on the Rights of the Child General comment No. 25 on children’s rights in relation to the digital environment formally sets out how those rights apply in the digital world. |
Amendment 5
Proposal for a regulation
Recital 5
|
|
Text proposed by the Commission |
Amendment |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. |
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26 , that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. Given that online platforms are part of our everyday life and have become indispensable, even more so since the pandemic, the spread of illegal and harmful content, such as child sexual abuse material, online sexual harassment, unlawful non-consensual sharing of private images and videos and cyber violence, has risen dramatically as well. Ensuring a safe space online requires targeted action against all phenomena harmfully affecting our social life, including through the awaited proposal on how to deal with content that is harmful but not illegal online. |
_________________ |
_________________ |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1). |
Amendment 6
Proposal for a regulation
Recital 9
|
|
Text proposed by the Commission |
Amendment |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 Regulation (EU) 2021/784 of the European Parliament and of the Council29 and Regulation (EU) 2021/1232 of the European Parliament and of the Council29a. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. |
_________________ |
_________________ |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . |
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation |
29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79). |
|
29a Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41). |
Amendment 7
Proposal for a regulation
Recital 12
|
|
Text proposed by the Commission |
Amendment |
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
(12) In order to achieve the objective of ensuring a safe, predictable, accessible (including for persons with disabilities) and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in order to underpin the general idea that what is illegal offline should also be illegal online. The concept should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings and online sexual violence against women and girls, forced marriages, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, doxing, mobbing, sextortion, the grooming of adolescents, online sexual harassment and other forms of gender-based violence, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. |
Amendment 8
Proposal for a regulation
Recital 12 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(12a) As there is no commonly accepted definition of cyber violence and online hate speech against women, there is an urgent need to adopt a common definition of the various forms of violence and hate speech targeting women and sexual minorities online, which would serve as a basis for legislation. |
Amendment 9
Proposal for a regulation
Recital 12 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
(12b) Access to the internet is fast becoming a necessity for economic well-being; it is therefore crucial to ensure that this digital public space is a safe and empowering place for everyone, including women and girls. Online violence is a phenomenon which needs to be addressed for the safety of all users, though special attention should be paid to tackling violence against women and girls and other forms of gender-based violence. It not only causes psychological harm and physical suffering but also deters victims from digital participation in political, social, cultural and economic life, and it affects women and girls disproportionately. Evidence shows that women are on average more exposed to online violence than men, especially women engaged in political or other forms of highly visible activities. Research by the World Health Organization shows that one in three women will have experienced a form of violence in her lifetime and, although internet connectivity is a relatively new and growing phenomenon, it is estimated that one in ten women has already experienced a form of cyber violence since the age of 15. A survey by the European Union Agency for Fundamental Rights in 2014, the most comprehensive at Union level in the field, showed that one in ten women aged 15 or over in the Union has faced online harassment. |
Amendment 10
Proposal for a regulation
Recital 12 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
(12c) The COVID-19 pandemic has had a significant impact on almost all spheres of life, including on organised crime. For example, traffickers have increasingly moved online for every phase of trafficking. They use the digital space to recruit and exploit victims, organise their transport and accommodation, advertise victims online and reach out to potential clients, control victims, communicate between perpetrators and hide criminal proceeds. Other forms of organised crime facilitated by digital tools include different types of exploitation, particularly sexual but also labour exploitation, forced begging, forced and sham marriages, forced criminality, the removal of organs and the illegal adoption of children. |
Amendment 11
Proposal for a regulation
Recital 25
|
|
Text proposed by the Commission |
Amendment |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent and non-discriminatory manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. |
Amendment 12
Proposal for a regulation
Recital 26 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(26a) Although providers of intermediary services already carry out risk assessments, there is still potential to improve the security and safety of all users, especially children, women and other vulnerable groups. Therefore, providers of intermediary services, more precisely online platforms and very large online platforms, should regularly evaluate their risk assessments and, where necessary, improve them. Given the importance of providers of intermediary services and their potential to impact social life, common rules determining how users should behave online should be applied. The implementation of a code of conduct should be obligatory for every provider of intermediary services covered by this Regulation. |
Amendment 13
Proposal for a regulation
Recital 30
|
|
Text proposed by the Commission |
Amendment |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679, Regulation (EU) 2021/1232 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 addressing the dissemination of terrorist content online, Regulation (EU) 2021/1232 or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. |
Amendment 14
Proposal for a regulation
Recital 34
|
|
Text proposed by the Commission |
Amendment |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. |
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, to ensure a safe and transparent online environment and to ensure the right to non-discrimination, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, the safety and trust of the recipients of the service, including minors, women, LGBTIQ+ people and vulnerable users such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. The World Health Organization defines ‘health’ as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity. This definition supports the fact that the development of new technologies might bring new health risks to users, in particular children and women, such as psychological risks, developmental risks, mental health risks, depression, loss of sleep or altered brain function. |
Amendment 15
Proposal for a regulation
Recital 39
|
|
Text proposed by the Commission |
Amendment |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 |
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Data should be reported in as disaggregated a form as possible. For example, anonymised individual characteristics such as gender, age group and social background of the notifying parties should be reported, whenever available. Providers offering their services in more than one Member State should also provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40. In line with the annual reports broken down by content moderation action and by Member State, all forms of online violence against women and girls, hate speech and other illegal content should also appear in crime statistics. All forms of violence against women and girls should be recorded as a separate category in those crime statistics, and law enforcement entities should list them separately. |
__________________ |
__________________ |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36). |
Amendment 16
Proposal for a regulation
Recital 40
|
|
Text proposed by the Commission |
Amendment |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Online platforms may also allow users or trusted flaggers to notify content, including their own, to which others are responding on a large scale with illegal content, such as illegal hate speech. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. |
Amendment 17
Proposal for a regulation
Recital 41
|
|
Text proposed by the Commission |
Amendment |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to gender equality, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. |
Amendment 18
Proposal for a regulation
Recital 46
|
|
Text proposed by the Commission |
Amendment |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material, organisations committed to notifying illegal racist and xenophobic expressions online and women’s rights organisations such as the European Women’s Lobby. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 |
_________________ |
_________________ |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53 |
Amendment 19
Proposal for a regulation
Recital 52
|
|
Text proposed by the Commission |
Amendment |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have an impact both on the equal treatment and opportunities of citizens, in particular with regard to gender equality, and on the perpetuation of harmful stereotypes and norms. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein. |
Amendment 20
Proposal for a regulation
Recital 57
|
|
Text proposed by the Commission |
Amendment |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual exploitation and abuse material, unlawful non-consensual sharing of private images and videos, online stalking, doxing, cyberbullying, rape threats or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through advertising, recommender systems or accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality, the rights of the child and the right to personal data protection. As online platforms play a major role in everyday life, the social dimension is also affected by phenomena such as online harassment and cyber violence. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, including when algorithms are biased, widening gender gaps and amplifying discriminatory speech and content, or the misuse of their service through the submission of abusive notices or other methods for silencing speech, causing harm such as long-term mental health damage, psychological damage and societal damage, or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. |
Amendment 21
Proposal for a regulation
Recital 58
|
|
Text proposed by the Commission |
Amendment |
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
(58) Very large online platforms should deploy the necessary means to diligently cease, prevent and mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions to cover aspects such as online violence and, in particular, online gender-based violence. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. They may also consider providing training to their staff and specifically to content moderators so they can stay up to date on covert language used as a form of illegal hate speech and violence against women and minorities. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers and civil society organisations, such as women’s rights organisations, organise training sessions and exchanges with those organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and equality and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. |
Amendment 22
Proposal for a regulation
Recital 58 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(58a) Transparency and the effectiveness of processes are key to making online platforms safer to use and to tackling online violence and illegal content. Online platforms’ decisions on whether and how they act to remove illegal, abusive and harmful content vary hugely, and some reports can remain unanswered. Easily accessible information on how and why content is removed must be available to all users. These processes need to be fully transparent. Very large online platforms should actively report and publish meaningful data on how they handle gender and other identity-based violence and they should share this information in an easy and accessible way on their platforms on an annual basis. This should include the number of reports they receive per year, as well as the number of reports that remain unanswered, disaggregated by the category of the illegal, harmful and abusive content being reported. Very large online platforms should ensure that experts and academics have access to the relevant data, for example to enable them to compare and evaluate how measures are working in order to gain a better understanding of the extent of the problem. They should also align their measures with international human rights standards and regularly evaluate and update the implementation of their own ethical standards. |
Amendment 23
Proposal for a regulation
Recital 58 b(new)
|
|
Text proposed by the Commission |
Amendment |
|
(58b) The content of very large online platforms needs to be fully and easily accessible to all of their users. This can be achieved by implementing user-friendly measures in the services that very large online platforms offer. Very large online platforms should present their terms of service in machine-readable format and also make all previous versions of their terms of service easily accessible to the public, including to persons with disabilities. Options to report potentially illegal, abusive and harmful content should be easy to find and to use in the native language of the user. Information on support for persons affected and on national contact points should be easily accessible. Very large online platforms should offer, and continue to develop, easily accessible services for all users in such cases. They should also make moderation as easy as possible, with the help of tools, training and other support for people administering and moderating online groups that use their platforms and services. They should also improve and ensure the accessibility of elements and functions of their services for persons with disabilities. |
Amendment 24
Proposal for a regulation
Recital 59
|
|
Text proposed by the Commission |
Amendment |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. |
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, such as consumers’ and women’s rights organisations, independent experts and civil society organisations. |
Amendment 25
Proposal for a regulation
Recital 62
|
|
Text proposed by the Commission |
Amendment |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. |
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Those algorithms may lead to negative consequences, such as an increase in cases of online violence, and consequently physical violence, or the promotion of content deepening personal problems, such as depression or addiction. Consequently, very large online platforms should regularly review their algorithms to minimise such negative consequences, should avoid gender-biased algorithms and any discriminatory impact on women and girls and should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible and accessible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have alternative options for the main parameters, including a visible, user-friendly and readily available option to turn off algorithmic selection by the recommender system entirely, and options that are not based on profiling of the recipient. They should allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended. |
Amendment 26
Proposal for a regulation
Recital 63
|
|
Text proposed by the Commission |
Amendment |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. |
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. Disinformation, especially political disinformation, has become a major problem, and very large online platforms have increasingly become the channels through which such content is shared, especially via advertising. In the case of repeated violations, very large online platforms should, in consultation with independent experts, remove extremist actors. Very large online platforms should implement comprehensive and verifiable standards and measures to limit the reach of extremist actors and purposeful disinformation. |
Amendment 27
Proposal for a regulation
Recital 64
|
|
Text proposed by the Commission |
Amendment |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. This data should be provided as disaggregated as possible in order to allow for meaningful conclusions to be drawn from it. For example, it is important that very large online platforms provide gender disaggregated data as much as possible in order for vetted researchers to have the possibility to explore whether and in what way certain online risks are experienced differently between men and women. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. |
Amendment 28
Proposal for a regulation
Recital 74
|
|
Text proposed by the Commission |
Amendment |
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. |
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. Moreover, it is important to ensure that the Digital Services Coordinator, as well as other competent authorities, have the necessary knowledge to guarantee the rights and obligations laid down in this Regulation. Therefore, they should promote education and training on fundamental rights and discrimination for their staff, including training in partnership with law enforcement authorities, crisis management authorities or civil society organisations that support victims of illegal online and offline activities such as harassment, gender-based violence and illegal hate speech. |
Amendment 29
Proposal for a regulation
Recital 82
|
|
Text proposed by the Commission |
Amendment |
(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services are being used by a third party to infringe an intellectual property right, are not reasonably available. |
(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, content associated with the sexual exploitation and abuse of women and girls, and revenge porn, or to disable access to services are being used by a third party to infringe an intellectual property right, are not reasonably available. |
Amendment 30
Proposal for a regulation
Recital 88
|
|
Text proposed by the Commission |
Amendment |
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. |
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. This advisory group should strive to achieve a gender-balanced representation in its composition. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. |
Amendment 31
Proposal for a regulation
Recital 91
|
|
Text proposed by the Commission |
Amendment |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks. |
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities such as the European Data Protection Supervisor and the European Union Agency for Fundamental Rights under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, in particular gender equality and non-discrimination, eradication of all forms of violence against women and girls and other forms of gender-based violence, including online violence and harassment, online stalking, online sex trafficking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks. |
Amendment 32
Proposal for a regulation
Article 1 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. |
(b) set out uniform rules for a safe, accessible, including for persons with disabilities, predictable and trusted online environment, where fundamental rights enshrined in the Charter, in particular those relating to equality, are effectively protected. |
Amendment 33
Proposal for a regulation
Article 1 – paragraph 5 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted]; |
(d) Regulation (EU) 2021/784 of the European Parliament and of the Council1a; |
|
__________________ |
|
1a Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79). |
Amendment 34
Proposal for a regulation
Article 1 – paragraph 5 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) Regulation (EU) 2021/1232 of the European Parliament and of the Council1a; |
|
__________________ |
|
1a Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41). |
Amendment 35
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) ‘child’ means any natural person under the age of 18; |
Amendment 36
Proposal for a regulation
Article 2 – paragraph 1 – point g
|
|
Text proposed by the Commission |
Amendment |
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; |
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is manifestly not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; reporting or warning of an illegal act shall not be deemed illegal content; |
Amendment 37
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(qa) ‘gender-based online violence’ means any act of gender-based violence that is committed, assisted or aggravated in part or fully by the use of ICT, such as mobile phones and smartphones, the internet, social media platforms or email, against a woman because she is a woman or that affects women disproportionately, or against LGBTI people because of their gender identity, gender expression or sex characteristics, and results in, or is likely to result in, physical, sexual, psychological or economic harm, including threats to carry out such acts, coercion or arbitrary deprivation of liberty, in public or private life; |
Amendment 38
Proposal for a regulation
Article 8 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 8a |
|
Injunction orders |
|
Member States shall ensure that recipients of a service are entitled under their national law to seek an injunction order as an interim measure for removing manifestly illegal content. |
Amendment 39
Proposal for a regulation
Article 10 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 10a |
|
Point of contact for recipients of a service |
|
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible. |
|
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients. |
Amendment 40
Proposal for a regulation
Article 12 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. |
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application. |
Amendment 41
Proposal for a regulation
Article 12 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
2. Providers of intermediary services shall act in a diligent, non-discriminatory, transparent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. |
Amendment 42
Proposal for a regulation
Article 12 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Providers of intermediary services shall include on their platforms their terms and conditions, setting out behavioural rules for their users. Those rules shall be publicly accessible in an easily understandable format, shall promote gender equality and non-discrimination, shall be age-appropriate, shall be set out in clear and unambiguous language and meet the highest European or international standards as referred to in Article 34. |
Amendment 43
Proposal for a regulation
Article 12 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Very large online platforms as referred to in Article 25(1) shall publish their terms and conditions in all languages of the Member States in which they provide services and upon request in all official languages of the Union. They shall set out their terms and conditions in machine-readable format. |
Amendment 44
Proposal for a regulation
Article 12 – paragraph 2 c (new)
|
|
Text proposed by the Commission |
Amendment |
|
2c. The Digital Services Coordinator of each Member State may seek to cooperate, in coordination with the Board, with very large online platforms as referred to in Article 25(1) to apply measures and tools of content moderation, in order to address infringements of the obligations laid down in this Regulation. |
Amendment 45
Proposal for a regulation
Article 12 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 12a |
|
Child impact assessment |
|
1. All providers of intermediary services shall assess whether their services are accessed by, likely to be accessed by, or impact children, especially girls. Providers of services likely to impact children, especially girls, shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year, any systemic risks stemming from the functioning and use of their services in the Union for children, especially girls. Those risk impact assessments shall be specific to their services, meet the highest European or international standards referred to in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments shall also include the following systemic risks: |
|
(a) the dissemination of illegal content or behaviour enabled by, manifested on or as a result of their services; |
|
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and in the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment; |
|
(c) any negative effects on the right to gender equality, as enshrined in Article 23 of the Charter, particularly the right to live free from violence as envisaged by the Council of Europe Convention on preventing and combating violence against women and girls (Istanbul Convention); |
|
(d) any negative effects on the right to non-discrimination, as enshrined in Article 21 of the Charter; |
|
(e) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on children's rights, especially of girls. |
|
2. When conducting such impact assessments, providers of intermediary services likely to impact children, especially girls, shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child, especially of girls. |
Amendment 46
Proposal for a regulation
Article 12 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 12b |
|
Mitigation of risks to children, especially girls |
|
Providers of intermediary services likely to impact children, especially girls, shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 12a. |
|
Such measures shall include, where applicable: |
|
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests; |
|
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments; |
|
(c) implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34; |
|
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child and gender equality; |
|
(e) ensuring the highest levels of privacy, safety, and security by design and default for children; |
|
(f) preventing profiling of children, including for commercial purposes like targeted advertising; |
|
(g) ensuring published terms are age appropriate and uphold children’s rights and gender equality; |
|
(h) providing child-friendly and inclusive mechanisms for remedy and redress, including easy access to expert advice and support. |
Amendment 47
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: |
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable: |
Amendment 48
Proposal for a regulation
Article 13 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, anonymised data on individual characteristics of those who submit those notices, such as gender, age group and social background, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; |
Amendment 49
Proposal for a regulation
Article 13 – paragraph 1 – point d
|
|
Text proposed by the Commission |
Amendment |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, anonymised data on individual characteristics of those who submit those complaints, such as gender, age group and social background, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. |
Amendment 50
Proposal for a regulation
Article 13 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Protection of the identity of the victims concerned shall be ensured, in line with GDPR standards. |
Amendment 51
Proposal for a regulation
Article 13 – paragraph 1 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
1b. Providers of intermediary services that impact children, especially girls, shall publish, at least once a year: |
|
(a) child impact assessments to identify known harms, unintended consequences and emerging risks; those impact assessments shall comply with the standards outlined in Article 34; |
|
(b) clear, easily comprehensible and detailed reports outlining the gender equality and child risk mitigation measures undertaken, their efficacy and any outstanding actions required; those reports shall comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design that equally promotes gender equality. |
Amendment 52
Proposal for a regulation
Article 14 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them, in the languages of the Member States in which they provide services and upon request in all official languages of the Union, of the presence on their service of specific items of information that the individual or entity considers to be illegal content or content that infringes the terms and conditions of the service. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. |
Amendment 53
Proposal for a regulation
Article 14 – paragraph 2 – point d a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(da) the option for those submitting notices to outline some of their individual characteristics, such as gender, age group or social background; the providers shall make clear that this information shall not be part of the decision-making process with regard to the notice, shall be completely anonymised and used solely for reporting purposes. |
Amendment 54
Proposal for a regulation
Article 14 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. The provider of intermediary services shall also notify the recipients, where contact details are available, giving them the opportunity to reply, unless this would obstruct the prevention and prosecution of serious criminal offences, create undue delays or increase the risk of further distribution of illegal content. |
Amendment 55
Proposal for a regulation
Article 14 – paragraph 6 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
6a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal. |
Amendment 56
Proposal for a regulation
Article 14 – paragraph 6 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
6b. The provider of hosting services shall ensure that processing of notices is undertaken by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, including anti-discrimination, as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice. |
Amendment 57
Proposal for a regulation
Article 17 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) decisions to remove or disable access to the information; |
(a) decisions whether to remove or disable access to the information; |
Amendment 58
Proposal for a regulation
Article 17 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients; |
(b) decisions whether to suspend or terminate the provision of the service, in whole or in part, to the recipients; |
Amendment 59
Proposal for a regulation
Article 17 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) decisions to suspend or terminate the recipients’ account. |
(c) decisions whether to suspend or terminate the recipients’ account. |
Amendment 60
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
|
|
Text proposed by the Commission |
Amendment |
|
(ca) decisions whether to restrict the ability to monetise content provided by the recipients. |
Amendment 61
Proposal for a regulation
Article 17 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. |
Amendment 62
Proposal for a regulation
Article 17 – paragraph 4 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
4a. Online platforms shall give the option for those submitting complaints to outline some of their individual characteristics, such as gender, age group and social background. Online platforms shall make clear that that information is not part of the decision-making process in regards to the complaint, is completely anonymised and is used solely for reporting purposes. |
Amendment 63
Proposal for a regulation
Article 24 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24a |
|
Recommender systems |
|
1. Online platforms shall not make the recipients of their services subject to recommender systems based on profiling, unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Online platforms shall ensure that the option that is not based on profiling is activated by default. |
|
2. Online platforms shall set out in their terms and conditions and when content is recommended, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679. Online platforms shall also enable the recipients of the service to view, in a user-friendly manner, any profile or profiles used to curate their own content. They shall provide the recipients of the service with an easily accessible option to delete any profile or profiles used to curate the content they see. |
|
3. The parameters referred to in paragraph 2 shall include, at a minimum: |
|
(a) the recommendation criteria used by the relevant system; |
|
(b) how these criteria are weighted against each other; |
|
(c) what goals the relevant system has been optimised for; and |
|
(d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs. |
|
4. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible function on their online interface allowing the recipients of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. |
|
5. Online platforms shall inform their users about the identity of the person responsible for the recommender system. |
|
6. Online platforms shall ensure that the algorithm used by their recommender system is designed in such a way that it does not risk misleading or manipulating the recipients of the service when they use it. |
|
7. Online platforms shall ensure that information from trustworthy sources, such as information from public authorities or from scientific sources, is displayed as first results following search queries that are related to areas of public interest. |
Amendment 64
Proposal for a regulation
Article 24 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 24b |
|
Protections against image-based sexual abuse |
|
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: |
|
(a) that the identity of users who disseminate content has been verified through a double opt-in e-mail and cell phone registration; |
|
(b) professional human-powered content moderation in line with Article 14(6b), where content having a high probability of being illegal, such as voyeuristic content or content enacting rape scenes, is reviewed; |
|
(c) the accessibility of an anonymous qualified notification procedure in addition to the mechanism referred to in Article 14 and respecting the same principles with the exception of paragraph 4a of that Article, allowing individuals to notify the platform with the claim that image material depicting them or purporting to be depicting them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be considered manifestly illegal in terms of Article 14(6a) and shall be suspended within 48 hours. |
Amendment 65
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: |
Amendment 66
Proposal for a regulation
Article 26 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; |
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively; |
Amendment 67
Proposal for a regulation
Article 26 – paragraph 1 – point c
|
|
Text proposed by the Commission |
Amendment |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on gender equality, online violence or on the protection of public health (including mental health), minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. |
Amendment 68
Proposal for a regulation
Article 26 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. |
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or the content risking increase in online violence, deepening the marginalisation of vulnerable communities who are often targets of online hate speech and of information that is incompatible with their terms and conditions. |
Amendment 69
Proposal for a regulation
Article 26 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. Very large online platforms shall regularly review their algorithms to minimise negative consequences, such as an increase in cases of online violence and, consequently, physical violence. Very large online platforms shall implement comprehensive and verifiable standards and measures to limit deliberate misinformation. |
Amendment 70
Proposal for a regulation
Article 26 – paragraph 2 b (new)
|
|
Text proposed by the Commission |
Amendment |
|
2b. Very large online platforms shall offer easily accessible explanations that allow users to understand when, why, for which tasks, and to which extent algorithmic tools are used. They shall let users, in an easy and accessible way, choose whether to accept the algorithms used on their platforms and in their services. They shall allow independent researchers and relevant regulators to audit their algorithmic tools to ensure that they are used as intended. |
Amendment 71
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
|
|
Text proposed by the Commission |
Amendment |
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: |
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate systemic risks, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable: |
Amendment 72
Proposal for a regulation
Article 27 – paragraph 1 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide; |
(b) targeted measures aimed at limiting the display of advertisements, illegal or harmful content in association with the service they provide; |
Amendment 73
Proposal for a regulation
Article 27 – paragraph 1 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
1a. Where a very large online platform decides not to put in place any of the mitigation measures listed in paragraph 1 of this Article, it shall provide written reasons. Those reasons shall be provided to the independent auditors for the purpose of the audit report referred to in Article 28(3). |
Amendment 74
Proposal for a regulation
Article 27 – paragraph 2 – point b
|
|
Text proposed by the Commission |
Amendment |
(b) best practices for very large online platforms to mitigate the systemic risks identified. |
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified. |
Amendment 75
Proposal for a regulation
Article 28 – paragraph 1 – point a
|
|
Text proposed by the Commission |
Amendment |
(a) the obligations set out in Chapter III; |
(a) the obligations set out in Chapter III; in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27; |
Amendment 76
Proposal for a regulation
Article 31 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). |
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to verify the effectiveness of the risk mitigation measures taken by the very large online platform in question under Article 27. |
Amendment 77
Proposal for a regulation
Article 31 – paragraph 3 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
3a. The data provided to vetted researchers shall be as disaggregated as possible, unless the researcher requests it otherwise. |
Amendment 78
Proposal for a regulation
Article 31 – paragraph 4
|
|
Text proposed by the Commission |
Amendment |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, disclose the funding financing the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. |
Amendment 79
Proposal for a regulation
Article 33 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 33a |
|
Algorithm accountability |
|
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used. |
|
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements: |
|
(a) the compliance with corresponding Union requirements; |
|
(b) how the algorithm is used and its impact on the provision of the service; |
|
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and |
|
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on elements referred to in point (c). |
|
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations. |
|
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation on the conclusion of the findings or, when the additional information provided on the findings is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission. |
|
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure the algorithm complies with the criteria set out in paragraph 2. |
|
6. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2, on the basis of the information provided by the very large online platform, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement. |
Amendment 80
Proposal for a regulation
Article 34 – paragraph 2 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
2a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child and the right to gender equality, observance of which, once adopted, will be mandatory, at least for the following: |
|
(a) age assurance and age verification pursuant to Article 13; |
|
(b) child impact assessments pursuant to Article 13; |
|
(c) age-appropriate terms and conditions that equally promote gender equality pursuant to Article 12; |
|
(d) child-centred design that equally promotes gender equality pursuant to Article 13. |
Amendment 81
Proposal for a regulation
Article 35 – paragraph 1
|
|
Text proposed by the Commission |
Amendment |
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
1. The Commission and the Board shall have the right to initiate and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. |
Amendment 82
Proposal for a regulation
Article 35 – paragraph 2
|
|
Text proposed by the Commission |
Amendment |
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. |
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission, after consulting the Board, shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, including organisations working on gender equality, experts on fundamental rights and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. Trusted flaggers and vetted researchers may submit to the Commission and the Board requests for codes of conduct to be considered based on the systemic risk reports referred to in Article 13 and research evaluating the impact of the measures put in place by online platforms to address these systemic risks. |
Amendment 83
Proposal for a regulation
Article 36 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
Article 36a |
|
Codes of conduct for the fight against online violence |
|
1. The Commission shall encourage the development of codes of conduct for the fight against online violence at Union level, between online platforms and other relevant service providers, organisations representing victims of online violence, civil society organisations and law enforcement authorities. Those codes of conduct shall contribute to further transparency and reporting requirements with regard to instances of online violence, with special attention to gender-based violence. Those codes of conduct shall also strengthen requirements on how online platforms and other service providers deal with these instances. |
|
2. The Commission shall aim to ensure that the codes of conduct referred to in paragraph 1 pursue an effective transmission of information, in full respect of the rights of all parties involved, and clarify how online platforms and other relevant service providers should deal with particularly sensitive cases of illegal content, such as content related to unlawful non-consensual sharing of private images, in accordance with Union and national law. The Commission shall aim to ensure that the codes of conduct address at least: |
|
(a) the categories of illegal content related to online violence which should be used by providers of intermediary services in the detailed reports referred to in Article 13; |
|
(b) the types of illegal content associated with online violence, such as content related to unlawful non-consensual sharing of private images, that very large online platforms should consider as possible systemic risks when pursuing their risks assessments referred to in Article 26; |
|
(c) the information that online platforms and other relevant service providers should provide to law enforcement or judicial authorities when there is suspicion of a serious criminal offence related to online violence, such as content related to unlawful non-consensual sharing of private images, pursuant to Article 21; |
|
(d) any standardised information that should be provided in addition to Article 14(4) to the individual or entity that submitted a notice of the presence of alleged illegal content related to online violence, such as the contact details of organisations supporting victims of gender-based violence as well as how to access relevant public services, such as psychological support. |
|
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. |
Amendment 84
Proposal for a regulation
Article 37 – paragraph 4 – point e
|
|
Text proposed by the Commission |
Amendment |
(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination; |
(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information, the right to equality between women and men, the right to non-discrimination and the rights of the child; |
Amendment 85
Proposal for a regulation
Article 45 – paragraph 7
|
|
Text proposed by the Commission |
Amendment |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. |
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1. |
Amendment 86
Proposal for a regulation
Article 48 – paragraph 5
|
|
Text proposed by the Commission |
Amendment |
5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. |
5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts, in areas such as equality, in particular gender equality, and non-discrimination, online violence and harassment, online stalking and child abuse, where relevant. The Board shall make the results of this cooperation publicly available. |
Amendment 87
Proposal for a regulation
Article 48 – paragraph 5 a (new)
|
|
Text proposed by the Commission |
Amendment |
|
5a. The composition of the Board shall be gender-balanced. |
Amendment 88
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
| Text proposed by the Commission | Amendment |
| --- | --- |
| The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. | The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months. |
Amendment 89
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
| Text proposed by the Commission | Amendment |
| --- | --- |
| 1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: | 1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: |
Amendment 90
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
| Text proposed by the Commission | Amendment |
| --- | --- |
| Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. | When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. |
PROCEDURE – COMMITTEE ASKED FOR OPINION
| | |
| --- | --- |
| Title | Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
| References | COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
| Committee responsible (date announced in plenary) | IMCO (8.2.2021) |
| Opinion by (date announced in plenary) | FEMM (11.3.2021) |
| Rapporteur for the opinion (date appointed) | Jadwiga Wiśniewska (12.4.2021) |
| Discussed in committee | 1.7.2021, 30.9.2021, 11.10.2021 |
| Date adopted | 12.10.2021 |
| Result of final vote | +: 30; –: 0; 0: 1 |
| Members present for the final vote | Isabella Adinolfi, Simona Baldassarre, Vilija Blinkevičiūtė, Annika Bruna, Margarita de la Pisa Carrión, Rosa Estaràs Ferragut, Frances Fitzgerald, Cindy Franssen, Heléne Fritzon, Lina Gálvez Muñoz, Elżbieta Katarzyna Łukacijewska, Karen Melchior, Andżelika Anna Możdżanowska, Maria Noichl, Pina Picierno, Sirpa Pietikäinen, Samira Rafaela, Evelyn Regner, Diana Riba i Giner, María Soraya Rodríguez Ramos, Christine Schneider, Sylwia Spurek, Jessica Stegrud, Ernest Urtasun, Hilde Vautmans, Elissavet Vozemberg-Vrionidi, Chrysoula Zacharopoulou, Marco Zullo |
| Substitutes present for the final vote | Lena Düpont, Maria-Manuel Leitão-Marques, Kira Marie Peter-Hansen |
FINAL VOTE BY ROLL CALL IN COMMITTEE ASKED FOR OPINION
| 30 | + |
| --- | --- |
| ECR | Andżelika Anna Możdżanowska, Margarita de la Pisa Carrión |
| ID | Simona Baldassarre, Annika Bruna |
| PPE | Isabella Adinolfi, Lena Düpont, Rosa Estaràs Ferragut, Frances Fitzgerald, Cindy Franssen, Sirpa Pietikäinen, Christine Schneider, Elissavet Vozemberg‑Vrionidi, Elżbieta Katarzyna Łukacijewska |
| Renew | Karen Melchior, Samira Rafaela, María Soraya Rodríguez Ramos, Hilde Vautmans, Chrysoula Zacharopoulou, Marco Zullo |
| S&D | Vilija Blinkevičiūtė, Heléne Fritzon, Lina Gálvez Muñoz, Maria‑Manuel Leitão‑Marques, Maria Noichl, Pina Picierno, Evelyn Regner |
| Verts/ALE | Kira Marie Peter‑Hansen, Diana Riba i Giner, Sylwia Spurek, Ernest Urtasun |

| 0 | - |
| --- | --- |

| 1 | 0 |
| --- | --- |
| ECR | Jessica Stegrud |
Key to symbols:
+ : in favour
- : against
0 : abstention
PROCEDURE – COMMITTEE RESPONSIBLE
| | |
| --- | --- |
| Title | Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC |
| References | COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) |
| Date submitted to Parliament | 16.12.2020 |
| Committee responsible (date announced in plenary) | IMCO (8.2.2021) |
| Committees asked for opinions (date announced in plenary) | ECON (8.2.2021), ITRE (8.2.2021), TRAN (8.2.2021), CULT (8.2.2021), JURI (8.2.2021), LIBE (8.2.2021), FEMM (11.3.2021) |
| Associated committees (date announced in plenary) | ITRE (20.5.2021), JURI (20.5.2021), LIBE (20.5.2021) |
| Rapporteur (date appointed) | Christel Schaldemose (27.1.2021) |
| Discussed in committee | 11.1.2021, 21.6.2021, 27.9.2021, 27.10.2021 |
| Date adopted | 14.12.2021 |
| Result of final vote | +: 36; –: 7; 0: 2 |
| Members present for the final vote | Alex Agius Saliba, Andrus Ansip, Pablo Arias Echeverría, Alessandra Basso, Brando Benifei, Adam Bielan, Hynek Blaško, Biljana Borzan, Markus Buchheit, Andrea Caroppo, Anna Cavazzini, Dita Charanzová, Deirdre Clune, David Cormand, Carlo Fidanza, Evelyne Gebhardt, Alexandra Geese, Sandro Gozi, Maria Grapini, Svenja Hahn, Krzysztof Hetman, Virginie Joron, Eugen Jurzyca, Arba Kokalari, Andrey Kovatchev, Jean-Lin Lacapelle, Maria-Manuel Leitão-Marques, Morten Løkkegaard, Adriana Maldonado López, Antonius Manders, Beata Mazurek, Leszek Miller, Anne-Sophie Pelletier, Miroslav Radačovský, Christel Schaldemose, Andreas Schwab, Tomislav Sokol, Ivan Štefanec, Róża Thun und Hohenstein, Tom Vandenkendelaere, Kim Van Sparrentak, Marion Walsmann, Marco Zullo |
| Substitutes present for the final vote | Claude Gruffat, Martin Schirdewan |
| Date tabled | 21.12.2021 |
|||
FINAL VOTE BY ROLL CALL IN COMMITTEE RESPONSIBLE
| 36 | + |
| --- | --- |
| ECR | Adam Bielan, Carlo Fidanza, Beata Mazurek |
| PPE | Pablo Arias Echeverría, Andrea Caroppo, Deirdre Clune, Krzysztof Hetman, Arba Kokalari, Andrey Kovatchev, Antonius Manders, Andreas Schwab, Tomislav Sokol, Ivan Štefanec, Tom Vandenkendelaere, Marion Walsmann |
| Renew | Andrus Ansip, Dita Charanzová, Sandro Gozi, Svenja Hahn, Morten Løkkegaard, Róża Thun und Hohenstein, Marco Zullo |
| S&D | Alex Agius Saliba, Brando Benifei, Biljana Borzan, Evelyne Gebhardt, Maria Grapini, Maria-Manuel Leitão-Marques, Adriana Maldonado López, Leszek Miller, Christel Schaldemose |
| Verts/ALE | Anna Cavazzini, David Cormand, Alexandra Geese, Claude Gruffat, Kim Van Sparrentak |

| 7 | - |
| --- | --- |
| ECR | Eugen Jurzyca |
| ID | Alessandra Basso, Hynek Blaško, Markus Buchheit |
| NI | Miroslav Radačovský |
| The Left | Anne-Sophie Pelletier, Martin Schirdewan |

| 2 | 0 |
| --- | --- |
| ID | Virginie Joron, Jean-Lin Lacapelle |
Key to symbols:
+ : in favour
- : against
0 : abstention