Thursday, 20 January 2022 - Strasbourg
Digital Services Act ***I
P9_TA(2022)0014
A9-0356/2021

Amendments adopted by the European Parliament on 20 January 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))(1)

(Ordinary legislative procedure: first reading)

Text proposed by the Commission   Amendment
Amendment 1
Proposal for a regulation
Recital 1
(1)  Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.
(1)  Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel and innovative ways, transforming their communication, consumption and business habits. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges for individual users, companies and for society as a whole.
__________________
__________________
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
Amendment 2
Proposal for a regulation
Recital 2
(2)  Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
(2)  Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services, and resulting in a fragmentation of the internal market. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, without lock-in effects, and reducing the administrative burden for intermediary services, especially for micro, small and medium-sized enterprises.
Amendment 3
Proposal for a regulation
Recital 3
(3)  Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.
(3)  Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the rights to privacy, to protection of personal data, respect for human dignity, private and family life, the freedom of expression and information, the freedom and the pluralism of the media, and the freedom to conduct a business, a high level of consumer protection, the equality between women and men and the right to non-discrimination. Children have particular rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child (UNCRC). As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
Amendment 4
Proposal for a regulation
Recital 4
(4)  Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
(4)  In order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, protecting consumers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated, while respecting fundamental rights.
Amendment 5
Proposal for a regulation
Recital 4 a (new)
(4a)  Given the importance of digital services, it is essential that this Regulation ensures a regulatory framework which ensures full, equal and unrestricted access to intermediary services for all recipients of services, including persons with disabilities. Therefore, it is important that accessibility requirements for intermediary services, including their user interfaces, are consistent with existing Union law, such as the European Accessibility Act and the Web Accessibility Directive and that Union law is further developed, so that no one is left behind as a result of digital innovation.
Amendment 6
Proposal for a regulation
Recital 6
(6)  In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.
(6)  In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport of persons and goods, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.
Amendment 7
Proposal for a regulation
Recital 8
(8)  Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) No 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.
(8)  Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the directing of activities towards one or more Member States. The directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) No 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.
__________________
__________________
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 8
Proposal for a regulation
Recital 9
(9)  This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.
(9)  This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) 2021/784 of the European Parliament and of the Council29. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures. To assist Member States and service providers, the Commission should provide guidelines as to how to interpret the interaction and complementary nature between different Union legal acts and this Regulation and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. In particular, the guidelines should clarify any potential conflicts between the conditions and obligations laid down in legal acts, referred to in this Regulation, explaining which legal act should prevail.
__________________
__________________
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).
Amendment 9
Proposal for a regulation
Recital 9 a (new)
(9a)  In line with Article 167(4) of the Treaty on the Functioning of the European Union, cultural aspects should be taken into account, in particular in order to respect and to promote cultural and linguistic diversity. It is essential that this Regulation contributes to protecting the freedom of expression and information and media freedom, and to fostering media pluralism as well as cultural and linguistic diversity.
Amendment 10
Proposal for a regulation
Recital 10
(10)  For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.
(10)  For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33, Directive (EU) 2018/1972 of the European Parliament and of the Council33a, as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive (EU) 2019/882 of the European Parliament and of the Council, Regulation (EU) 2019/1020, Directive 2001/95/EC, Directive 2013/11/EU of the European Parliament and of the Council, Regulation (EU) 2017/239437a, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union or national law on working conditions.
__________________
__________________
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council of 20 June 2019 on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council of 20 June 2019 on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.
33a Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (OJ L 321, 17.12.2018, p. 36).
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) (OJ L 149, 11.6.2005, p. 22).
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) (OJ L 149, 11.6.2005, p. 22).
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (OJ L 304, 22.11.2011, p. 64).
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (OJ L 304, 22.11.2011, p. 64).
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts (OJ L 95, 21.4.1993, p. 29).
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts (OJ L 95, 21.4.1993, p. 29).
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules (OJ L 328, 18.12.2019, p. 7).
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules (OJ L 328, 18.12.2019, p. 7).
37a Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004 (OJ L 345, 27.12.2017, p. 1).
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 11
Proposal for a regulation
Recital 11
(11)  It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.
(11)  It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 of the European Parliament and of the Council, which establish specific rules and procedures that should remain unaffected.
Amendment 12
Proposal for a regulation
Recital 12
(12)  In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
(12)  In order to achieve the objective of ensuring a safe, accessible, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept of “illegal content” should be defined appropriately and should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech, or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law, or the provision of illegal services, in particular in the area of accommodation services on short-term rental platforms non-compliant with Union or national law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in conformity with Union law, including the Charter, and what the precise nature or subject matter is of the law in question.
Amendment 13
Proposal for a regulation
Recital 13
(13)  Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
(13)  Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor or a purely ancillary feature of another service or functionality of the principal service and that feature or functionality cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature or functionality is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. For the purposes of this Regulation, cloud computing services should not be considered to be an online platform in cases where allowing the dissemination of specific content constitutes a minor or ancillary feature. Moreover, cloud computing services, when serving as infrastructure, for example, as the underlying infrastructural storage and computing services of an internet-based application or online platform, should not in themselves be seen as disseminating to the public information stored or processed at the request of a recipient of an application or online platform which it hosts.
Amendment 14
Proposal for a regulation
Recital 14
(14)  The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.
(14)  The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on whom to grant access. Information exchanged using interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, is not considered to have been disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.
__________________
__________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 15
Proposal for a regulation
Recital 16
(16)  The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union.
(16)  The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity, consistency, predictability, accessibility and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.
Amendment 16
Proposal for a regulation
Recital 18
(18)  The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
(18)  The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. The mere ranking or displaying in an order, or the use of a recommender system should not, however, be deemed as having control over that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 17
Proposal for a regulation
Recital 20
(20)  A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
(20)  Where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities, the service should be deemed not to have been provided neutrally and the provider should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 18
Proposal for a regulation
Recital 21
(21)  A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
(21)  A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved in the content of the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 19
Proposal for a regulation
Recital 22
(22)  In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
(22)  In order to benefit from the exemption from liability for hosting services, the provider should, after having become aware of the illegal nature of the content and thus obtaining actual knowledge or awareness, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of a high level of consumer protection and of the Charter of Fundamental Rights, including the principle of freedom of expression and the right to receive and impart information and ideas without interference by public authority. The provider can obtain actual knowledge or awareness of the illegal nature of the content through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent hosting service provider to reasonably identify, assess and where appropriate act against the allegedly illegal content. As long as providers act upon obtaining actual knowledge, they should benefit from the exemptions from liability referred to in this Regulation.
Amendment 20
Proposal for a regulation
Recital 23
(23)  In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
(23)  In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of a consumer. Such a belief may arise, for example, where the online platform allowing distance contracts with traders fails to display clearly the identity of the trader pursuant to this Regulation, or is marketing the product or service in its own name rather than using the name of the trader who will supply it, or where the provider determines the final price of the goods or services offered by the trader.
Amendment 21
Proposal for a regulation
Recital 25
(25)  In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
(25)  In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, solely because they are carrying out voluntary own-initiative investigations, provided those activities are carried out in good faith and in a diligent manner and are accompanied with additional safeguards against over-removal of legal content. Providers of intermediary services should make best efforts to ensure that where automated tools are used for content moderation, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 22
Proposal for a regulation
Recital 26
(26)  Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
(26)  Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed and open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the specific provider that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers should act where they are best placed to do so.
Amendment 23
Proposal for a regulation
Recital 27
(27)  Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
(27)  Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be and among others, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, Virtual Private Networks, cloud infrastructure services, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 24
Proposal for a regulation
Recital 27 a (new)
(27a)  A single webpage or website may include elements that qualify differently between ‘mere conduit’, ‘caching’ or hosting services and the rules for exemptions from liability should apply to each accordingly. For example, a search engine could act solely as a ‘caching’ service as to information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service.
Amendments 25 and 517/rev
Proposal for a regulation
Recital 28
(28)  Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
(28)  Providers of intermediary services should not be subject to a monitoring obligation, neither de jure, nor de facto with respect to obligations of a general nature. This does not concern specific and properly identified monitoring obligations in a specific case, where set out in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implement Union legal acts, in accordance with the conditions established in this Regulation and other Union law considered as lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, Member States should not prevent providers of intermediary services from providing end-to-end encrypted services. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third party access. Furthermore, to ensure effective digital privacy, Member States should not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. In accordance with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the right to use and pay for services anonymously wherever reasonable efforts can make this possible. This should apply without prejudice to the obligations in Union law on the protection of personal data. Providers can enable anonymous use of their services by refraining from collecting personal data regarding the recipient and their online activities and by not preventing recipients from using anonymising networks for accessing the service. Anonymous payment can take place for example by paying in cash, by using cash-paid vouchers or prepaid payment instruments.
Amendment 26
Proposal for a regulation
Recital 29
(29)  Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
(29)  Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the Charter, on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
Amendment 27
Proposal for a regulation
Recital 30
(30)  Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
(30)  Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the Charter and in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 28
Proposal for a regulation
Recital 31
(31)  The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
(31)  The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law in conformity with Union law, including Directive 2000/31/EC and the Charter, enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. Exceptionally, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
Amendment 29
Proposal for a regulation
Recital 32
(32)  The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
(32)  The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. Member States should ensure full implementation of the Union legal framework on confidentiality of communications and online privacy, as well as on protection of natural persons with regard to the processing of personal data enshrined in Directive (EU) 2016/680. In particular, Member States should respect the rights of individuals and journalists and refrain from seeking information which could harm media freedom or freedom of expression.
Amendment 30
Proposal for a regulation
Recital 33
(33)  Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
(33)  Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, as defined in Union or national law in compliance with Union law, respectively, where they are addressed to providers of intermediary services established in another Member State, they should not in principle restrict those providers’ freedom to provide their services across borders. The competent authority should transmit the orders to act against illegal content and to provide information directly to the relevant addressee by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement should notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) No 910/2014 of the European Parliament and of the Council. This Regulation should be without prejudice to the rules on the mutual recognition and enforcement of judgements, namely as regards the right to refuse recognition and enforcement of an order to act against illegal content, in particular where such an order is contrary to the public policy in the Member State where recognition or enforcement is sought.
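By way of illustration only, the sketch below (Python, with an invented order payload and a hypothetical pre-shared key) shows one way a written record could carry the date and time of sending together with an integrity tag the addressee can check. Actual qualified electronic registered delivery services under Regulation (EU) No 910/2014 rely on qualified electronic signatures or seals, not on this simplified scheme.

import hashlib
import hmac
import json
from datetime import datetime, timezone

# Hypothetical pre-shared key between the issuing authority and the provider;
# real qualified delivery services use qualified signatures or seals instead.
SHARED_KEY = b"example-demo-key"

def make_delivery_record(order: dict) -> dict:
    """Authority side: wrap an order with a sending timestamp and an integrity tag."""
    record = {"order": order, "sent_at": datetime.now(timezone.utc).isoformat()}
    body = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return record

def verify_delivery_record(record: dict) -> bool:
    """Provider side: confirm the order and its timestamp were not altered in transit."""
    body = json.dumps({k: v for k, v in record.items() if k != "tag"},
                      sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record.get("tag", ""), expected)

record = make_delivery_record({"type": "act_against_illegal_content",
                               "item": "https://host.example/item/1"})
print(verify_delivery_record(record))  # True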
Amendment 31
Proposal for a regulation
Recital 33 a (new)
(33a)  This Regulation should not prevent the relevant national judicial or administrative authorities from issuing, on the basis of the applicable Union or national law in conformity with Union law, an order to restore content, where such content has been in compliance with the terms and conditions of the intermediary service provider, but has been erroneously considered as illegal by the service provider and has been removed.
Amendment 32
Proposal for a regulation
Recital 33 b (new)
(33b)  To ensure the effective implementation of this Regulation, orders to act against illegal content and to provide information should comply with Union law, including with the Charter. The Commission should provide an effective response to breaches of Union law through infringement proceedings.
Amendment 33
Proposal for a regulation
Recital 34
(34)  In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
(34)  In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective, predictable and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as a high level of consumer protection, the safety and trust of the recipients of the service, including minors and vulnerable users, the protection of relevant fundamental rights enshrined in the Charter, the meaningful accountability of those providers and the empowerment of recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 34
Proposal for a regulation
Recital 35
(35)  In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
(35)  In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation in relation to those services. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
Amendment 35
Proposal for a regulation
Recital 36
(36)  In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
(36)  In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to designate a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. Such information should be notified to the Digital Services Coordinator in the Member State of establishment. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. It should be possible for this contact point to be the same as the contact point required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 36
Proposal for a regulation
Recital 36 a (new)
(36a)  Providers of intermediary services should also be required to designate a single point of contact for recipients of services, which allows rapid, direct and efficient communication in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a user communicates with chatbots. To facilitate rapid, direct and efficient communication, recipients of services should not be faced with lengthy phone menus or hidden contact information. In particular, phone menus should always include the option to speak to a human. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. This requirement should not affect the internal organisation of providers of intermediary services, including the ability to use third-party services to provide this communication system, such as external service providers and call centres.
Amendment 37
Proposal for a regulation
Recital 37
(37)  Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with.
(37)  Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. It should be possible that a legal representative is mandated by more than one provider of intermediary services, in accordance with national law, provided that such providers qualify as micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC.
Amendment 38
Proposal for a regulation
Recital 38
(38)  Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.
(38)  Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of protecting fundamental rights, in particular freedom of expression and of information, transparency, the protection of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are drafted in clear and unambiguous language in line with applicable Union and national law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available, using, where appropriate, graphical elements such as icons.
Amendment 39
Proposal for a regulation
Recital 39
(39)  To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40
(39)  To ensure an adequate level of transparency and accountability, providers of intermediary services should draw up an annual report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40 which do not also qualify as very large online platforms. (An illustrative sketch of such a machine-readable report follows the footnote below.)
__________________
__________________
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
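A minimal sketch of what such a standardised, machine-readable annual report could look like, assuming invented field names and placeholder figures; neither the fields nor the values are prescribed by the Regulation:

import json

# Placeholder values throughout; the Regulation requires a standardised,
# machine-readable format but does not prescribe these field names.
annual_report = {
    "provider": "example-intermediary.eu",
    "reporting_period": {"from": "2021-01-01", "to": "2021-12-31"},
    "notices_received": 12840,
    "notices_actioned": 9310,
    "own_initiative_measures": 4120,
    "actions_by_ground": {"illegal_content": 7630, "terms_and_conditions": 5800},
    "median_hours_to_decision": 26,
    "complaints_received": 1570,
    "decisions_reversed_on_complaint": 214,
}

print(json.dumps(annual_report, indent=2))  # machine-readable output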
Amendment 40
Proposal for a regulation
Recital 39 a (new)
(39a)  Recipients of a service should be able to make free, autonomous and informed decisions or choices when using a service and providers of intermediary services shall not use any means, including via their interface, to distort or impair that decision-making. In particular, recipients of the service should be empowered to make such decisions, inter alia, regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, certain practices typically exploit cognitive biases and prompt recipients of the service to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark patterns’). This should include, but should not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more visual prominence to a consent option, repetitively requesting or urging the recipient to make a decision, or making the procedure of cancelling a service significantly more cumbersome than signing up to it. However, rules preventing dark patterns should not be understood as preventing providers from interacting directly with users and from offering new or additional services to them. In particular, it should be possible to approach a user again within a reasonable time, even if the user has denied consent for specific data processing purposes, in accordance with Regulation (EU) 2016/679. The Commission should be empowered to adopt a delegated act to define practices that could be considered as dark patterns.
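The sketch below illustrates, under an invented dialog specification format, how a provider might self-audit an online interface against some of the practices the recital names (unequal visual prominence of consent options, pre-selected consent, immediate re-prompting after denial); it is not a test mandated by the Regulation.

def audit_consent_dialog(dialog: dict) -> list[str]:
    """Flag interface choices the recital names as potential dark patterns."""
    findings = []
    accept, reject = dialog["accept_option"], dialog["reject_option"]
    if accept.get("style") != reject.get("style"):
        findings.append("consent options differ in visual prominence")
    if accept.get("preselected", False):
        findings.append("consent is pre-selected")
    if dialog.get("reprompt_after_denial_days", 9999) < 1:
        findings.append("user is re-prompted immediately after denying consent")
    return findings

dialog = {
    "accept_option": {"style": "primary-button", "preselected": False},
    "reject_option": {"style": "text-link"},
    "reprompt_after_denial_days": 30,
}
print(audit_consent_dialog(dialog))  # ['consent options differ in visual prominence']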
Amendment 512
Proposal for a regulation
Recital 39 b (new)
(39b)   To ensure an efficient and adequate application of the obligation on traceability of business users, without imposing any disproportionate burdens, the intermediary service providers covered should carry out due diligence checks prior to the use of their service to verify the reliability of the information provided by the business user concerned, in particular by using freely accessible official online databases or online interfaces, such as national trade registers or by requesting the business user concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation.
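A minimal sketch of the two verification routes the recital describes, a check against a freely accessible official register and a fall-back to supporting documents; the register lookup is a hypothetical stand-in and the acceptance logic is deliberately simplified:

from typing import Optional

def lookup_trade_register(registration_number: str) -> Optional[dict]:
    """Hypothetical stand-in for querying a freely accessible official register."""
    return None  # a real implementation would call a national trade register

def verify_business_user(info: dict) -> dict:
    """Record which of the two routes named in the recital verified the business user."""
    entry = lookup_trade_register(info.get("registration_number", ""))
    if entry is not None and entry.get("name") == info.get("name"):
        return {"verified": True, "method": "official_register"}
    if info.get("supporting_documents"):
        # e.g. identity documents, certified bank statements,
        # company certificates, trade register certificates
        return {"verified": True, "method": "supporting_documents"}
    return {"verified": False, "method": None}

print(verify_business_user({"name": "Example Trader OU",
                            "registration_number": "12345678",
                            "supporting_documents": ["trade_register_certificate.pdf"]}))
# {'verified': True, 'method': 'supporting_documents'}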
Amendment 41
Proposal for a regulation
Recital 40
(40)  Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
(40)  Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can establish that the content in question is clearly illegal without additional legal or factual examination of the information indicated in the notice and remove or disable access to that content ('action'). Such mechanisms should include a clearly identifiable reporting function, located close to the content in question, that allows items of information considered to be illegal content under Union or national law to be notified quickly and easily. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. While individuals should always be able to submit notices anonymously, such notices should not give rise to actual knowledge, except in the case of information considered to involve one of the offences referred to in Directive 2011/93/EU. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, insofar as they qualify as providers of hosting services covered by this Regulation.
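For illustration, a single notice covering multiple items might be represented as follows; the field names are assumptions and are not taken from the Regulation:

from dataclasses import dataclass

@dataclass
class Notice:
    """One notice may cover several specific items of allegedly illegal content."""
    items: list[str]           # exact locations of the items, e.g. URLs
    reason: str                # why the notifier considers the content illegal
    legal_reference: str = ""  # Union or national law allegedly breached
    notifier_name: str = ""    # may stay empty: anonymous notices remain possible
    notifier_email: str = ""

notice = Notice(
    items=["https://host.example/post/123", "https://host.example/post/124"],
    reason="counterfeit goods offered for sale",
    legal_reference="national trade mark law",
)
print(len(notice.items))  # 2: a single notice, two notified items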
Amendment 42
Proposal for a regulation
Recital 40 a (new)
(40a)  Nevertheless, notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content. Hosting service providers that receive such notices should redirect them to the particular online platform concerned and inform the Digital Services Coordinator.
Amendment 43
Proposal for a regulation
Recital 40 b (new)
(40b)  Moreover, hosting providers should seek to act only against the items of information notified. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action.
Amendment 44
Proposal for a regulation
Recital 41
(41)  The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
(41)  The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent, objective, non-arbitrary and non-discriminatory processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 45
Proposal for a regulation
Recital 41 a (new)
(41a)  Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should inform the individual or entity that notified the specific content of its decision, without undue delay, once it has decided whether or not to act upon the notice.
Amendment 46
Proposal for a regulation
Recital 42
(42)  Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
(42)  Where a hosting service provider decides to remove, disable access to, demote or impose other measures with regard to information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and accurate, that provider should in a clear and user-friendly manner inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The obligation should however not apply in a number of situations, namely when the content is deceptive or part of high-volume commercial content, or when a judicial or law enforcement authority has requested, due to an ongoing criminal investigation, that the recipient not be informed until that investigation is closed. Where a provider of hosting services does not have the information necessary to inform the recipient on a durable medium, it should not be required to do so.
Amendment 47
Proposal for a regulation
Recital 42 a (new)
(42a)  A provider of hosting services may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the provider of hosting services is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of a person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council1. In such instances, the provider of hosting services should inform without delay the competent law enforcement authorities of such suspicion, providing, upon their request, all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, and, unless instructed otherwise, should remove or disable the content. The information notified by the hosting service provider should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by providers of hosting services. Providers of hosting services should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. In order to facilitate the notification of suspicions of criminal offences, Member States should notify the Commission of the list of the competent law enforcement or judicial authorities.
__________________
1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 48
Proposal for a regulation
Recital 43 a (new)
(43a)  Similarly, in order to ensure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, Section 3, in whole or in part, to those providers of intermediary services that are not-for-profit or are medium-sized enterprises, but do not present any systemic risk related to illegal content and have limited exposure to illegal content. The providers should present justified reasons for why they should be issued a waiver and send their application first to their Digital Services Coordinator of establishment for a preliminary assessment. The Commission should examine such an application taking into account the preliminary assessment carried out by the Digital Services Coordinator of establishment. The preliminary assessment should be sent together with the application to the Commission. The Commission should monitor the application of the waiver and have the right to revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions.
Amendment 49
Proposal for a regulation
Recital 44
(44)  Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
(44)  Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. This should include decisions of online platforms allowing consumers to conclude distance contracts with traders to suspend the provision of their services to traders. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory, non-arbitrary and fair outcomes within ten working days starting on the date on which the online platform received the complaint. In addition, provision should be made for the possibility of entering, in good faith, into an out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
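The ten-working-day window can be computed as in the sketch below, which counts Monday to Friday only, ignores public holidays (which a real system would also skip), and assumes the day of receipt itself is not counted; these counting conventions are interpretive choices, not rules taken from the text.

from datetime import date, timedelta

def complaint_deadline(received: date, working_days: int = 10) -> date:
    """Last day of the ten-working-day window, counting Monday to Friday only."""
    day, counted = received, 0
    while counted < working_days:
        day += timedelta(days=1)
        if day.weekday() < 5:  # Monday=0 .. Friday=4
            counted += 1
    return day

print(complaint_deadline(date(2022, 1, 20)))  # 2022-02-03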
Amendment 50
Proposal for a regulation
Recital 46
(46)  Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
(46)  Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority and expeditiously, taking into account due process and without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in an objective manner. Such trusted flagger status should only be awarded, for a period of two years, to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a transparent funding structure. The Digital Services Coordinator should be allowed to renew the status where the trusted flagger concerned continues to meet the requirements of this Regulation. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations, consumer organisations, and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with Article 14. Those reports should indicate information such as notices categorised by the identity of the provider of hosting services, the type of content notified, the legal provisions allegedly breached by the content in question, and the action taken by the provider. The reports should also include information about any potential conflict of interest and sources of funding, as well as the procedure put in place by the trusted flagger to retain its independence. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and respect exceptions and limitations to intellectual property rights. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 In order to avoid abuses of trusted flagger status, it should be possible to suspend such status where the Digital Services Coordinator of establishment has opened an investigation based on legitimate reasons. The suspension should not be longer than the time needed to conduct the investigation, and the trusted flagger status should be maintained if the Digital Services Coordinator of establishment concludes that the entity in question can still be considered a trusted flagger.
__________________
__________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 51
Proposal for a regulation
Recital 46 a (new)
(46a)  The strict application of universal design to all new technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity. It is essential to ensure that providers of online platforms, which offer services in the Union, design and provide those services in accordance with the accessibility requirements set out in Directive (EU) 2019/882. In particular, providers of online platforms should ensure that the information and forms provided and the procedures that are in place are easy to find, easy to understand and accessible to persons with disabilities.
Amendment 52
Proposal for a regulation
Recital 47
(47)  The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
(47)  The misuse of services of online platforms by frequently providing illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. The misuse of services of online platforms could be established with regard to frequently provided illegal content where it is evident that that content is illegal without conducting a detailed legal or factual analysis. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the notices or complaints are unfounded. Under certain conditions, online platforms should be entitled to temporarily or, in a limited number of situations, permanently suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 53
Proposal for a regulation
Recital 48
(48)  An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of a person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
deleted
___________
1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 54
Proposal for a regulation
Recital 49
(49)  In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
(49)  In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms that allow consumers to conclude distance contracts with traders should obtain additional information on the trader and the products and services they intend to offer on the platform. The online platform should therefore be required to obtain information on the name, telephone number and electronic mail of the economic operator and the type of product or service the trader intends to offer on the online platform. Prior to offering its services to the trader, the online platform operator should make best efforts to assess if the information provided by the trader is reliable. In addition, the platform should take adequate measures, such as, where applicable, random checks, to identify and prevent illegal content from appearing on its interface. The fulfilment of the obligations on traceability of the traders, products and services should make it easier for platforms allowing consumers to conclude distance contracts to comply with the obligation to inform consumers of the identity of their contracting party established under Directive 2011/83/EU of the European Parliament and of the Council, as well as with the obligations established under Regulation (EU) No 1215/2012 as regards the Member State in which consumers can pursue their consumer rights. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary and no longer than six months after the end of a relationship with the trader, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a direct legitimate interest, including through the orders to provide information referred to in this Regulation.
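The retention rule can be sketched as follows, approximating six months as 183 days since the recital gives no exact day count; the counting convention is an interpretive choice, not a rule taken from the text.

from datetime import date, timedelta

# Approximation: the recital speaks of six months; an exact day count is not given.
RETENTION = timedelta(days=183)

def purge_due(relationship_ended: date, today: date) -> bool:
    """True once a stored trader record has outlived the retention window."""
    return today > relationship_ended + RETENTION

print(purge_due(date(2022, 1, 31), date(2022, 8, 15)))  # True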
Amendment 55
Proposal for a regulation
Recital 50
(50)  To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 .
(50)  To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should, before allowing the display of the products or services on their online interfaces, make reasonable efforts to assess the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the best efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 .
__________________
__________________
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (OJ L 304, 22.11.2011, p. 64).
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (OJ L 304, 22.11.2011, p. 64).
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) (OJ L 149, 11.6.2005, p. 22).
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) (OJ L 149, 11.6.2005, p. 22).
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers (OJ L 80, 18.3.1998, p. 27).
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers (OJ L 80, 18.3.1998, p. 27).
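As a purely illustrative sketch of the 'reasonable efforts' verification described in Recital 50 above, the following Python fragment checks a trader's VAT number against the VAT Information Exchange System cited in footnote 45. The REST endpoint path and the response field are assumptions of this sketch; the authoritative interface is the one published at the address in the footnote.

import requests

# Assumed endpoint shape; verify against the official VIES documentation.
VIES_URL = "https://ec.europa.eu/taxation_customs/vies/rest-api/ms/{cc}/vat/{vat}"

def vat_number_appears_valid(country_code: str, vat_number: str) -> bool:
    # A False result is a trigger for follow-up (e.g. requesting supporting
    # documents), and a True result is not a guarantee of reliability
    # towards consumers or other interested parties.
    response = requests.get(VIES_URL.format(cc=country_code, vat=vat_number), timeout=10)
    response.raise_for_status()
    return bool(response.json().get("isValid", False))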
Amendment 56
Proposal for a regulation
Recital 50 a (new)
(50a)  Online platforms that allow consumers to conclude distance contracts with traders should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services, in compliance with the principle of no general monitoring. Online platforms covered should inform recipients when a service or product they have acquired through their services is illegal.
Amendments 57 and 498
Proposal for a regulation
Recital 52
(52)  Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
(52)  Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. New advertising models have generated changes in the way information is presented and have created new personal data collection patterns and business models that might affect privacy, personal autonomy, democracy and quality news reporting, and facilitate manipulation and discrimination. Therefore, more transparency is needed in online advertising markets, and independent research needs to be carried out to assess the effectiveness of behavioural advertisements. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed, as well as the natural or legal person who finances the advertisement. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. In addition to these information obligations, online platforms should ensure that recipients of the service can refuse or withdraw their consent for targeted advertising purposes, in accordance with Regulation (EU) 2016/679, in a way that is no more difficult or time-consuming than giving their consent. Online platforms should also not use personal data for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising of minors. The online platform should not be obliged to maintain, acquire or process additional information in order to assess the age of the recipient of the service. Refusing consent to the processing of personal data for the purposes of advertising should not result in access to the functionalities of the platform being disabled. Alternative access options should be fair and reasonable both for regular and for one-time users, such as options based on tracking-free advertising. Targeting individuals on the basis of special categories of data which allow for targeting vulnerable groups should not be permitted.
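A hypothetical Python sketch of the advertising safeguards this recital describes: the individualised disclosure shown with each advertisement, and a consent gate that excludes minors and special categories of data. Field names and category labels are illustrative assumptions, not terms defined by the Regulation.

from dataclasses import dataclass, field

@dataclass
class AdDisclosure:
    # Individualised information the recipient sees with the advertisement
    on_behalf_of: str              # on whose behalf the advertisement is displayed
    financed_by: str               # natural or legal person financing the advertisement
    main_parameters: dict[str, str] = field(default_factory=dict)  # why this ad was shown

SPECIAL_CATEGORIES = {"health", "religion", "political_opinion", "sexual_orientation"}

def may_serve_targeted_ad(is_minor: bool, consent_given: bool,
                          targeting_keys: set[str]) -> bool:
    if is_minor:
        return False               # no behaviourally targeted advertising of minors
    if targeting_keys & SPECIAL_CATEGORIES:
        return False               # no targeting based on special categories of data
    return consent_given           # otherwise only with prior consent (GDPR)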
Amendment 58
Proposal for a regulation
Recital 52 a (new)
(52a)  A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients can understand how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them.
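The parameter disclosure described in this recital could, for instance, be rendered along the following lines; the parameter names, plain-language descriptions and weights are hypothetical assumptions of this sketch.

PARAMETER_DESCRIPTIONS = {
    "recency": "newer items are ranked higher",
    "topic_match": "items similar to those you follow are ranked higher",
    "popularity": "items with more interactions are ranked higher",
}

def explain_ranking(weights: dict[str, float]) -> str:
    # Present the main parameters in an easily comprehensible manner.
    lines = ["Your feed is ordered as follows:"]
    for name, weight in sorted(weights.items(), key=lambda kv: -kv[1]):
        lines.append(f"- {PARAMETER_DESCRIPTIONS.get(name, name)} (weight {weight:.0%})")
    return "\n".join(lines)

print(explain_ranking({"recency": 0.5, "topic_match": 0.3, "popularity": 0.2}))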
Amendment 59
Proposal for a regulation
Recital 53
(53)  Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
(53)  Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no proportionate alternative and less restrictive measures that would effectively achieve the same result.
Amendment 60
Proposal for a regulation
Recital 54
(54)  Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
(54)  Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. Accordingly, the average number of monthly recipients of the service should reflect the recipients actually reached by the service, either by being exposed to content or by providing content disseminated on the platform’s interface in that period of time.
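The threshold arithmetic is straightforward; the sketch below takes 450 million as the round Union population figure implied by the recital (45 million being 10 % of it):

UNION_POPULATION = 450_000_000            # round figure implied by the recital
VLOP_THRESHOLD = UNION_POPULATION // 10   # 45_000_000 average monthly recipients

def is_very_large_online_platform(avg_monthly_recipients: int) -> bool:
    # Recipients 'actually reached': exposed to content, or providing content
    # disseminated on the platform's interface, in the relevant period.
    return avg_monthly_recipients > VLOP_THRESHOLD

assert is_very_large_online_platform(52_000_000)      # above 45 million: designated
assert not is_very_large_online_platform(30_000_000)  # below: not designated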
Amendment 61
Proposal for a regulation
Recital 56
(56)  Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures.
(56)  Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures where mitigation is possible without adversely impacting fundamental rights.
Amendment 62
Proposal for a regulation
Recital 57
(57)  Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
(57)  Four categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination and amplification of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous and counterfeit products and illegally traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the actual and foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, freedom of the press, human dignity, the right to private life, the right to gender equality, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. A fourth category of risks concerns any actual and foreseeable negative effects on the protection of public health, including behavioural addictions due to excessive use of a service or other serious negative effects on the person's physical, mental, social and financial well-being.
Amendment 63
Proposal for a regulation
Recital 58
(58)  Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
(58)  Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment where mitigation is possible without adversely impacting fundamental rights. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and of content that is incompatible with their terms and conditions. They should also consider mitigation measures in case of malfunctioning or intentional manipulation and exploitation of the service, or in case of risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach of their terms and conditions or any other content having negative effects, by adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the recipients of the service. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. The decision as to the choice of measures should remain with the very large online platform. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. The Commission should evaluate the implementation and effectiveness of the mitigating measures and issue recommendations when the measures implemented are deemed inappropriate or ineffective to address the systemic risk at stake.
Amendment 64
Proposal for a regulation
Recital 59
(59)  Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
(59)  Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations.
Amendment 65
Proposal for a regulation
Recital 60
(60)  Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
(60)  Given the need to ensure verification by independent experts, very large online platforms should be accountable, through external independent auditing, for their compliance with the obligations laid down by this Regulation. In particular, audits should assess the clarity, coherence and predictable enforcement of terms of service, the completeness, methodology and consistency of the transparency reporting obligations, the accuracy, predictability and clarity of the provider's follow-up for recipients of the service and notice providers regarding notices of illegal content and terms of service violations, the accuracy of classification of removed information, the internal complaint handling mechanism, the interaction with trusted flaggers and assessment of their accuracy, the diligence with regard to the verification of the traceability of traders, the adequateness and correctness of the risk assessment, the adequateness and effectiveness of the risk mitigation measures taken and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the vetted auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Vetted auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. This guarantee should not be a means to circumvent the applicability of audit obligations in this Regulation applicable to very large online platforms. Auditors should be legally and financially independent and should not have conflicts of interest involving the very large online platform concerned or other very large online platforms, so as to be able to perform their tasks in an adequate and trustworthy manner. Additionally, vetted auditors and their employees should not have provided any service to the very large online platform audited for 12 months before the audit. They should also commit not to work for the very large online platform audited or a professional organisation or business association of which the platform is a member for 12 months after their position in the auditing organisation has ended. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
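The independence conditions in this recital lend themselves to a simple date check. The sketch below is illustrative; the 12-month periods are taken from the recital, everything else is an assumption.

from datetime import date, timedelta
from typing import Optional

COOLING_OFF = timedelta(days=365)  # the 12-month periods in this recital

def auditor_is_eligible(audit_start: date,
                        last_service_to_platform: Optional[date],
                        legally_and_financially_independent: bool,
                        conflict_of_interest: bool) -> bool:
    if not legally_and_financially_independent or conflict_of_interest:
        return False
    if last_service_to_platform is not None and \
            audit_start - last_service_to_platform < COOLING_OFF:
        return False               # served the audited platform within 12 months
    return True

# Example: a service rendered on 1 March 2021 bars an audit starting 1 January 2022.
assert not auditor_is_eligible(date(2022, 1, 1), date(2021, 3, 1), True, False)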
Amendment 66
Proposal for a regulation
Recital 61
(61)  The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
(61)  The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. Where the audit could not reach a conclusion on specific elements that fall within its scope, a statement of reasons for the failure to reach such a conclusion should be included in the audit opinion.
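Schematically, the opinion logic this recital lays out can be expressed as follows (an illustrative reduction, not an audit methodology):

def audit_opinion(noncompliance_found: bool,
                  remarks: list[str],
                  unauditable: dict[str, str]) -> dict:
    # 'unauditable' maps each element that could not be audited to the
    # statement of reasons that must accompany the opinion.
    if noncompliance_found:
        opinion = "negative"
    elif remarks:
        opinion = "positive with comments"  # remarks without substantial effect
    else:
        opinion = "positive"
    return {"opinion": opinion, "comments": remarks, "not_audited": unauditable}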
Amendment 67
Proposal for a regulation
Recital 62
(62)  A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
(62)  A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should let the recipients decide whether they want to be subject to recommender systems based on profiling and ensure that there is an option which is not based on profiling. In addition, online platforms should ensure that recipients are appropriately informed on the use of recommender systems, and that recipients can influence the information presented to them by making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible and user-friendly manner to ensure that the recipients understand how information is prioritised for them, the reason why, and how to modify the parameters used to curate the content presented to them. Very large online platforms should implement appropriate technical and organisational measures for ensuring that recommender systems are designed in a consumer-friendly manner and do not influence end users’ behaviour through dark patterns.
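A minimal sketch of the choice this recital requires: when the recipient has not opted in to profiling, the platform falls back to an option that is not based on profiling (here, reverse-chronological order). The item fields are assumptions of the sketch.

from typing import Optional

def rank_feed(items: list[dict], profile: Optional[dict],
              profiling_opted_in: bool) -> list[dict]:
    if profiling_opted_in and profile is not None:
        # Profiling-based ranking, only for recipients who actively chose it
        return sorted(items, key=lambda it: it.get("affinity", 0.0), reverse=True)
    # Default option not based on profiling: newest first
    return sorted(items, key=lambda it: it["published_at"], reverse=True)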
Amendment 68
Proposal for a regulation
Recital 63
(63)  Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
(63)  Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements, including the name of the product, service or brand and the object of the advertisement, and related data on the advertiser, and, if different, the natural or legal person who paid for the advertisement, and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
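The repository entries this recital calls for might carry fields along the following lines; the schema is a hypothetical illustration, not a format prescribed by the Regulation.

from dataclasses import dataclass, field

@dataclass
class AdRepositoryEntry:
    content: str                       # the advertisement as displayed
    product_or_brand: str              # name of the product, service or brand
    advertiser: str                    # on whose behalf the ad was displayed
    paid_by: str                       # who paid, if different from the advertiser
    first_shown: str                   # ISO date
    last_shown: str                    # ISO date
    targeting_parameters: dict[str, str] = field(default_factory=dict)
    recipients_reached: int = 0
    known_deep_fake: bool = False      # label known deep fake material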
Amendment 69
Proposal for a regulation
Recital 64
(64)  In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
(64)  In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data and algorithms. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by vetted researchers and vetted not-for-profit bodies, organisations or associations on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, not-for-profit bodies, organisations or associations. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including personal data, trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. Vetted researchers, not-for-profit bodies, organisations or associations should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks.
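Reduced to a checklist, the conditions for granting compelled data access under this recital might look as follows (illustrative only; the condition names are assumptions of this sketch):

def grant_data_access(applicant_vetted: bool,
                      systemic_risk_research: bool,
                      confidentiality_safeguards: bool,
                      request_proportionate: bool) -> bool:
    # All four conditions drawn from this recital must hold: a vetted
    # applicant, a systemic-risk research purpose, guarantees for personal
    # data and trade secrets, and a proportionate request.
    return all([applicant_vetted, systemic_risk_research,
                confidentiality_safeguards, request_proportionate])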
Amendment 70
Proposal for a regulation
Recital 66
(66)  To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
(66)  To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, about the interoperability of advertisement repositories, or about terms and conditions. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. In the absence of relevant standards agreed within [24 months after the entry into force of this Regulation], the Commission should be able to establish technical specifications by implementing acts until a voluntary standard is agreed.
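A standardised notice submitted 'including through application programming interfaces', as this recital puts it, could be serialised along these lines; the field names and the endpoint mentioned in the comment are assumptions of this sketch.

import json
from dataclasses import asdict, dataclass

@dataclass
class Notice:
    content_url: str       # location of the allegedly illegal content
    category: str          # e.g. "illegal hate speech", "counterfeit product"
    explanation: str       # why the notifier considers the content illegal
    notifier_contact: str  # contact details, where provided

def to_submission_body(notice: Notice) -> str:
    # JSON body for, e.g., a hypothetical POST /v1/notices endpoint
    return json.dumps(asdict(notice))

print(to_submission_body(Notice("https://example.com/item/1",
                                "counterfeit product",
                                "uses a protected trade mark without authorisation",
                                "notifier@example.org")))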
Amendment 71
Proposal for a regulation
Recital 67
(67)  The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
(67)  The Commission and the Board should encourage the drawing-up of codes of conduct as well as compliance with the provisions of those codes to contribute to the application of this Regulation. The Commission and the Board should aim to ensure that the codes of conduct clearly define the nature of the public interest objectives being addressed, that they contain mechanisms for independent evaluation of the achievement of those objectives and that the role of competent authorities is clearly defined. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
Amendment 72
Proposal for a regulation
Recital 68
(68)  It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
(68)  It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation, or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure.
Amendment 73
Proposal for a regulation
Recital 69
(69)  The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
(69)  The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. The Commission should also encourage the development of codes of conduct to facilitate compliance with obligations in areas such as the protection of minors or short-term rentals. Other areas for consideration could include promoting the diversity of information through support for high-quality journalism and fostering the credibility of information, whilst respecting the confidentiality of journalistic sources. Moreover, it is important to ensure consistency with already existing enforcement mechanisms, such as those in the area of electronic communications or media and with independent regulatory structures in these fields as defined by Union and national law.
Amendment 74
Proposal for a regulation
Recital 70
(70)  The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.
(70)  The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. The effectiveness of the codes of conduct should be regularly assessed. Unlike legislation, codes of conduct are not subject to democratic scrutiny and their compliance with fundamental rights is not subject to judicial review. In order to enhance accountability, participation and transparency, procedural safeguards for drawing up codes of conduct are needed. Before initiating or facilitating the drawing-up or the revision of codes of conduct, the Commission may invite where appropriate, the Fundamental Rights Agency or the European Data Protection Supervisor to express their opinion.
Amendment 75
Proposal for a regulation
Recital 71
(71)  In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
(71)  In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
Amendment 76
Proposal for a regulation
Recital 72
(72)  The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should appoint at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure.
(72)  The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should designate at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure.
Amendment 77
Proposal for a regulation
Recital 73
(73)  Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level.
(73)  Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be designated as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level.
Amendment 78
Proposal for a regulation
Recital 74
(74)  The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.
(74)  The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities have the necessary financial and human resources to carry out their tasks under this Regulation. It is also necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.
Amendment 79
Proposal for a regulation
Recital 75
(75)  Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to apply and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members.
(75)  Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to supervise the application and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members.
Amendment 80
Proposal for a regulation
Recital 76
(76)  In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
(76)  In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in this Regulation by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
Amendment 81
Proposal for a regulation
Recital 77
(77)  Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation.
(77)  Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to adopt proportionate interim measures in case of risk of serious harm, as well as to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation.
Amendment 82
Proposal for a regulation
Recital 78
(78)  Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation.
(78)  Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure consistent and uniform application of this Regulation, the Commission should adopt guidance on the rules and procedures related to the powers of Digital Services Coordinators.
Amendment 83
Proposal for a regulation
Recital 79
(79)  In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.
(79)  In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.
Amendment 84
Proposal for a regulation
Recital 80
(80)  Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, whether the provider is active in several Member States.
(80)  Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, the number of recipients affected, the intentional or negligent character of the infringement and whether the provider is active in several Member States. The Commission should issue guidance to Member States concerning the criteria and conditions to impose proportionate penalties.
Amendment 85
Proposal for a regulation
Recital 81
(81)  In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation.
(81)  In order to ensure effective enforcement of the obligations laid down in this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. The Digital Services Coordinator of establishment should assess the complaint in a timely manner and inform the Digital Services Coordinator of the Member State where the recipient resides or is established on how the complaint has been handled.
Amendment 86
Proposal for a regulation
Recital 82
(82)  Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Services Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services that are being used by a third party to infringe an intellectual property right, are not reasonably available.
(82)  Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements of this Regulation. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Services Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services that are being used by a third party to infringe an intellectual property right, are not reasonably available.
Amendment 87
Proposal for a regulation
Recital 83 a (new)
(83a)  Without prejudice to the provisions on the exemption from liability provided for in this Regulation as regards the information transmitted or stored at the request of a recipient of the service, providers of intermediary services should be liable for the infringement of their obligations laid down in this Regulation. Recipients of the service and organisations representing them should be entitled to have access to proportionate and effective remedies. They should in particular have the right to seek, in accordance with national or Union law, compensation from those providers of intermediary services for any direct damage or loss suffered due to an infringement by providers of intermediary services of obligations established under this Regulation.
Amendment 88
Proposal for a regulation
Recital 84
(84)  The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State.
(84)  The Digital Services Coordinator should regularly publish a report in a standardised and machine-readable format on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, based on the Internal Market Information system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State.
Amendment 89
Proposal for a regulation
Recital 86
(86)  In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.
(86)  In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation on the basis of an agreement between the Member States concerned and, in the absence of agreement, under the authority of the Digital Services Coordinator of the Member State of establishment. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.
Amendment 90
Proposal for a regulation
Recital 88
(88)  In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.
(88)  In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. The rules of procedure of the Board should ensure that the confidentiality of the information is respected.
Amendment 91
Proposal for a regulation
Recital 90
(90)  For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While such opinions, requests and recommendations are not legally binding, any decision to deviate from them should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.
(90)  For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While such opinions, requests and recommendations are not legally binding, any decision to deviate from them should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation. The Board should draw up an annual report regarding its activities.
Amendment 92
Proposal for a regulation
Recital 91
(91)  The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.
(91)  The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, eradication of all forms of violence against women and girls and other forms of gender-based violence, data protection, respect for intellectual property, competition, electronic communications, audiovisual services, market surveillance, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 93
Proposal for a regulation
Recital 96
(96)  Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.
(96)  Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission should, on its own initiative or upon advice of the Board, initiate further investigation into the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform. The Commission should initiate proceedings in view of the possible adoption of decisions in respect of the relevant conduct by the very large online platform, for example where that platform is suspected of having infringed this Regulation, including where the platform has been found not to implement the operational recommendations from the independent audit endorsed by the Digital Services Coordinator of establishment and where the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures.
Amendment 94
Proposal for a regulation
Recital 97
(97)  The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
(97)  Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
Amendment 95
Proposal for a regulation
Recital 97 a (new)
(97a)  The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
Amendment 96
Proposal for a regulation
Recital 99
(99)  In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or than individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers
(99)  In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 97
Proposal for a regulation
Recital 100
(100)  Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods.
(100)  Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. The Commission should in particular ensure that the penalties are effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and nature of activities carried out, the number of recipients affected, the intentional or negligent character of the infringement as well as the economic capacity of the infringer.
Amendment 98
Proposal for a regulation
Recital 102
(102)  In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure.
(102)  The Commission should carry out a general evaluation of this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. This report should address in particular the definition of very large online platforms and the number of average monthly active recipients of the service. This report should also address the implementation of codes of conduct, as well as the obligation to designate a representative established in the Union, and assess the effect of similar obligations imposed by third countries on European service providers operating abroad. In particular, the Commission should assess any impact of the costs to European service providers of any similar requirements, including the requirement to designate a legal representative, introduced by third countries, and any new barriers to non-Union market access after the adoption of this Regulation. The Commission should also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union. In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within three years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure.
Amendment 99
Proposal for a regulation
Article 1 – title
Subject matter and scope
Subject matter
Amendment 100
Proposal for a regulation
Article 1 – paragraph 1 – point c
(c)  rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.
(c)  rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities.
Amendment 101
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b)  set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
(b)  set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 102
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(ba)  promote a high level of consumer protection and contribute to increased consumer choice while facilitating innovation, support digital transition and encourage economic growth within the internal market.
Amendment 103
Proposal for a regulation
Article 1 – paragraph 3
3.  This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
deleted
Amendment 104
Proposal for a regulation
Article 1 – paragraph 4
4.  This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.
deleted
Amendment 105
Proposal for a regulation
Article 1 – paragraph 5
5.  This Regulation is without prejudice to the rules laid down by the following:
deleted
(a)  Directive 2000/31/EC;
(b)  Directive 2010/13/EC;
(c)  Union law on copyright and related rights;
(d)  Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted];
(e)  Regulation (EU) …./….on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./….laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted]
(f)  Regulation (EU) 2019/1148;
(g)  Regulation (EU) 2019/1150;
(h)  Union law on consumer protection and product safety, including Regulation (EU) 2017/2394;
(i)  Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC.
Amendment 106
Proposal for a regulation
Article 1 a (new)
Article 1a
Scope
1.  This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
2.  This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.
3.  This Regulation is without prejudice to the rules laid down by the following:
(a)  Directive 2000/31/EC;
(b)  Directive 2010/13/EU;
(c)  Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
(d)  Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;
(e)  Regulation (EU) …./….on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./….laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted]
(f)  Regulation (EU) 2019/1148;
(g)  Regulation (EU) 2019/1150;
(h)  Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Directive 2001/95/EC on general product safety;
(i)  Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC;
(j)  Directive (EU) 2019/882;
(k)  Directive (EU) 2018/1972;
(l)  Directive 2013/11/EU.
4.  By [12 months after the entry into force of this Regulation], the Commission shall publish guidelines with regard to the relationship between this Regulation and the legal acts referred to in Article 1a(3).
Amendment 107
Proposal for a regulation
Article 2 – paragraph 1 – point a
(a)  ‘information society services’ means services within the meaning of Article 1(1)(b) of Directive (EU) 2015/1535;
(a)  ‘information society services’ means services as defined in Article 1(1)(b) of Directive (EU) 2015/1535;
Amendment 108
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b)  ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service;
(b)  ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service in order to seek information or to make it accessible;
Amendment 109
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c)  ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession;
(c)  ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;
Amendment 110
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d)  ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as:
(d)  ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of a provider of information society services which has a substantial connection to the Union;
Amendment 111
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
—  a significant number of users in one or more Member States; or
deleted
Amendment 112
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
—  the targeting of activities towards one or more Member States.
deleted
Amendment 113
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(da)  ‘substantial connection to the Union’ means the connection of a provider with one or more Member States resulting either from its establishment in the Union, or in the absence of such an establishment, from the fact that the provider directs its activities towards one or more Member States;
Amendment 114
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e)  ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;
(e)  ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession;
Amendment 115
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 1
—  a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
—  a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services;
Amendment 116
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 2
—  a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;
—  a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;
Amendment 117
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g)  ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
(g)  ‘illegal content’ means any information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 118
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h)  ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
(h)  ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 119
Proposal for a regulation
Article 2 – paragraph 1 – point k
(k)  ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications;
(k)  ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enables the recipients of the service to access and interact with the relevant intermediary service;
Amendment 120
Proposal for a regulation
Article 2 – paragraph 1 – point k a (new)
(ka)  ‘trusted flagger’ means an entity that has been awarded such status by a Digital Services Coordinator;
Amendment 121
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n)  ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information;
(n)  ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that message;
Amendment 122
Proposal for a regulation
Article 2 – paragraph 1 – point n a (new)
(na)  'remuneration' means economic compensation consisting of direct or indirect payment for the service provided, including where the intermediary service provider is not directly compensated by the recipient of the service or where the recipient of the service provides data to the service provider, except where such data is collected for the sole purpose of meeting legal requirements;
Amendment 123
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o)  ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
(o)  ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or curate in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 124
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p)  ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
(p)  ‘content moderation’ means the activities, either automated or not automated, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 125
Proposal for a regulation
Article 2 – paragraph 1 – point q
(q)  ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.
(q)  ‘terms and conditions’ means all terms and conditions or specifications by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.
Amendment 126
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa)  ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882.
Amendment 127
Proposal for a regulation
Article 3 – paragraph 3
3.  This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
3.  This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 128
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1.  Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that:
1.  Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or secure the information's onward transmission to other recipients of the service upon their request, on condition that the provider:
Amendment 129
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a)  the provider does not modify the information;
(a)  does not modify the information;
Amendment 130
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b)  the provider complies with conditions on access to the information;
(b)  complies with conditions on access to the information;
Amendment 131
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c)  the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
(c)  complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
Amendment 132
Proposal for a regulation
Article 4 – paragraph 1 – point d
(d)  the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
(d)  does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
Amendment 133
Proposal for a regulation
Article 4 – paragraph 1 – point e
(e)  the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
(e)  acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
Amendment 134
Proposal for a regulation
Article 4 – paragraph 2
2.  This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2.  This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 135
Proposal for a regulation
Article 5 – paragraph 3
3.  Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
3.  Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 136
Proposal for a regulation
Article 5 – paragraph 4
4.  This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
4.  This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 137
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
1.   Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or take measures aimed at detecting, identifying and removing, or disabling of access to, illegal content or take the necessary measures to comply with the requirements of national and Union law, including the Charter and the requirements set out in this Regulation.
Amendment 138
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a.   Providers of intermediary services shall ensure that voluntary own-initiative investigations carried out and measures taken pursuant to paragraph 1 are effective and specific. Such own-initiative investigations and measures shall be accompanied by appropriate safeguards, such as human oversight, documentation, or any additional measure to ensure and demonstrate that those investigations and measures are accurate, non-discriminatory, proportionate, transparent and do not lead to over-removal of content. Providers of intermediary services shall make best efforts to ensure that where automated means are used, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content.
Amendment 139
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.
1.   No general obligation to monitor, whether de jure or de facto, through automated or non-automated means, the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity, nor to monitor the behaviour of natural persons, shall be imposed on those providers.
Amendment 140
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a.   Providers of intermediary services shall not be obliged to use automated tools for content moderation or for monitoring the behaviour of natural persons.
Amendment 141
Proposal for a regulation
Article 7 – paragraph 1 b (new)
1b.   Member States shall not prevent providers of intermediary services from offering end-to-end encrypted services.
Amendment 142
Proposal for a regulation
Article 7 – paragraph 1 c (new)
1c.   Member States shall not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. Member States shall not oblige providers of intermediary services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or national law.
Amendment 520/rev
Proposal for a regulation
Article 7 – paragraph 1 d (new)
1d.  Without prejudice to Regulation (EU) 2016/679 and Directive 2002/58/EC, providers shall make reasonable efforts to enable the use of and payment for that service without collecting personal data of the recipient.
Amendment 143
Proposal for a regulation
Article 8 – paragraph 1
1.  Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
1.  Providers of intermediary services shall, upon the receipt via a secure communications channel of an order to act against one or more specific items of illegal content, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the actions taken and the moment when the actions were taken.
Amendment 144
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent -1 (new)
—  a reference to the legal basis for the order;
Amendment 145
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
—  a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
—  a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law in conformity with Union law;
Amendment 146
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
—  identification of the issuing authority, including the date, timestamp and electronic signature of the authority, allowing the recipient to authenticate the order, and contact details of a contact person within that authority;
Amendment 147
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
—  one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned;
—  a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate or, when the exact electronic location is not precisely identifiable, one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned;
Amendment 148
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
—  information about redress available to the provider of the service and to the recipient of the service who provided the content;
—  easily understandable information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including the deadlines for appeal;
Amendment 149
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
—  where necessary and proportionate, the decision not to disclose information about the removal of or disabling of access to the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, not exceeding six weeks from that decision;
Amendment 150
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b)  the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective;
(b)  the territorial scope of the order on the basis of the applicable rules of Union and national law in conformity with Union law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; the territorial scope of the order shall be limited to the territory of the Member State issuing the order unless the illegality of the content derives directly from Union law or the rights at stake require a wider territorial scope, in accordance with Union and international law;
Amendment 151
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c)  the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.
(c)  the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10 or in one of the official languages of the Member State that issues the order against the specific item of illegal content; in such case, the point of contact of the service provider may request the competent authority to provide translation into the language declared by the provider;
Amendment 152
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(ca)  the order is in compliance with Article 3 of Directive 2000/31/EC;
Amendment 153
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
(cb)  where more than one provider of intermediary services is responsible for hosting the specific items of illegal content, the order is issued to the most appropriate provider that has the technical and operational ability to act against those specific items.
Amendment 154
Proposal for a regulation
Article 8 – paragraph 2 a (new)
2a.  The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1.
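Taken together, the elements required by Article 8(2), point (a), amount to a structured data format, which the template and form envisaged in the new paragraph 2a would standardise. The following TypeScript sketch is a purely illustrative, non-normative model of such an order; every field name is a hypothetical assumption rather than wording drawn from the Regulation.

```typescript
// Hypothetical, non-normative model of an order to act against illegal
// content under Article 8; all field names are illustrative assumptions.
interface IssuingAuthority {
  name: string;
  memberState: string;          // e.g. a two-letter country code
  contactPerson: string;        // contact details of a contact person within the authority
  date: string;                 // ISO 8601 date
  timestamp: string;            // ISO 8601 date-time
  electronicSignature: string;  // allows the recipient to authenticate the order
}

interface OrderToActAgainstIllegalContent {
  legalBasis: string;                 // reference to the legal basis for the order
  statementOfReasons: string;         // why the information is illegal content, by reference
                                      // to the specific provision of Union or national law
  authority: IssuingAuthority;
  exactLocations: string[];           // exact URL or URLs where appropriate
  otherIdentifyingInfo?: string;      // where the electronic location is not precisely identifiable
  redressInformation: string;         // redress mechanisms and deadlines for appeal
  nonDisclosure?: {                   // where necessary and proportionate, for public security
    reason: string;
    expires: string;                  // not exceeding six weeks from the decision
  };
  territorialScope: string[];         // limited to what is strictly necessary
  language: string;                   // language declared by the provider, or an official
                                      // language of the issuing Member State
}
```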
Amendment 155
Proposal for a regulation
Article 8 – paragraph 2 b (new)
2b.  Providers of intermediary services who received an order shall have a right to an effective remedy. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal processes in relation to the order.
The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order or adjust the territorial scope of the order to what is strictly necessary. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulling, ceasing or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay.
Amendment 156
Proposal for a regulation
Article 8 – paragraph 2 c (new)
2c.  If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the judicial or administrative authority that issued the order asking for the necessary clarification.
Amendment 157
Proposal for a regulation
Article 8 – paragraph 2 d (new)
2d.  The authority issuing the order shall transmit that order and the information received from the provider of intermediary services as to the effect given to the order to the Digital Services Coordinator from the Member State of the issuing authority.
Amendment 158
Proposal for a regulation
Article 8 – paragraph 4
4.  The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law.
4.  The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative procedural law in conformity with Union law, including the Charter. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued.
Amendment 159
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a.  Member States shall ensure that the relevant authorities may, at the request of an applicant whose rights are infringed by illegal content, issue against the relevant provider of intermediary services an injunction order in accordance with this Article to remove or disable access to that content.
Amendment 160
Proposal for a regulation
Article 9 – paragraph 1
1.  Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
1.  Providers of intermediary services shall, upon receipt via a secure communications channel of an order to provide a specific item of information about one or more specific individual recipients of the service, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
Amendment 161
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
—  the identification details of the judicial or administrative authority issuing the order and authentication of the order by that authority, including the date, time stamp and electronic signature of the authority issuing the order to provide information;
Amendment 162
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 a (new)
—  a reference to the legal basis for the order;
Amendment 163
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 b (new)
—  a clear indication of the exact electronic location, an account name, or a unique identifier of the recipient on whom information is sought;
Amendment 164
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
—  a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
—  a sufficiently detailed statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 165
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
—  where the information sought constitutes personal data within the meaning of Article 4, point (1), of Regulation (EU) 2016/679 or Article 3, point (1), of Directive (EU) 2016/680, a justification that the order is in accordance with applicable data protection law;
Amendment 166
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
—  information about redress available to the provider and to the recipients of the service concerned;
—  information about redress available to the provider and to the recipients of the service concerned, including deadlines for appeal;
Amendment 167
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
—  an indication of whether the provider should inform without undue delay the recipient of the service concerned, including information about the data being sought; where information is requested in the context of criminal proceedings, the request for that information shall be in compliance with Directive (EU) 2016/680, and the information to the recipient of the service concerned about that request may be delayed as long as necessary and proportionate to avoid obstructing the relevant criminal proceedings, taking into account the rights of the suspected and accused persons and without prejudice to defence rights and effective legal remedies. Such a request shall be duly justified, specify the duration of the obligation of confidentiality and shall be subject to periodic review.
Amendment 168
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c)  the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10;
(c)  the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10 or in one of the official languages of the Member State that issues the order against the item of illegal content; in such case, the point of contact may request the competent authority to provide translation into the language declared by the provider;
Amendment 169
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2a.  The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1.
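Article 9(2) mirrors Article 8(2) in prescribing the minimum elements of an order to provide information, and paragraph 2a again envisages a standardised template. A minimal, non-normative sketch with hypothetical field names might look as follows:

```typescript
// Hypothetical, non-normative model of an order to provide information
// under Article 9; all field names are illustrative assumptions.
interface OrderToProvideInformation {
  legalBasis: string;                   // reference to the legal basis for the order
  issuingAuthority: {
    name: string;
    date: string;                       // ISO 8601 date
    timestamp: string;                  // ISO 8601 date-time
    electronicSignature: string;        // authenticates the order
  };
  recipientIdentifier: string;          // exact electronic location, account name or unique identifier
  statementOfReasons?: string;          // may be withheld only for reasons tied to the prevention,
                                        // investigation, detection and prosecution of criminal offences
  dataProtectionJustification?: string; // required where personal data is sought
  redressInformation: string;           // including deadlines for appeal
  informRecipient: boolean;             // whether the provider should inform the recipient concerned
  confidentialityDuration?: string;     // where informing the recipient is delayed; subject to periodic review
  language: string;                     // language declared by the provider, or an official
                                        // language of the issuing Member State
}
```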
Amendment 170
Proposal for a regulation
Article 9 – paragraph 2 b (new)
2b.  The provider of intermediary services who received an order shall have a right to an effective remedy. That right shall include the right to challenge the order before the judicial authorities of the Member State of the issuing competent authority, in particular where such an order is not in compliance with Article 3 of Directive 2000/31/EC. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal proceedings in relation to the order.
The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulling, ceasing or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay.
Amendment 171
Proposal for a regulation
Article 9 – paragraph 2 c (new)
2c.  If the provider cannot comply with the order because it contains manifest errors or does not contain sufficient information to enable it to be executed, it shall, without undue delay, inform the judicial or administrative authority that issued that information order and request the necessary clarifications.
Amendment 172
Proposal for a regulation
Article 9 – paragraph 2 d (new)
2d.  The authority issuing the order to provide a specific item of information shall transmit that order and the information received from the provider of intermediary services as to the effect given to the order to the Digital Services Coordinator from the Member State of the issuing authority.
Amendment 173
Proposal for a regulation
Article 9 – paragraph 4
4.  The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law.
4.  The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law or administrative procedural law in conformity with Union law.
Amendment 174
Proposal for a regulation
Article 9 a (new)
Article 9a
Effective remedies for recipients of the service
1.   Recipients of the service whose content was removed according to Article 8 or whose information was sought according to Article 9 shall have the right to effective remedies against such orders, including, where applicable, restoration of content where such content has been in compliance with the terms and conditions, but has been erroneously considered as illegal by the service provider, without prejudice to remedies available under Directive (EU) 2016/680 and Regulation (EU) 2016/679.
2.   Such right to an effective remedy shall be exercised before a judicial authority in the issuing Member State in accordance with national law and shall include the possibility to challenge the legality of the measure, including its necessity and proportionality.
3.   Digital Services Coordinators shall develop national tools and guidance to recipients of the service as regards complaint and redress mechanisms applicable in their respective territory.
Amendment 175
Proposal for a regulation
Chapter III – title
Due diligence obligations for a transparent and safe online environment
Due diligence obligations for a transparent, accessible and safe online environment
Amendment 176
Proposal for a regulation
Article 10 – title
Points of contact
Points of contact for Member States’ authorities, the Commission and the Board
Amendment 177
Proposal for a regulation
Article 10 – paragraph 1
1.  Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.
1.  Providers of intermediary services shall designate a single point of contact enabling them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.
Amendment 178
Proposal for a regulation
Article 10 – paragraph 2
2.  Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact.
2.  Providers of intermediary services shall communicate to the Member States' authorities, the Commission and the Board, the information necessary to easily identify and communicate with their single points of contact, including the name, the email address, the physical address and the telephone number, and shall ensure that the information is kept up to date.
Amendment 179
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2a.  Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision.
Amendment 180
Proposal for a regulation
Article 10 a (new)
Article 10a
Points of contact for recipients of services
1.   Providers of intermediary services shall designate a single point of contact that enables recipients of services to communicate directly with them.
2.   In particular, providers of intermediary services shall enable recipients of services to communicate with them by providing rapid, direct and efficient means of communication such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging, as well as the physical address of the establishment of the provider of intermediary services, in a user-friendly and easily accessible manner. Providers of intermediary services shall also enable recipients of services to choose the means of direct communication, which shall not solely rely on automated tools.
3.   Providers of intermediary services shall make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that the communication referred to in paragraph 1 is performed in a timely and efficient manner.
Amendment 181
Proposal for a regulation
Article 11 – paragraph 1
1.  Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services.
1.  Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services.
Amendment 182
Proposal for a regulation
Article 11 – paragraph 2
2.  Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resource to cooperate with the Member States’ authorities, the Commission and the Board and comply with those decisions.
2.  Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and sufficient resources in order to guarantee their efficient and timely cooperation with the Member States’ authorities, the Commission and the Board and comply with any of those decisions.
Amendment 183
Proposal for a regulation
Article 11 – paragraph 4
4.  Providers of intermediary services shall notify the name, address, the electronic mail address and telephone number of their legal representative to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date.
4.  Providers of intermediary services shall notify the name, postal address, the electronic mail address and telephone number of their legal representative to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is kept up to date. The Digital Services Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity.
Amendment 477
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a.   Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
Amendment 513
Proposal for a regulation
Article 12 – paragraph 1
1.  Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
1.  Providers of intermediary services shall use fair, non-discriminatory and transparent terms and conditions. Providers of intermediary services shall draft those terms and conditions in clear, plain, user-friendly and unambiguous language and shall make them publicly available in an easily accessible and machine-readable format in the languages of the Member State towards which the service is directed. In their terms and conditions, providers of intermediary services shall respect the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms, as enshrined in the Charter as well as the rules applicable to the media in the Union.
Amendment 186
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a.  In their terms and conditions, providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of content provided by the recipients of the service. Providers of intermediary services shall also include easily accessible information on the right of the recipients to terminate the use of their service. Providers of intermediary services shall also include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review.
Amendment 187
Proposal for a regulation
Article 12 – paragraph 1 b (new)
1b.  Providers of intermediary services shall notify expeditiously the recipients of the service of any significant change to the terms and conditions and provide an explanation thereof.
Amendment 188
Proposal for a regulation
Article 12 – paragraph 1 c (new)
1c.  Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider shall explain conditions for and restrictions on the use of the service in a way that minors can understand.
Amendment 189
Proposal for a regulation
Article 12 – paragraph 2
2.  Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
2.  Providers of intermediary services shall act in a fair, transparent, coherent, diligent, timely, non-arbitrary, non-discriminatory and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 190
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a.  Providers of intermediary services shall provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, in clear, user-friendly and unambiguous language. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out from optional clauses and the remedies and redress mechanisms available.
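Paragraph 2a effectively calls for a machine-readable companion document to the full terms and conditions. As a rough illustration only, assuming a hypothetical structure that the Regulation does not prescribe, such a summary could be serialised along these lines:

```typescript
// Rough illustration of a machine-readable summary of terms and conditions
// under Article 12(2a); the structure is assumed, not prescribed.
interface TermsSummaryItem {
  element: string;        // one of the main elements of the information requirements
  optional: boolean;      // whether the clause is optional
  optOutUrl?: string;     // where optional, an easy way to opt out
}

interface TermsSummary {
  language: string;                 // clear, user-friendly and unambiguous language
  lastUpdated: string;              // ISO 8601 date
  items: TermsSummaryItem[];
  remediesAndRedress: string[];     // the remedies and redress mechanisms available
}
```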
Amendment 191
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b.  Providers of intermediary services may use graphical elements such as icons or images to illustrate the main elements of the information requirements.
Amendment 192
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c.  Very large online platforms as defined in Article 25 shall publish their terms and conditions in the official languages of all Member States in which they offer their services.
Amendment 193
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2d.  Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.
Amendment 538
Proposal for a regulation
Article 12 – paragraph 2 e (new)
2e.  Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights enshrined in the Charter.
Amendment 539
Proposal for a regulation
Article 12 – paragraph 2 f (new)
2f.  Terms that do not comply with this Article shall not be binding on recipients.
Amendment 194
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1.  Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
1.  Providers of intermediary services shall publish in a standardised and machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
Amendment 195
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a)  the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;
(a)  the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the order;
Amendment 196
Proposal for a regulation
Article 13 – paragraph 1 – point a a (new)
(aa)  where applicable, the complete number of content moderators allocated for each official language per Member State, and a qualitative description of whether and how automated tools for content moderation are used in each official language;
Amendment 197
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b)  the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
(b)  the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action; providers of intermediary services may add additional information as to the reasons for the average time for taking the action;
Amendment 198
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c)  the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
(c)  meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as, where applicable, measures taken to provide training and assistance to members of staff who are engaged in content moderation, and to ensure that non-infringing content is not affected;
Amendment 199
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d)  the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
(d)  the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 200
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a.  The information provided shall be presented per Member State in which services are offered and in the Union as a whole.
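Because Article 13(1) requires the reports to be published in a standardised, machine-readable format, and paragraph 1a requires a per-Member-State breakdown, a report naturally decomposes into records keyed by Member State. The sketch below is one hypothetical shape for such a record; none of the field names come from the text:

```typescript
// Hypothetical record for a standardised, machine-readable transparency
// report under Article 13; all field names are illustrative assumptions.
interface OrderStatistics {
  illegalContentType: string;             // categorisation by type of illegal content
  count: number;
  averageHoursToInformAuthority: number;  // time to inform the issuing authority of receipt and effect
}

interface NoticeStatistics {
  allegedIllegalContentType: string;
  count: number;
  countFromTrustedFlaggers: number;
  actionBasis: "law" | "terms_and_conditions";
  averageActionTimeHours: number;
  medianActionTimeHours: number;
}

interface TransparencyReportRecord {
  scope: string;                          // two-letter Member State code, or "EU" for the Union as a whole
  ordersReceived: OrderStatistics[];      // point (a)
  moderatorsPerOfficialLanguage?: Record<string, number>; // point (aa), where applicable
  notices: NoticeStatistics[];            // point (b)
  ownInitiativeMeasures?: string;         // point (c): information about own-initiative moderation
  complaints: {                           // point (d)
    count: number;
    averageDecisionTimeHours: number;
    medianDecisionTimeHours: number;
    reversedDecisions: number;
  };
}
```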
Amendment 201
Proposal for a regulation
Article 13 – paragraph 2
2.  Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2.  Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, which do not also qualify as very large online platforms.
Amendment 202
Proposal for a regulation
Article 13 a (new)
Article 13a
Online interface design and organisation
1.   Providers of intermediary services shall not use the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice. In particular, providers of intermediary services shall refrain from:
(a)   giving more visual prominence to any of the consent options when asking the recipient of the service for a decision;
(b)   repeatedly requesting that a recipient of the service consents to data processing, where such consent has been refused, pursuant to Article 7(3) of Regulation (EU) 2016/679, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience;
(c)   urging a recipient of the service to change a setting or configuration of the service after the recipient has already made a choice;
(d)   making the procedure of terminating a service significantly more cumbersome than signing up to it; or
(e)   requesting consent where the recipient of the service exercises his or her right to object by automated means using technical specifications, in line with Article 21(5) of Regulation (EU) 2016/679.
This paragraph shall be without prejudice to Regulation (EU) 2016/679.
2.   The Commission is empowered to adopt a delegated act to update the list of practices referred to in paragraph 1.
3.   Where applicable, providers of intermediary services shall adapt their design features to ensure a high level of privacy, safety, and security by design for minors.
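Point (a) of paragraph 1 leaves "visual prominence" undefined; how a provider operationalises it is a design decision. Purely as an illustration, and assuming that prominence can be approximated by a handful of measurable properties, a provider might self-check a consent dialog as follows:

```typescript
// Illustrative self-check that the options of a consent prompt are given
// equal visual prominence; the measured properties are assumptions, since
// the Regulation does not define "visual prominence".
interface ConsentOption {
  label: string;          // e.g. "Accept" or "Refuse"
  fontSizePx: number;
  contrastRatio: number;  // colour contrast against the background
  preSelected: boolean;   // a pre-selected option would also steer the choice
}

function equallyProminent(options: ConsentOption[]): boolean {
  if (options.length < 2) return false; // a real choice needs at least two options
  const [reference, ...others] = options;
  return (
    options.every(o => !o.preSelected) &&
    others.every(
      o =>
        o.fontSizePx === reference.fontSizePx &&
        o.contrastRatio === reference.contrastRatio
    )
  );
}

// Example: an "Accept" button larger than "Refuse" would fail this check.
const compliant = equallyProminent([
  { label: "Accept", fontSizePx: 14, contrastRatio: 4.5, preSelected: false },
  { label: "Refuse", fontSizePx: 14, contrastRatio: 4.5, preSelected: false },
]);
```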
Amendment 203
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2.  The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2.  The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements:
Amendment 204
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(aa)  where possible, evidence that substantiates the claim;
Amendment 205
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b)  a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
(b)  where relevant, a clear indication of the exact electronic location of that information, for example, the exact URL or URLs, or, where necessary, additional information enabling the identification of the illegal content as applicable to the type of content and to the specific type of hosting service;
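The elements listed in Article 14(2) again describe a structured submission. A hedged sketch of a notice record covering the elements discussed above, with illustrative field names that the Regulation does not prescribe, could be:

```typescript
// Hypothetical model of a notice submitted under Article 14; field names
// are illustrative and not prescribed by the Regulation.
interface Notice {
  explanation: string;              // why the submitter considers the information illegal
  evidence?: string[];              // where possible, material substantiating the claim (point (aa))
  exactLocations?: string[];        // where relevant, e.g. the exact URL or URLs (point (b))
  otherIdentifyingInfo?: string;    // alternative identification, as applicable to the
                                    // type of content and type of hosting service
  submitterName?: string;           // optional; if given together with an email address,
  submitterEmail?: string;          // paragraph 4 requires a confirmation of receipt
}
```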
Amendment 206
Proposal for a regulation
Article 14 – paragraph 3
3.  Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
3.  Notices that include the elements referred to in paragraph 2, on the basis of which a diligent hosting service provider is able to establish the illegality of the content in question without conducting a legal or factual examination, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 207
Proposal for a regulation
Article 14 – paragraph 3 a (new)
3a.  Information that has been the subject of a notice shall remain accessible while the assessment of its legality is still pending, without prejudice to the right of providers of hosting services to apply their terms and conditions. Providers of hosting services shall not be held liable for failure to remove notified information, while the assessment of legality is still pending.
Amendment 208
Proposal for a regulation
Article 14 – paragraph 4
4.  Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity.
4.  Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
Amendment 209
Proposal for a regulation
Article 14 – paragraph 5
5.  The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
5.  The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities.
Amendment 210
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5a.  The anonymity of individuals who submitted a notice shall be ensured towards the recipient of the service who provided the content, except in cases of alleged violations of personality rights or of intellectual property rights.
Amendment 211
Proposal for a regulation
Article 14 – paragraph 6
6.  Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
6.  Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator.
Amendment 212
Proposal for a regulation
Article 15 – paragraph 1
1.  Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
1.  Where a provider of hosting services decides to remove, disable access to, demote or to impose other measures with regard to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
This obligation shall not apply where the content is deceptive high-volume commercial content, or where a judicial or law enforcement authority has requested that the recipient not be informed due to an ongoing criminal investigation, until that investigation is closed.
Amendment 213
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a)  whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access;
(a)  whether the action entails either the removal, the disabling of access, the demotion of, or imposes other measures with regard to information and, where relevant, the territorial scope of the action and its duration, including, where an action was taken pursuant to Article 14, an explanation about why the action did not exceed what was strictly necessary to achieve its purpose;
Amendment 214
Proposal for a regulation
Article 15 – paragraph 2 – point b
(b)  the facts and circumstances relied on in taking the decision, including where relevant whether the decision was taken pursuant to a notice submitted in accordance with Article 14;
(b)  the facts and circumstances relied on in taking the action, including, where relevant, whether the action was taken pursuant to a notice submitted in accordance with Article 14, on the basis of voluntary own-initiative investigations or pursuant to an order issued in accordance with Article 8, and, where appropriate, the identity of the notifier;
Amendment 215
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c)  where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
(c)  where applicable, information on the use made of automated means in taking the action, including where the action was taken in respect of content detected or identified using automated means;
Amendment 216
Proposal for a regulation
Article 15 – paragraph 2 – point d
(d)  where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
(d)  where the action concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
Amendment 217
Proposal for a regulation
Article 15 – paragraph 2 – point e
(e)  where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;
(e)  where the action is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;
Amendment 218
Proposal for a regulation
Article 15 – paragraph 2 – point f
(f)  information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
(f)  clear, user-friendly information on the redress possibilities available to the recipient of the service in respect of the action, in particular, where applicable through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
Amendment 219
Proposal for a regulation
Article 15 – paragraph 4
4.  Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data.
4.  Providers of hosting services shall publish at least once a year the actions and the statements of reasons, referred to in paragraph 1 in a publicly accessible machine-readable database managed and published by the Commission. That information shall not contain personal data.
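Since paragraph 4 requires the statements of reasons to be published in a machine-readable database free of personal data, each entry must capture the elements of paragraph 2 in structured form. One hypothetical, non-normative record layout:

```typescript
// Hypothetical entry for the public, machine-readable database of
// statements of reasons under Article 15(4); no personal data, and all
// field names are assumptions.
type Action = "removal" | "disabling_of_access" | "demotion" | "other_measure";

interface StatementOfReasonsEntry {
  action: Action;                     // point (a)
  territorialScope?: string[];        // where relevant
  duration?: string;                  // where relevant, e.g. an ISO 8601 duration
  factsAndCircumstances: string;      // point (b): incl. whether triggered by an Article 14
                                      // notice, an own-initiative investigation or an Article 8 order
  automatedMeansUsed: boolean;        // point (c), where applicable
  legalGround?: string;               // point (d), for allegedly illegal content
  contractualGround?: string;         // point (e), for incompatibility with terms and conditions
  redressInformation: string;         // point (f): internal complaint handling,
                                      // out-of-court dispute settlement, judicial redress
}
```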
Amendment 220
Proposal for a regulation
Article 15 a (new)
Article 15 a
Notification of suspicions of criminal offences
1.   Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving an imminent threat to the life or safety of persons has taken place, is taking place or is planned to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide, upon their request, all the relevant information available.
2.   Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and may inform Europol.
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be planned to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. For the purpose of this Article, Member States shall notify the Commission of the list of their competent law enforcement or judicial authorities.
3.   Unless instructed otherwise by the informed authority, the provider of hosting services shall remove or disable the content.
4.   Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
5.   The Commission shall adopt an implementing act setting down a template for notifications under paragraph 1.
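Paragraph 5 envisages a Commission template for these notifications. A speculative sketch of the minimum fields such a template might carry, with all names assumed rather than prescribed:

```typescript
// Speculative sketch of the notification template envisaged in Article 15a(5);
// every field name is assumed, not prescribed.
interface SeriousOffenceNotification {
  suspectedOffence: string;               // the serious criminal offence suspected
  threatTo: "life" | "safety_of_persons"; // the imminent threat involved
  status: "has_taken_place" | "is_taking_place" | "is_planned";
  memberStatesConcerned: string[];        // where identifiable; otherwise the Member State of
                                          // establishment or of the legal representative
  informedEuropol?: boolean;              // the provider may also inform Europol
  relevantInformation: string;            // all relevant information available, upon request
}
```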
Amendment 221
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
1.   This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which do not qualify as very large online platforms as defined by Article 25 of this Regulation.
2.   Providers of intermediary services may submit an application accompanied by a justification for a waiver from the requirements of this section provided that they:
(a)   do not present significant systemic risks and have limited exposure to illegal content; and
(b)   qualify as not-for-profit or qualify as a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC.
3.   The application shall be submitted to the Digital Services Coordinator of establishment who shall conduct a preliminary assessment. The Digital Services Coordinator of establishment shall transmit to the Commission the application accompanied by its assessment and where applicable, a recommendation on the Commission’s decision. The Commission shall examine such an application and, after consulting the Board, may issue a total or a partial waiver from the requirements of this Section.
4.   Where the Commission grants such a waiver, it shall monitor the use of the waiver by the provider of intermediary services to ensure that the conditions for use of the waiver are respected.
5.   Upon the request of the Board, the Digital Services Coordinator of establishment or the provider, or on its own initiative, the Commission may review or revoke the waiver in whole or in part.
6.   The Commission shall maintain a list of all waivers issued and their conditions and shall make the list publicly available.
7.   The Commission shall be empowered to adopt a delegated act in accordance with Article 69 as to the process and procedure for the implementation of the waiver system in relation to this Article.
Amendment 222
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a)  decisions to remove or disable access to the information;
(a)  decisions to remove, demote, disable access to or impose other measures that restrict visibility, availability or accessibility of the information;
Amendment 223
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b)  decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients;
(b)  decisions to suspend or terminate, or limit the provision of the service, in whole or in part, to the recipients;
Amendment 224
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca)  decisions to restrict the ability to monetise content provided by the recipients.
Amendment 225
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a.  The period of at least six months as set out in paragraph 1 shall be considered to start on the day on which the recipient of the service is informed about the decision in accordance with Article 15.
Amendment 226
Proposal for a regulation
Article 17 – paragraph 2
2.  Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2.  Online platforms shall ensure that their internal complaint-handling systems are easy to access and user-friendly, including for persons with disabilities and minors, non-discriminatory, and that they enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner.
Amendment 227
Proposal for a regulation
Article 17 – paragraph 3
3.  Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
3.  Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner and within ten working days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 228
Proposal for a regulation
Article 17 – paragraph 5
5.  Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
5.  Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Online platforms shall ensure that decisions are taken by qualified staff.
Amendment 229
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a.  Recipients of the service shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned.
Amendment 230
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1.  Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
1.  Recipients of the service addressed by the decisions referred to in Article 17(1), taken by the online platform on the grounds that the information provided by the recipients is illegal content or incompatible with its terms and conditions, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article.
Amendment 231
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a.  Both parties shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. The possibility to select any out-of-court dispute settlement body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
Amendment 232
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
2.  The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions:
2.  The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which may be renewed, where the body and the persons in charge of the out-of-court dispute settlement body have demonstrated that it meets all of the following conditions:
Amendment 233
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a)  it is impartial and independent of online platforms and recipients of the service provided by the online platforms;
(a)  it is independent, including financially independent, and impartial towards online platforms, recipients of the service provided by the online platforms and towards individuals or entities that have submitted notices;
Amendment 234
Proposal for a regulation
Article 18 – paragraph 2 – point b a (new)
(ba)  its members are remunerated in a way that is not linked to the outcome of the procedure;
Amendment 235
Proposal for a regulation
Article 18 – paragraph 2 – point b b (new)
(bb)  the natural persons in charge of dispute resolution commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended, and have not worked for such an organisation for two years prior to taking up this role;
Amendment 236
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c)  the dispute settlement is easily accessible through electronic communication technology;
(c)  the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online;
Amendment 237
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e)  the dispute settlement takes place in accordance with clear and fair rules of procedure.
(e)  the dispute settlement takes place in accordance with clear and fair rules of procedure which are clearly visible and easily and publicly accessible.
Amendment 238
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a.  The Digital Services Coordinator shall reassess on a yearly basis whether the certified out-of-court dispute settlement body continues to fulfil the conditions referred to in paragraph 2. If this is not the case, the Digital Services Coordinator shall revoke the body’s status as an out-of-court dispute settlement body.
Amendment 239
Proposal for a regulation
Article 18 – paragraph 2 b (new)
2b.  The Digital Services Coordinator shall draw up a report every two years listing the number of complaints the out-of-court dispute settlement body has received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes. The report shall in particular:
(a)  identify best practices of the out-of-court dispute settlement bodies;
(b)  report, where appropriate, on any shortcomings, supported by statistics, that hinder the functioning of the out-of-court dispute settlement bodies for both domestic and cross-border disputes;
(c)  make recommendations on how to improve the effective and efficient functioning of the out-of-court dispute settlement bodies, where appropriate.
Amendment 240
Proposal for a regulation
Article 18 – paragraph 2 c (new)
2c.  Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time and no later than 90 calendar days after the date on which the certified body has received the complaint. The procedure shall be considered terminated on the date on which the certified body has made the decision of out-of-court dispute settlement procedure available.
Amendment 241
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3.  If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
3.  If the body decides the dispute in favour of the recipient of the service, or of individuals or entities mandated under Article 68 that have submitted notices, the online platform shall reimburse the recipient or those individuals or entities for any fees and other reasonable expenses that they have paid or are to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, and the body does not find that the recipient acted in bad faith in the dispute, the recipient or the individuals or entities that have submitted notices shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
Amendment 242
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof.
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof for online platforms. Out-of-court dispute settlement procedures shall be free of charge or available at a nominal fee for the recipient of the service.
Amendment 243
Proposal for a regulation
Article 18 – paragraph 5
5.  Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
5.  Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
Amendment 244
Proposal for a regulation
Article 19 – paragraph 1
1.  Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
1.  Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and expeditiously, taking into account due process.
Amendment 245
Proposal for a regulation
Article 19 – paragraph 1 a (new)
1a.  Online platforms shall take the necessary technical and organisational measures to ensure that trusted flaggers can issue correction notices of incorrect removal, restriction or disabling access to content, or of suspensions or terminations of accounts, and that those notices to restore information are processed and decided upon with priority and without delay.
Amendment 246
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2.  The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
2.  The status of trusted flaggers under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
Amendment 247
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c)  it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.
(c)  it carries out its activities for the purposes of submitting notices in an accurate and objective manner.
Amendment 248
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca)  it has a transparent funding structure, including publishing the sources and amounts of all revenue annually;
Amendment 249
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb)  it publishes, at least once a year, clear, easily comprehensible, detailed and standardised reports on all notices submitted in accordance with Article 14 during the relevant period. The reports shall list:
—  notices categorised by the identity of the provider of hosting services;
—  the type of content notified;
—  the specific legal provisions allegedly breached by the content notified;
—  the action taken by the provider;
—  any potential conflicts of interest and sources of funding, and an explanation of the procedures in place to ensure that the trusted flagger retains its independence.
The reports referred to in point (cb) shall be sent to the Commission, which shall make them publicly available.
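Point (cb) above lists the contents of those standardised reports without fixing a format. As a hedged sketch only, one report entry might be structured as follows; every field name is an assumption.

    # Illustrative sketch of one entry in a trusted flagger's standardised
    # annual report under point (cb). Field names are assumptions; the
    # Regulation lists the required contents but not their format.
    report_entry = {
        "hosting_provider": "example-platform.eu",      # notices per provider
        "content_type": "counterfeit goods listings",   # type of content notified
        "legal_provisions_invoked": ["applicable national trade mark law"],
        "notices_submitted": 128,
        "action_taken_by_provider": {"removed": 117, "rejected": 11},
        "conflicts_of_interest": "none declared",
        "funding_sources": ["membership fees", "public grant"],
    }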
Amendment 250
Proposal for a regulation
Article 19 – paragraph 3
3.  Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2.
3.  Digital Services Coordinators shall award the trusted flagger status for a period of two years, upon which the status may be renewed where the trusted flagger concerned continues to meet the requirements of this Regulation. The Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or from which they have revoked that status in accordance with paragraph 6. The Digital Services Coordinator of the Member State of establishment of the platform shall engage in dialogue with platforms and stakeholders to maintain the accuracy and efficacy of the trusted flagger system.
Amendment 251
Proposal for a regulation
Article 19 – paragraph 4
4.  The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated.
4.  The Commission shall publish the information referred to in paragraph 3 in a publicly available database in an easily accessible and machine-readable format and keep the database updated.
Amendment 252
Proposal for a regulation
Article 19 – paragraph 5
5.  Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
5.  Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving that information from the online platform, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation.
Amendment 253
Proposal for a regulation
Article 19 – paragraph 6
6.  The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
6.  The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis information received from third parties, including the information provided by an online platform pursuant to paragraph 5, carried out without undue delay, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 254
Proposal for a regulation
Article 19 – paragraph 7
7.  The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.
7.  The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
Amendment 255
Proposal for a regulation
Article 19 a (new)
Article 19 a
Accessibility requirements for online platforms
1.   Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
2.   Providers of online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in an accessible manner for persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation.
3.   Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find, easy to understand and accessible to persons with disabilities.
4.   Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
5.   In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements.
6.   Providers of online platforms shall cooperate with the competent authority, at the request of that authority, on any action taken to bring the service into compliance with the applicable accessibility requirements.
7.   Online platforms which are in conformity with harmonised standards or parts thereof derived from Directive (EU) 2019/882 the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
8.   Online platforms which are in conformity with the technical specifications or parts thereof adopted for the Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
Amendment 256
Proposal for a regulation
Article 20 – paragraph 1
1.  Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
1.  Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content whose illegality can be established without conducting a legal or factual examination, or for which they have received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned.
Amendment 257
Proposal for a regulation
Article 20 – paragraph 2
2.  Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2.  Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints that are manifestly unfounded.
Amendment 258
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3.  Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
3.  When deciding on the suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of the online platform. Those circumstances shall include at least the following:
Amendment 259
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a)  the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
(a)  the absolute numbers of items of illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 260
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d)  the intention of the recipient, individual, entity or complainant.
(d)  where identifiable, the intention of the recipient, individual, entity or complainant;
Amendment 261
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da)  whether a notice was submitted by an individual user or by an entity or persons with specific expertise related to the content in question or following the use of an automated content recognition system.
Amendment 262
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a.  Suspensions referred to in paragraphs 1 and 2 may be declared permanent where:
(a)  there are compelling reasons of law or public policy, including ongoing criminal investigations;
(b)  the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts;
(c)  a trader has repeatedly offered goods and services that do not comply with Union or national law;
(d)  the items removed were related to serious crimes.
Amendment 263
Proposal for a regulation
Article 20 – paragraph 4
4.  Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
4.  Providers of online platforms shall set out, in a clear, user-friendly and detailed manner, with due regard to their obligations under Article 12(2), their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 264
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1.  Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
1.  Online platforms allowing consumers to conclude distance contracts with traders shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, they have been provided with the following information:
Amendment 265
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d)  the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law;
(d)  the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law, including in the area of product safety;
__________________
__________________
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 266
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f)  a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law.
(f)  a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against available databases, such as the Union Rapid Alert System for dangerous non-food products (RAPEX);
Amendment 267
Proposal for a regulation
Article 22 – paragraph 1 – point f a (new)
(fa)  the type of products or services the trader intends to offer on the online platform.
Amendment 268
Proposal for a regulation
Article 22 – paragraph 2
2.  The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member States or the Union or through requests to the trader to provide supporting documents from reliable sources.
2.  The online platform allowing consumers to conclude distance contracts with traders shall, upon receiving that information and before allowing the display of the product or service on its online interface, and until the end of the contractual relationship, make best efforts to assess whether the information referred to in points (a) to (fa) of paragraph 1 is reliable and complete. The online platform shall make best efforts to check the information provided by the trader through the use of any freely accessible official online database or online interface made available by an authorised administrator, a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
No later than one year after the entry into force of this Regulation, the Commission shall publish the list of online databases and online interfaces mentioned in the paragraph above and keep it up to date. The obligations for online platforms referred to in paragraphs 1 and 2 shall apply with regard to new and existing traders.
Amendment 269
Proposal for a regulation
Article 22 – paragraph 2 a (new)
2a.  The online platform shall make best efforts to identify and prevent the dissemination, by traders using its service, of offers for products or services which do not comply with Union or national law, through measures such as random checks on the products and services offered to consumers, in addition to the obligations referred to in paragraphs 1 and 2 of this Article.
Amendment 270
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3.  Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
3.  Where the online platform obtains sufficient indications or has reasons to believe that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 271
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the trader fails to correct or complete that information, the online platform shall suspend the provision of its service to the trader until the request is complied with.
Where the trader fails to correct or complete that information, the online platform shall swiftly suspend the provision of its service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
Amendment 272
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a.  If an online platform rejects an application for services or suspends services to a trader, the trader shall have recourse to the mechanisms under Article 17 and Article 43 of this Regulation.
Amendment 273
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b.  Online platforms allowing consumers to conclude contracts with traders shall ensure that the identity, such as the trademark or logo, of the business user providing content, goods or services is clearly visible alongside the content, goods or services offered. For this purpose, the online platform shall establish a standardised interface for business users.
Amendment 274
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c.  Traders shall be solely liable for the accuracy of the information provided and shall inform without delay the online platform of any changes to the information provided.
Amendment 275
Proposal for a regulation
Article 22 – paragraph 4
4.  The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information.
4.  The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information no later than six months after the final conclusion of a distance contract.
Amendment 276
Proposal for a regulation
Article 22 – paragraph 6
6.  The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
6.  The online platform shall make the information referred to in points (a), (d), (e), (f) and (fa) of paragraph 1 available to the recipients of the service in a clear, easily accessible and comprehensible manner, in accordance with the accessibility requirements set out in Annex I to Directive (EU) 2019/882.
Amendment 277
Proposal for a regulation
Article 22 a (new)
Article 22a
Obligation to inform consumers and authorities about illegal products and services
1.   Where an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that a product or a service offered by a trader on the interface of that platform is illegal with regard to applicable requirements in Union or national law, it shall:
(a)   remove the illegal product or service from its interface expeditiously and, where appropriate, inform the relevant authorities, such as the market surveillance authority or the customs authority, of the decision taken;
(b)   where the online platform has the contact details of the recipients of the service, inform those recipients who acquired the product or service about the illegality, the identity of the trader and options for seeking redress;
(c)   compile and make publicly available, through application programming interfaces, a repository containing information about illegal products and services removed from its platform in the past twelve months, along with information about the trader concerned and options for seeking redress.
2.   Online platforms allowing consumers to conclude distance contracts with traders shall maintain an internal database of illegal products and services removed and/or recipients suspended pursuant to Article 20.
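Point (c) of paragraph 1 requires the repository to be exposed through application programming interfaces without prescribing its shape. A minimal sketch of one repository entry, under the assumption of a simple key-value format with hypothetical names and values:

    # Illustrative sketch of one entry in the Article 22a(1)(c) repository of
    # illegal products and services removed in the past twelve months, as an
    # API might serve it. All field names and values are assumptions.
    removed_offer = {
        "offer_id": "offer-20211108-0042",
        "product_or_service": "toy with non-compliant detachable small parts",
        "ground_of_illegality": "non-compliance with Union product safety rules",
        "removal_date": "2021-11-08",
        "trader": {"name": "Example Trader Ltd", "member_state": "DE"},
        "redress_options": "internal complaint (Article 17); "
                           "out-of-court settlement (Article 18)",
    }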
Amendment 278
Proposal for a regulation
Article 23 – paragraph 1 – point a a (new)
(aa)  the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed;
Amendment 279
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b)  the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
(b)  the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 280
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
(ca)  the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions.
Amendment 281
Proposal for a regulation
Article 23 – paragraph 2
2.  Online platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2.  Online platforms shall publish, at least once every twelve months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
Amendment 282
Proposal for a regulation
Article 23 – paragraph 2 a (new)
2a.  Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in connection with the exercise of their supervisory powers.
Amendment 283
Proposal for a regulation
Article 23 – paragraph 4
4.  The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
4.  The Commission shall adopt implementing acts to establish a set of key performance indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
Amendment 284
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time:
1.   Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise, and unambiguous manner and in real time:
Amendment 285
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a)  that the information displayed is an advertisement;
(a)  that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
Amendment 286
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
(ba)  the natural or legal person who finances the advertisement where this person is different from the natural or legal person referred to in point (b);
Amendment 287
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c)  meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
(c)   clear, meaningful, and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed, and where applicable about how to change those parameters.
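Taken together, these points amount to a per-advertisement disclosure shown in real time alongside each ad. A hedged sketch of such a disclosure record follows; every field name is an assumption, not a format prescribed by the Regulation.

    # Illustrative sketch of the real-time, per-advertisement disclosure
    # required by Article 24(1). Field names are assumptions.
    ad_disclosure = {
        "is_advertisement": True,         # point (a): prominent, harmonised marking
        "on_behalf_of": "Brand A",        # point (b): person on whose behalf it is shown
        "financed_by": "Holding B",       # point (ba): financer, if different
        "targeting_parameters": {         # point (c): parameters used
            "language": "de",
            "interest_segment": "outdoor sports",
        },
        "change_parameters_at": "https://example-platform.eu/ad-preferences",
    }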
Amendment 499
Proposal for a regulation
Article 24 – paragraph 1 a (new)
1a.  Online platforms shall ensure that recipients of the service can easily make an informed choice on whether to consent, as defined in Article 4(11) and Article 7 of Regulation (EU) 2016/679, to the processing of their personal data for the purposes of advertising, by providing them with meaningful information, including information about how their data will be monetised. Online platforms shall ensure that refusing consent is no more difficult or time-consuming for the recipient than giving consent. Where recipients refuse consent, or have withdrawn consent, they shall be given other fair and reasonable options to access the online platform.
Amendment 500
Proposal for a regulation
Article 24 – paragraph 1 b (new)
1b.  Targeting or amplification techniques that process, reveal or infer personal data of minors or personal data referred to in Article 9(1) of Regulation (EU) 2016/679 for the purpose of displaying advertisements are prohibited.
Amendment 290
Proposal for a regulation
Article 24 a (new)
Article 24a
Recommender system transparency
1.   Online platforms shall set out in their terms and conditions, and via a designated online resource that can be directly reached and easily found from the online platform’s online interface when content is recommended, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipient of the service to modify or influence those main parameters that they have made available.
2.   The main parameters referred to in paragraph 1 shall include, at a minimum:
(a)   the main criteria used by the relevant system which individually or collectively are most significant in determining recommendations;
(b)   the relative importance of those parameters;
(c)   what objectives the relevant system has been optimised for; and
(d)   if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
The requirements set out in paragraph 2 shall be without prejudice to rules on protection of trade secrets and intellectual property rights.
3.   Where several options are available pursuant to paragraph 1, online platforms shall provide a clear and easily accessible function on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
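As a sketch only, the paragraph 2 disclosure and the paragraph 3 selection function could be represented as follows; the parameter names, weights and options are illustrative assumptions, not values required by the Regulation.

    # Illustrative sketch of an Article 24a recommender-system disclosure and
    # a user-selectable option under paragraph 3. All names, weights and
    # options are assumptions.
    recommender_disclosure = {
        "main_criteria": ["topical relevance", "recency", "past interactions"],  # 2(a)
        "relative_importance": {"topical relevance": 0.5,
                                "recency": 0.3,
                                "past interactions": 0.2},                       # 2(b)
        "optimised_for": "predicted time spent on the platform",                 # 2(c)
        "role_of_user_behaviour": "clicks and watch time feed the ranking",      # 2(d)
        "selectable_options": ["personalised", "chronological"],                 # para 3
    }

    # A recipient may select and modify their preferred option at any time.
    selected_option = "chronological"
    assert selected_option in recommender_disclosure["selectable_options"]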
Amendment 291
Proposal for a regulation
Article 24 b (new)
Article 24b
Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
(a)   that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;
(b)   professional human content moderation, trained to identify image-based sexual abuse, including content having a high probability of being illegal;
(c)   the accessibility of a qualified notification procedure whereby, in addition to the mechanism referred to in Article 14, individuals may notify the platform with the claim that image material depicting them or purporting to depict them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be suspended without undue delay.
Amendment 292
Proposal for a regulation
Article 25 – paragraph 1
1.  This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
1.  This Section shall apply to online platforms which:
(a)   provide, for at least four consecutive months, their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. Such a methodology shall take into account, in particular:
(i)   the number of active recipients shall be based on each service individually;
(ii)   active recipients connected on multiple devices are counted only once;
(iii)   indirect use of the service, via a third party or linking, shall not be counted;
(iv)   where an online platform is hosted by another provider of intermediary services, the active recipients shall be assigned solely to the online platform closest to the recipient;
(v)   automated interactions, accounts or data scans by a non-human (“bots”) shall not be included.
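Points (i) to (v) above read as counting rules. The sketch below renders them, under stated assumptions about the data model, as a deduplicating count per service that ignores indirect use and bots; it is an illustration only, since the binding methodology is left to the delegated acts.

    # Illustrative sketch of the counting rules in points (i) to (v).
    # The data model and threshold are assumptions; the binding methodology
    # is set by delegated acts under Article 25(3). Point (iv), assigning
    # recipients to the platform closest to them, is assumed handled upstream.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Access:
        service: str       # (i) each service is counted individually
        recipient_id: str  # (ii) one recipient, however many devices
        direct: bool       # (iii) indirect use via third parties is excluded
        is_bot: bool       # (v) automated interactions ("bots") are excluded

    def monthly_active_recipients(accesses, service):
        # a set deduplicates recipients connected on multiple devices (ii)
        return len({a.recipient_id for a in accesses
                    if a.service == service and a.direct and not a.is_bot})

    VLOP_THRESHOLD = 45_000_000  # average monthly active recipients, Article 25(1)(a)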
Amendment 293
Proposal for a regulation
Article 25 – paragraph 3
3.  The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
3.  The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1(a). The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
Amendment 294
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1.  Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
1.  Very large online platforms shall effectively and diligently identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, and in any event before launching new services, the probability and severity of any significant systemic risks stemming from the design, algorithmic systems, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall take into account risks per Member State in which services are offered and in the Union as a whole, in particular risks specific to a given language or region. This risk assessment shall be specific to their services and activities, including technology design and business-model choices, and shall include the following systemic risks:
Amendment 295
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a)  the dissemination of illegal content through their services;
(a)  the dissemination of illegal content through their services or of content that is in breach of their terms and conditions;
Amendment 296
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b)  any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
(b)  any actual and foreseeable negative effects for the exercise of the fundamental rights, including for consumer protection, to respect for human dignity, private and family life, the protection of personal data and the freedom of expression and information, as well as to the freedom and the pluralism of the media, the prohibition of discrimination, the right to gender equality, and the rights of the child, as enshrined in Articles 1, 7, 8, 11, 21, 23, 24 and 38 of the Charter respectively;
Amendment 297
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c)  intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
(c)  any malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, or risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach of their terms and conditions or of any other content with an actual or foreseeable negative effect on the protection of minors and of other vulnerable groups of recipients of the service, on democratic values, media freedom, freedom of expression and civic discourse, or actual or foreseeable effects related to electoral processes and public security;
Amendment 298
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
(ca)  any actual and foreseeable negative effects on the protection of public health, as well as behavioural addictions or other serious negative consequences for the person’s physical, mental, social and financial well-being.
Amendment 299
Proposal for a regulation
Article 26 – paragraph 2
2.  When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2.  When conducting risk assessments, very large online platforms shall take into account, in particular, whether and how their content moderation systems, terms and conditions, community standards, algorithmic systems, recommender systems and systems for selecting and displaying advertisement, as well as the underlying data collection, processing and profiling, influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 300
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a.  When conducting risk assessments, very large online platforms shall consult, where appropriate, representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
Amendment 301
Proposal for a regulation
Article 26 – paragraph 2 b (new)
2b.  The supporting documents of the risk assessment shall be communicated to the Digital Services Coordinator of establishment and to the Commission.
Amendment 302
Proposal for a regulation
Article 26 – paragraph 2 c (new)
2c.  The obligations referred to in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation.
Amendment 303
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1.  Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
1.  Very large online platforms shall put in place reasonable, transparent, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 304
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a)  adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
(a)  adapting content moderation, algorithmic systems, or recommender systems and online interfaces, their decision-making processes, the design, the features or functioning of their services, their advertising model or their terms and conditions;
Amendment 305
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(aa)  ensuring appropriate resources to deal with notices and internal complaints, including appropriate technical and operational measures or capacities;
Amendment 306
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b)  targeted measures aimed at limiting the display of advertisements in association with the service they provide;
(b)  targeted measures aimed at limiting the display of advertisements in association with the service they provide, or the alternative placement and display of public service advertisements or other related factual information;
Amendment 307
Proposal for a regulation
Article 27 – paragraph 1 – point b a (new)
(ba)  where relevant, targeted measures aimed at adapting online interfaces and features to protect minors;
Amendment 308
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c)  reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk;
(c)  reinforcing the internal processes, and resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
Amendment 309
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a.  Very large online platforms shall, where appropriate, design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations. Where no such involvement is foreseen, this shall be made clear in the transparency report referred to in Article 33.
Amendment 310
Proposal for a regulation
Article 27 – paragraph 1 b (new)
1b.  Very large online platforms shall provide a detailed list of the risk mitigation measures taken and their justification to the independent auditors in order to prepare the audit report referred to in Article 28.
Amendment 311
Proposal for a regulation
Article 27 – paragraph 1 c (new)
1c.  The Commission shall evaluate the implementation and effectiveness of mitigating measures undertaken by very large online platforms referred to in Article 27(1) and, where necessary, may issue recommendations.
Amendment 312
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2.  The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following:
2.  The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:
Amendment 313
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a)  identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33;
(a)  identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 30, 31 and 33;
Amendment 314
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 a (new)
The reports shall be presented per Member State in which the systemic risks occurred and for the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union.
Amendment 315
Proposal for a regulation
Article 27 – paragraph 3
3.  The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
3.  The Commission, in cooperation with the Digital Services Coordinators, and following public consultation, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved.
Amendment 316
Proposal for a regulation
Article 27 – paragraph 3 a (new)
3a.  The requirement to put in place mitigation measures shall not lead to a general monitoring obligation or active fact-finding obligations.
Amendment 317
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1.  Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
1.  Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 318
Proposal for a regulation
Article 28 – paragraph 1 a (new)
1a.  Very large online platforms shall ensure auditors have access to all relevant data necessary to perform the audit properly.
Amendment 319
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2.  Audits performed pursuant to paragraph 1 shall be performed by organisations which:
2.  Audits performed pursuant to paragraph 1 shall be performed by organisations which have been recognised and vetted by the Commission and which:
Amendment 320
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a)  are independent from the very large online platform concerned;
(a)  are legally and financially independent from, and do not have conflicts of interest with, the very large online platform concerned and other very large online platforms;
Amendment 321
Proposal for a regulation
Article 28 – paragraph 2 – point a a (new)
(aa)  auditors and their employees have not provided any other service to the very large online platform audited in the 12 months before the audit and commit not to work for the very large online platform audited, or for a professional organisation or business association of which the platform is a member, for 12 months after their position in the auditing organisation has ended;
Amendment 322
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
3.  The organisations that perform the audits shall establish an audit report for each audit. The report shall be in writing and include at least the following:
3.  The organisations that perform the audits shall establish an audit report for each audit subject as referred to in paragraph 1. The report shall be in writing and include at least the following:
Amendment 323
Proposal for a regulation
Article 28 – paragraph 3 – point b a (new)
(ba)  a declaration of interests;
Amendment 324
Proposal for a regulation
Article 28 – paragraph 3 – point d
(d)  a description of the main findings drawn from the audit;
(d)  a description of the main findings drawn from the audit and a summary of the main findings;
Amendment 325
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
(da)  a description of the third parties consulted as part of the audit;
Amendment 326
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa)  a description of specific elements that could not be audited, and an explanation of why these could not be audited;
Amendment 327
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb)  where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
Amendment 328
Proposal for a regulation
Article 28 – paragraph 4 a (new)
4a.  The Commission shall publish and regularly update a list of vetted organisations.
Amendment 329
Proposal for a regulation
Article 28 – paragraph 4 b (new)
4b.  Where a very large online platform receives a positive audit report, it shall be entitled to request from the Commission a seal of excellence.
Amendment 330
Proposal for a regulation
Article 29 – paragraph 1
1.  Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
1.  In addition to the requirements set out in Article 24a, very large online platforms that use recommender systems shall provide at least one recommender system which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679, as well as an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
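Note (illustrative only, not part of the adopted text): the obligation above can be pictured as a per-recipient choice over a set of recommender options, at least one of which involves no profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679. The following minimal Python sketch shows one possible shape for such a selection functionality; every identifier is hypothetical and nothing here is prescribed by the Regulation.

from dataclasses import dataclass, field

@dataclass
class RecommenderOption:
    # One selectable recommender system offered on the online interface.
    option_id: str
    description: str
    uses_profiling: bool  # profiling within the meaning of Art. 4(4) GDPR

@dataclass
class RecipientSettings:
    # Each recipient's currently selected option, modifiable at any time.
    selected: dict = field(default_factory=dict)  # recipient_id -> option_id

OPTIONS = [
    RecommenderOption("chronological", "Most recent first, no profiling", uses_profiling=False),
    RecommenderOption("personalised", "Ranked using interaction history", uses_profiling=True),
]

# At least one option must not be based on profiling.
assert any(not o.uses_profiling for o in OPTIONS)

def select_option(settings: RecipientSettings, recipient_id: str, option_id: str) -> None:
    # Called from the easily accessible functionality on the online interface.
    if option_id not in {o.option_id for o in OPTIONS}:
        raise ValueError(f"Unknown recommender option: {option_id}")
    settings.selected[recipient_id] = option_id

settings = RecipientSettings()
select_option(settings, "recipient-42", "chronological")
print(settings.selected)  # {'recipient-42': 'chronological'}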
Amendment 331
Proposal for a regulation
Article 29 – paragraph 2
2.  Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
deleted
Amendment 332
Proposal for a regulation
Article 30 – paragraph 1
1.  Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
1.  Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, efficient and reliable tools, including application programming interfaces, a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multicriterion queries can be performed per advertiser and across all data points present in the advertisement, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed and shall make reasonable efforts to ensure that the information is accurate and complete.
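Note (illustrative only, not part of the adopted text): a minimal Python sketch of the kind of multicriterion query over the advertisement repository that this paragraph envisages, filtering by advertiser and by a targeting parameter. The field names and sample data are assumptions for illustration, not requirements of the Regulation.

from datetime import date

# Hypothetical repository entries loosely mirroring the data points of Article 30(2).
ADS = [
    {"advertiser": "ACME GmbH", "payer": "ACME GmbH", "content": "Spring sale",
     "target_params": {"age": "18-25", "interest": "sports"},
     "excluded_groups": [], "last_displayed": date(2022, 3, 1)},
    {"advertiser": "Example SA", "payer": "Holding SA", "content": "New phone",
     "target_params": {"region": "FR"}, "excluded_groups": ["under-18"],
     "last_displayed": date(2022, 2, 14)},
]

def query(ads, **criteria):
    # Multicriterion query: every keyword must match either a top-level
    # field or a targeting parameter of the ad record.
    def matches(ad):
        return all(
            ad.get(key) == value or ad["target_params"].get(key) == value
            for key, value in criteria.items()
        )
    return [ad for ad in ads if matches(ad)]

print(query(ADS, advertiser="ACME GmbH", interest="sports"))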
Amendment 333
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a)  the content of the advertisement;
(a)  the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
Amendment 334
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(ba)  the natural or legal person who paid for the advertisement, where that person is different from the one referred to in point (b);
Amendment 335
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d)  whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose;
(d)  whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and, if so, the main parameters used for that purpose, including any parameters used to exclude particular groups;
Amendment 336
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
(da)  where it is disclosed, a copy of the content of commercial communications published on the very large online platforms that are not marketed, sold or arranged by the very large online platform, which have through appropriate channels been declared as such to the very large online platform;
Amendment 337
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
(ea)  cases where the advertisement was removed on the basis of a notice submitted in accordance with Article 14 or an order issued pursuant to Article 8.
Amendment 338
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a.  The Board shall, after consulting vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
Amendment 339
Proposal for a regulation
Article 30 a (new)
Article 30a
Deep fakes
Where a very large online platform becomes aware that a piece of content is a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful (deep fakes), the provider shall label the content in a way that informs the recipient of the service that the content is inauthentic and that is clearly visible to the recipient of the service.
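Note (illustrative only, not part of the adopted text): a minimal sketch, assuming a hypothetical content record, of how a provider might attach the clearly visible inauthenticity label this Article requires; the data model is invented for illustration.

from dataclasses import dataclass

@dataclass
class ContentItem:
    content_id: str
    media_type: str            # "image", "audio" or "video"
    inauthentic_label: bool = False

def label_deep_fake(item: ContentItem) -> ContentItem:
    # Mark content identified as generated or manipulated so the online
    # interface can render a clearly visible notice for the recipient.
    item.inauthentic_label = True
    return item

def render_notice(item: ContentItem) -> str:
    if item.inauthentic_label:
        return f"[{item.media_type.upper()}] This content has been generated or manipulated (deep fake)."
    return ""

item = label_deep_fake(ContentItem("vid-001", "video"))
print(render_notice(item))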
Amendment 340
Proposal for a regulation
Article 31 – paragraph 1
1.  Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
1.  Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request, without delay and within a reasonable period specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access and use that data for those purposes.
Amendment 341
Proposal for a regulation
Article 31 – paragraph 1 a (new)
1a.  The very large online platform shall be obliged to explain the design, logic and functioning of the algorithms if requested by the Digital Services Coordinator of establishment.
Amendment 342
Proposal for a regulation
Article 31 – paragraph 2
2.  Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2.  Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers, vetted not-for-profit bodies, organisations or associations, who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, mitigation and understanding of systemic risks as set out in Article 26(1) and Article 27(1).
Amendment 343
Proposal for a regulation
Article 31 – paragraph 2 a (new)
2a.  Vetted researchers, vetted not-for-profit bodies, organisations and associations shall have access to aggregate numbers for the total views and view rate of content prior to a removal on the basis of orders issued in accordance with Article 8 or content moderation engaged in at the provider’s own initiative and under its terms and conditions.
Amendment 344
Proposal for a regulation
Article 31 – paragraph 3
3.  Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate.
3.  Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria.
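Note (illustrative only, not part of the adopted text): the "easily accessible and user-friendly mechanism to search for multiple criteria" is not specified further. The Python sketch below filters hypothetical moderation records on several criteria and aggregates the pre-removal view counts referred to in paragraph 2a; all field names are assumptions.

# Hypothetical records of content removed under Article 8 orders or under
# the provider's own terms and conditions.
RECORDS = [
    {"item_id": "p1", "basis": "article_8_order", "views_before_removal": 10431,
     "view_rate": 0.031, "member_state": "DE"},
    {"item_id": "p2", "basis": "terms_and_conditions", "views_before_removal": 210,
     "view_rate": 0.002, "member_state": "FR"},
]

def search(records, **criteria):
    # Multi-criteria search: a record matches when every criterion matches.
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

hits = search(RECORDS, basis="article_8_order", member_state="DE")
print(sum(r["views_before_removal"] for r in hits))  # aggregate number only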
Amendment 345
Proposal for a regulation
Article 31 – paragraph 4
4.  In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
4.  In order to be vetted by the Digital Services Coordinator of establishment or the Commission, researchers, not-for-profit bodies, organisations or associations shall:
(a)   be affiliated with academic institutions or civil society organisations representing the public interest and meeting the requirements under Article 68;
(b)   be independent from commercial interests, including from any very large online platform;
(c)   disclose the funding financing the research;
(d)   be independent from any government, administrative or other state bodies, outside the academic institution of affiliation if public;
(e)   have proven records of expertise in the fields related to the risks investigated or related research methodologies; and
(f)   preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 346
Proposal for a regulation
Article 31 – paragraph 4 a (new)
4a.  Where a very large online platform has grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the relevant authority, either the Digital Services Coordinator of establishment or the Commission, which shall decide without undue delay whether access shall be withdrawn, and when and under what conditions access shall be restored.
Amendment 347
Proposal for a regulation
Article 31 – paragraph 4 b (new)
4b.  Where the Digital Services Coordinator of establishment, or the Commission have grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the very large online platform. The very large online platform shall be entitled to withdraw access to data upon receiving the information. The Digital Services Coordinator of establishment, or the Commission shall decide if and when access shall be restored and under what conditions.
Amendment 348
Proposal for a regulation
Article 31 – paragraph 5
5.  The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
5.  The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers or not-for-profit bodies, organisations or associations can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, and maintaining the security of their service.
Amendment 349
Proposal for a regulation
Article 31 – paragraph 6 – point b
(b)  giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
(b)  giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information.
Amendment 350
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a.  Digital Services Coordinators and the Commission shall, once a year, report the following information:
(a)  the number of requests made to them as referred to in paragraphs 1, 2 and 6;
(b)  the number of such requests that have been declined or withdrawn by the Digital Services Coordinator or the Commission and the reasons for which they have been declined or withdrawn, including following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1, 2 and 6.
Amendment 351
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7b.  Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing confidential data and in compliance with Regulation (EU) 2016/679.
Amendment 352
Proposal for a regulation
Article 32 – paragraph 2
2.  Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned.
2.  Very large online platforms shall only designate persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3 as compliance officers. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned.
Amendment 353
Proposal for a regulation
Article 32 – paragraph 3 – point a
(a)  cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation;
(a)  cooperating with the Digital Services Coordinator of establishment, the Board and the Commission for the purpose of this Regulation;
Amendment 354
Proposal for a regulation
Article 33 – paragraph 1
1.  Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months.
1.  Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months in a standardised, machine-readable and easily accessible format.
Amendment 355
Proposal for a regulation
Article 33 – paragraph 1 a (new)
1a.  Such reports shall include content moderation information separated and presented for each Member State in which the services are offered and for the Union as a whole. The reports shall be published in at least one of the official languages of the Member States of the Union in which services are offered.
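Note (illustrative only, not part of the adopted text): neither paragraph prescribes a particular format. One plausible reading of "standardised, machine-readable and easily accessible", combined with the per-Member-State breakdown required above, is a JSON serialisation along the following hypothetical lines.

import json

report = {
    "platform": "example-vlop",      # hypothetical very large online platform
    "period": "2022-H1",             # six-monthly reporting period
    "per_member_state": {
        "DE": {"notices_received": 1200, "items_removed": 300},
        "FR": {"notices_received": 950, "items_removed": 210},
    },
    "union_total": {"notices_received": 2150, "items_removed": 510},
}

print(json.dumps(report, indent=2, ensure_ascii=False))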
Amendment 356
Proposal for a regulation
Article 33 – paragraph 2 – point b
(b)  the related risk mitigation measures identified and implemented pursuant to Article 27;
(b)  the specific mitigation measures identified and implemented pursuant to Article 27;
Amendment 357
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
(da)  where appropriate, information about the representatives of the recipients of the service, independent experts and civil society organisations, consulted for the risk assessment in accordance with Article 26.
Amendment 358
Proposal for a regulation
Article 33 – paragraph 3
3.  Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports.
3.  Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports, in compliance with Regulation (EU) 2016/679.
Amendment 359
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1.  The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following:
1.  The Commission shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, in accordance with Regulation (EU) No 1025/2012, at least for the following:
Amendment 360
Proposal for a regulation
Article 34 – paragraph 1 – point a a (new)
(aa)  terms and conditions under Article 12, including as regards acceptance of and changes to those terms and conditions;
Amendment 361
Proposal for a regulation
Article 34 – paragraph 1 – point a b (new)
(ab)  information on traceability of traders under Article 22;
Amendment 362
Proposal for a regulation
Article 34 – paragraph 1 – point a c (new)
(ac)  advertising practices under Article 24 and recommender systems under Article 24a;
Amendment 363
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa)  transparency reporting obligations pursuant to Article 13;
Amendment 364
Proposal for a regulation
Article 34 – paragraph 1 – point f b (new)
(fb)  technical specifications to ensure that intermediary services shall be made accessible for persons with disabilities in accordance with the accessibility requirements of Directive (EU) 2019/882.
Amendment 365
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a.  The Commission shall support and promote the development and implementation of voluntary standards set by the relevant European and international standardisation bodies aimed at the protection of minors.
Amendment 366
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a.  The Commission shall be empowered to adopt implementing acts laying down common specifications for the items listed in points (a) to (fb) of paragraph 1 where the Commission has requested one or more European standardisation organisations to draft a harmonised standard and there has not been a publication of the reference to that standard in the Official Journal of the European Union within [24 months after the entry into force of this Regulation] or the request has not been accepted by any of the European standardisation organisations.
Amendment 367
Proposal for a regulation
Article 35 – paragraph 1
1.  The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
1.  The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law. Particular attention shall be given to avoiding negative effects on fair competition, data access and security, the general monitoring prohibition and the protection of privacy and personal data. The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct to ensure that they are fit for purpose.
Amendment 368
Proposal for a regulation
Article 35 – paragraph 2
2.  Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2.  Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 369
Proposal for a regulation
Article 35 – paragraph 3
3.  When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
3.  When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their specific objectives, define the nature of the public policy objective pursued and, where appropriate, the role of competent authorities, contain key performance indicators to measure the achievement of those objectives and take fully into account the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.
Amendment 370
Proposal for a regulation
Article 35 – paragraph 4
4.  The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
4.  The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions and request that the organisations involved amend their codes of conduct accordingly.
Amendment 371
Proposal for a regulation
Article 35 – paragraph 5
5.  The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
5.  The Commission and the Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic failure to comply with the codes of conduct, the Commission and the Board may take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the codes of conduct.
Amendment 372
Proposal for a regulation
Article 36 – paragraph 1
1.  The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
1.  The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for all actors in the online advertising eco-system, beyond the requirements of Articles 24 and 30.
Amendment 373
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
2.  The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least:
2.  The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct address at least:
Amendment 374
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
(ba)  the different types of data that can be used.
Amendment 375
Proposal for a regulation
Article 36 – paragraph 3
3.  The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
3.  The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes three years after the application of this Regulation.
Amendment 376
Proposal for a regulation
Article 36 – paragraph 3 a (new)
3a.  The Commission shall encourage all the actors in the online advertising eco-system referred to in paragraph 1 to endorse and comply with the commitments stated in the codes of conduct.
Amendment 377
Proposal for a regulation
Article 37 – paragraph 1
1.  The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
1.  The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
Amendment 378
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
(fa)  measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing an accessible description of those protocols.
Amendment 379
Proposal for a regulation
Article 37 – paragraph 5
5.  If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it may request the participants to revise the crisis protocol, including by taking additional measures.
5.  If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to revise the crisis protocol, including by taking additional measures.
Amendment 380
Proposal for a regulation
Article 38 – paragraph 4 a (new)
4a.  Member States shall ensure that the competent authorities, referred to in paragraph 1 and in particular their Digital Services Coordinators, have adequate technical, financial and human resources to carry out their tasks under this Regulation.
Amendment 381
Proposal for a regulation
Article 39 – paragraph 1
1.  Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks.
1.  Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner.
Amendment 382
Proposal for a regulation
Article 40 – paragraph 1
1.  The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation.
1.  The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of the supervision and enforcement by the national competent authorities, in accordance with this Chapter, of the obligations imposed on intermediaries under this Regulation.
Amendment 383
Proposal for a regulation
Article 40 – paragraph 2
2.  A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of Chapters III and IV, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established.
2.  A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of this Article, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established.
Amendment 384
Proposal for a regulation
Article 40 – paragraph 3
3.  Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected.
3.  Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of this Article. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected.
Amendment 385
Proposal for a regulation
Article 41 – paragraph 1 – point a
(a)  the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including, organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period;
(a)  the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including, organisations performing the audits referred to in Articles 28 and 50(3), to provide such information without undue delay, or at the latest within three months;
Amendment 386
Proposal for a regulation
Article 41 – paragraph 2 – point e
(e)  the power to adopt interim measures to avoid the risk of serious harm.
(e)  the power to adopt proportionate interim measures or to request the relevant judicial authority to do so, to avoid the risk of serious harm.
Amendment 387
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1
As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those others persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities.
As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those other persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities.
Amendment 388
Proposal for a regulation
Article 41 – paragraph 3 – introductory part
3.  Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures:
3.  Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists or is continuously repeated and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures:
Amendment 389
Proposal for a regulation
Article 41 – paragraph 3 – point a
(a)  require the management body of the providers, within a reasonable time period, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;
(a)  require the management body of the providers, within a reasonable time period, which shall in any case not exceed three months, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;
Amendment 390
Proposal for a regulation
Article 41 – paragraph 3 – point b
(b)  where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
(b)  where the Digital Services Coordinator considers that the provider has not complied with the requirements of the first indent, that the infringement persists or is continuously repeated and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
Amendment 391
Proposal for a regulation
Article 41 – paragraph 6 a (new)
6a.  The Commission shall publish guidelines by [six months after the entry into force of this Regulation] on the powers of and procedures applicable to the Digital Services Coordinators.
Amendment 392
Proposal for a regulation
Article 42 – paragraph 2
2.  Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them.
2.  Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission and the Board of those rules and of those measures and shall notify them, without delay, of any subsequent amendments affecting those rules and measures.
Amendment 393
Proposal for a regulation
Article 42 – paragraph 3
3.  Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover of the provider concerned.
3.  Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual worldwide turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual worldwide turnover of the provider concerned.
Amendment 394
Proposal for a regulation
Article 42 – paragraph 4
4.  Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
4.  Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily worldwide turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
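Note (illustrative only, not part of the adopted text): a worked example of the ceilings in paragraphs 3 and 4, assuming a hypothetical provider with an annual worldwide turnover of EUR 10 billion and approximating average daily turnover as annual turnover divided by 365.

annual_worldwide_turnover = 10_000_000_000  # EUR, hypothetical provider

max_fine = 0.06 * annual_worldwide_turnover           # 6 % ceiling, paragraph 3
max_info_fine = 0.01 * annual_worldwide_turnover      # 1 % ceiling for information failures
avg_daily_turnover = annual_worldwide_turnover / 365  # simplifying assumption
max_daily_periodic_penalty = 0.05 * avg_daily_turnover  # 5 % per day, paragraph 4

print(f"Maximum fine:              EUR {max_fine:,.0f}")       # EUR 600,000,000
print(f"Maximum information fine:  EUR {max_info_fine:,.0f}")  # EUR 100,000,000
print(f"Maximum periodic penalty:  EUR {max_daily_periodic_penalty:,.0f} per day")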
Amendment 395
Proposal for a regulation
Article 42 – paragraph 4 a (new)
4a.  Member States shall ensure that administrative or judicial authorities issuing orders pursuant to Articles 8 and 9 shall only issue penalties or fines in line with this Article.
Amendment 396
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority.
1.   Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. During these proceedings, both parties shall have the right to be heard and to receive appropriate information about the status of the proceedings. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment without undue delay. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority without undue delay.
Amendment 397
Proposal for a regulation
Article 43 – paragraph 1 a (new)
1a.   Upon receipt of the complaint transmitted pursuant to paragraph 1, the Digital Services Coordinator of establishment shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established, within six months, whether it intends to proceed with an investigation. If it opens an investigation, it shall provide an update at least every three months. The Digital Services Coordinator of the Member State where the recipient resides or is established shall inform the recipient accordingly.
Amendment 398
Proposal for a regulation
Article 43 a (new)
Article 43a
Compensation
Without prejudice to Article 5, recipients of the service shall have the right to seek, in accordance with relevant Union and national law, compensation from providers of intermediary services for any direct damage or loss suffered due to an infringement by providers of intermediary services of obligations established under this Regulation.
Amendment 399
Proposal for a regulation
Article 44 – paragraph 1
1.  Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board.
1.  Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports in a standardised and machine-readable format available to the public, and shall communicate them to the Commission and to the Board.
Amendment 400
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a)  the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
(a)  the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned, including information on the name of the issuing authority, the name of the provider and the type of action specified in the order, as well as a justification that the order complies with Article 3 of Directive 2000/31/EC;
Amendment 401
Proposal for a regulation
Article 44 – paragraph 2 – point b
(b)  the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9.
(b)  the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9, the number of appeals made against those orders, as well as the outcome of the appeals.
Amendment 402
Proposal for a regulation
Article 44 – paragraph 2 a (new)
2a.  The Commission shall make publicly available a biennial report analysing the annual reports communicated pursuant to paragraph 1 and shall submit it to the European Parliament and to the Council.
Amendment 403
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Amendment 404
Proposal for a regulation
Article 45 – paragraph 2 – introductory part
2.  A request or recommendation pursuant to paragraph 1 shall at least indicate:
2.  A request pursuant to paragraph 1 shall at least indicate:
Amendment 405
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2a.  A request pursuant to paragraph 1 shall at the same time be communicated to the Commission. Where the Commission believes that the request is not justified, or where the Commission is currently taking action on the same matter, it may ask for the request to be withdrawn.
Amendment 406
Proposal for a regulation
Article 45 – paragraph 3
3.  The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
3.  The Digital Services Coordinator of establishment shall take into utmost account the request pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
Amendment 407
Proposal for a regulation
Article 45 – paragraph 4
4.  The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
4.  The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
Amendment 408
Proposal for a regulation
Article 45 – paragraph 5
5.  Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
5.  Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
Amendment 409
Proposal for a regulation
Article 45 – paragraph 7
7.  Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request.
7.  Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
Amendment 410
Proposal for a regulation
Article 46 – paragraph 1 – subparagraph 1
Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation.
deleted
Amendment 411
Proposal for a regulation
Article 46 – paragraph 1 a (new)
1a.  Where a Digital Services Coordinator of establishment has reasons to suspect that a provider of intermediary services has infringed this Regulation in a manner involving at least one other Member State, it may propose to the Digital Services Coordinator of destination concerned to launch a joint investigation. The joint investigation shall be based on an agreement between the Member States concerned.
Amendment 412
Proposal for a regulation
Article 46 – paragraph 1 b (new)
1b.  Upon request of the Digital Services Coordinator of destination who has reasons to suspect that a provider of intermediary services has infringed this Regulation in its Member State, the Board may recommend to the Digital Services Coordinator of establishment to launch a joint investigation with the Digital Services Coordinator of destination concerned. The joint investigation shall be based on an agreement between the Member States concerned.
Where there is no agreement within one month, the joint investigation shall be under the supervision of the Digital Services Coordinator of establishment.
Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation.
Amendment 413
Proposal for a regulation
Article 47 – paragraph 2 – point b
(b)  coordinating and contributing to guidance and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation;
(b)  coordinating and providing guidance and analysis to the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation;
Amendment 414
Proposal for a regulation
Article 47 – paragraph 2 – point b a (new)
(ba)  contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market;
Amendment 415
Proposal for a regulation
Article 47 – paragraph 2 – point c a (new)
(ca)  contributing to the effective cooperation with the competent authorities of third countries and with international organisations.
Amendment 416
Proposal for a regulation
Article 48 – paragraph 1
1.  The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.
1.  The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. The meeting shall be deemed valid where at least two thirds of its members are present.
Amendment 417
Proposal for a regulation
Article 48 – paragraph 1 a (new)
1a.  The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure.
Amendment 418
Proposal for a regulation
Article 48 – paragraph 2 – introductory part
2.  Each Member State shall have one vote. The Commission shall not have voting rights.
2.  Each Member State shall have one vote, to be cast by the Digital Services Coordinator. The Commission shall not have voting rights.
Amendment 419
Proposal for a regulation
Article 48 – paragraph 3
3.  The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure.
deleted
Amendments 420 and 562/rev
Proposal for a regulation
Article 48 – paragraph 5
5.  The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.
5.  The Board may invite experts and observers to attend its meetings, and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.
Amendment 421
Proposal for a regulation
Article 48 – paragraph 5 a (new)
5a.  The Board shall, where appropriate, consult interested parties and shall make the results of that consultation publicly available.
Amendment 422
Proposal for a regulation
Article 48 – paragraph 6
6.  The Board shall adopt its rules of procedure, following the consent of the Commission.
6.  The Board shall adopt its rules of procedure by a two-thirds majority of its members, following the consent of the Commission.
Amendment 423
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
(ca)  issue specific recommendations for the implementation of Article 13a;
Amendment 424
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d)  advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation;
(d)  advise the Commission to take the measures referred to in Article 51 and adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation;
Amendment 425
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
(da)  monitor the compliance with Article 3 of Directive 2000/31/EC of measures taken by a Member State restricting the freedom to provide services of intermediary service providers from another Member State and ensure that those measures are strictly necessary and do not restrict the application of this Regulation;
Amendment 426
Proposal for a regulation
Article 49 – paragraph 1 – point e
(e)  support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct as provided for in this Regulation, as well as the identification of emerging issues, with regard to matters covered by this Regulation.
(e)  support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in close collaboration with relevant stakeholders as provided for in this Regulation, including by issuing opinions, recommendations or advice on matters related to Article 34, as well as the identification of emerging issues, with regard to matters covered by this Regulation.
Amendment 427
Proposal for a regulation
Article 49 – paragraph 2
2.  Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.
2.  Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice and an explanation of the investigations, actions and measures that they have implemented when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.
Amendment 428
Proposal for a regulation
Article 49 a (new)
Article 49a
Reports
1.   The Board shall draw up an annual report regarding its activities. The report shall be made public and be transmitted to the European Parliament, to the Council and to the Commission in all official languages of the Union.
2.   The annual report shall include, among other information, a review of the practical application of the opinions, guidelines, recommendations, advice and any other measures taken under Article 49(1).
Amendment 429
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of the provisions of Section 4 of Chapter III, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period and no later than three months.
Amendment 430
Proposal for a regulation
Article 50 – paragraph 2
2.  When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
2.  When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include a recommendation, where appropriate, for participation in a code of conduct as provided for in Article 35.
Amendment 431
Proposal for a regulation
Article 51 – title
Intervention by the Commission and opening of proceedings
Opening of proceedings by the Commission
Amendment 432
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1.  The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
1.  The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 433
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2.  Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2.  Where the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 434
Proposal for a regulation
Article 52 – paragraph 1
1.  In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
1.  In order to carry out the tasks assigned to it under this Section, the Commission may by reasoned request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
Amendment 435
Proposal for a regulation
Article 52 – paragraph 3 a (new)
3a.  The request shall state its purpose and include reasoning on why and how the information is necessary and proportionate to the objective pursued, and why it cannot be obtained by other means.
Amendment 436
Proposal for a regulation
Article 52 – paragraph 4
4.  The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading.
4.  The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1).
Amendment 437
Proposal for a regulation
Article 55 – paragraph 1
1.  In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
1.  In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures in compliance with fundamental rights against the very large online platform concerned on the basis of a prima facie finding of an infringement.
Amendment 438
Proposal for a regulation
Article 56 – paragraph 2 – introductory part
2.  The Commission may, upon request or on its own initiative, reopen the proceedings:
2.  The Commission shall reopen the proceedings:
Amendment 439
Proposal for a regulation
Article 58 – paragraph 1 – point b
(b)  interim measures ordered pursuant to Article 55;
(b)  interim measures ordered pursuant to Article 55; or
Amendment 440
Proposal for a regulation
Article 58 – paragraph 3
3.  In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision.
3.  In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within one month and to provide information on the measures that that platform intends to take to comply with the decision.
Amendment 441
Proposal for a regulation
Article 58 – paragraph 5
5.  Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision.
5.  Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. The decision shall apply with immediate effect.
Amendment 442
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
1.  In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently:
1.  In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total worldwide turnover in the preceding financial year where it finds that the platform, intentionally or negligently:
Amendment 443
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2.  The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently:
2.  The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total worldwide turnover in the preceding financial year, where they intentionally or negligently:
Amendment 444
Proposal for a regulation
Article 59 – paragraph 4
4.  In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
4.  In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement, any fines issued under Article 42 for the same infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
Amendment 445
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
1.  The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
1.  The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily worldwide turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
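[Editorial note: for orientation only, the monetary ceilings introduced by Amendments 442, 443 and 445 reduce to simple percentage caps on worldwide turnover. The following minimal Python sketch restates those caps as arithmetic; the function names and turnover figures are hypothetical illustrations and form no part of the adopted text.]

# Illustrative sketch only: hypothetical helpers restating the caps in
# Articles 59(1), 59(2) and 60(1) as amended (6 %, 1 % and 5 % of
# worldwide turnover respectively). Figures are invented for illustration.

def max_fine_infringement(annual_worldwide_turnover: float) -> float:
    # Article 59(1) cap: 6 % of total worldwide turnover in the
    # preceding financial year.
    return 0.06 * annual_worldwide_turnover

def max_fine_information_offence(annual_worldwide_turnover: float) -> float:
    # Article 59(2) cap: 1 % of total worldwide turnover in the
    # preceding financial year.
    return 0.01 * annual_worldwide_turnover

def max_daily_penalty(average_daily_worldwide_turnover: float) -> float:
    # Article 60(1) cap: 5 % of average daily worldwide turnover in the
    # preceding financial year, per day of non-compliance.
    return 0.05 * average_daily_worldwide_turnover

if __name__ == "__main__":
    annual = 10_000_000_000  # hypothetical annual worldwide turnover in EUR
    daily = annual / 365     # crude average daily turnover, for illustration
    print(f"Art. 59(1) fine cap:  EUR {max_fine_infringement(annual):,.0f}")
    print(f"Art. 59(2) fine cap:  EUR {max_fine_information_offence(annual):,.0f}")
    print(f"Art. 60(1) daily cap: EUR {max_daily_penalty(daily):,.0f}")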
Amendment 446
Proposal for a regulation
Article 64 – paragraph 1
1.  The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed.
1.  The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed, along with, where possible and justified, non-confidential documents or other forms of information on which the decision is based.
Amendment 447
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 1
Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures it intends to request and identifying the intended addressee or addressees thereof.
Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than 14 working days, describing the measures it intends to request and identifying the intended addressee or addressees thereof.
Amendment 448
Proposal for a regulation
Article 66 – paragraph 1 – point c a (new)
(ca)  the development and implementation of standards provided for in Article 34.
Amendment 449
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52 , recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 17, 18 and 19 on their behalf, provided the body, organisation or association meets all of the following conditions:
Without prejudice to Directive (EU) 2020/1828 of the European Parliament and of the Council52 , recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 8, 12, 13, 14, 15, 17, 18, 19, 43 and 43a on their behalf, provided the body, organisation or association meets all of the following conditions:
__________________
__________________
52 [Reference]
52 [Reference]
Amendment 450
Proposal for a regulation
Article 69 – paragraph 2
2.  The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2.  The delegation of power referred to in Articles 13a, 16, 23, 25, and 31 shall be conferred on the Commission for five years starting from [date of expected adoption of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period.
Amendment 451
Proposal for a regulation
Article 69 – paragraph 3
3.  The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
3.  The delegation of power referred to in Articles 13a, 16, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
Amendment 452
Proposal for a regulation
Article 69 – paragraph 5
5.  A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
5.  A delegated act adopted pursuant to Articles 13a, 16, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
Amendment 453
Proposal for a regulation
Article 70 – paragraph 1
1.  The Commission shall be assisted by the Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011.
1.  The Commission shall be assisted by a Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011.
Amendment 454
Proposal for a regulation
Article 73 – paragraph 1
1.  By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee.
1.  By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. This report shall address in particular:
(a)   the application of Article 25, including with respect to the number of average monthly active recipients of the service;
(b)   the application of Article 11;
(c)   the application of Article 14;
(d)   the application of Articles 35 and 36.
Amendment 455
Proposal for a regulation
Article 73 – paragraph 1 a (new)
1a.  Where appropriate, the report referred to in paragraph 1 shall be accompanied by a proposal for amendment of this Regulation.
Amendment 456
Proposal for a regulation
Article 73 – paragraph 3
3.   In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources.
3.   In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources, and pay specific attention to small and medium-sized enterprises and the position of new competitors.
Amendment 457
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2.  It shall apply from [date - three months after its entry into force].
2.  It shall apply from [date - six months after its entry into force].

(1) The matter was referred back for interinstitutional negotiations to the committee responsible, pursuant to Rule 59(4), fourth subparagraph (A9-0356/2021).
