REPORT on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

20.12.2021 - (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD)) - ***I

Committee on the Internal Market and Consumer Protection
Rapporteur: Christel Schaldemose
Rapporteurs for the opinion (*):
Henna Virkkunen, Committee on Industry, Research and Energy
Geoffroy Didier, Committee on Legal Affairs
Patrick Breyer, Committee on Civil Liberties, Justice and Home Affairs
(*) Associated committees – Rule 57 of the Rules of Procedure


DRAFT EUROPEAN PARLIAMENT LEGISLATIVE RESOLUTION

on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

(COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))

(Ordinary legislative procedure: first reading)

The European Parliament,

– having regard to the Commission proposal to Parliament and the Council (COM(2020)0825),

– having regard to Article 294(2) and Article 114 of the Treaty on the Functioning of the European Union, pursuant to which the Commission submitted the proposal to Parliament (C9-0418/2020),

– having regard to Article 294(3) of the Treaty on the Functioning of the European Union,

– having regard to the opinion of the European Economic and Social Committee of 27 April 2021[1],

– having regard to the opinion of the Committee of the Regions of 1 July 2021[2],

– having regard to Rule 59 of its Rules of Procedure,

– having regard to the opinions of the Committee on Industry, Research and Energy, the Committee on Legal Affairs, the Committee on Civil Liberties, Justice and Home Affairs, the Committee on Economic and Monetary Affairs, the Committee on Transport and Tourism, the Committee on Culture and Education and the Committee on Women’s Rights and Gender Equality,

– having regard to the report of the Committee on the Internal Market and Consumer Protection (A9-0356/2021),

1. Adopts its position at first reading hereinafter set out;

2. Calls on the Commission to refer the matter to Parliament again if it replaces, substantially amends or intends to substantially amend its proposal;

3. Instructs its President to forward its position to the Council, the Commission and the national parliaments.

 

Amendment  1

 

Proposal for a regulation

Recital 1

 

Text proposed by the Commission

Amendment

(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25 , new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.

(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25 , new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel and innovative ways, transforming their communication, consumption and business habits. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, for individual users, companies and for society as a whole.

__________________

__________________

25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).

25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).

Amendment  2

 

Proposal for a regulation

Recital 2

 

Text proposed by the Commission

Amendment

(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.

(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services, and resulting in a fragmentation of the internal market. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, without lock-in effects, and reducing the administrative burden for intermediary services, especially for micro, small and medium-sized enterprises.

Amendment  3

 

Proposal for a regulation

Recital 3

 

Text proposed by the Commission

Amendment

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the rights to privacy, to protection of personal data, respect for human dignity, private and family life, the freedom of expression and information, the freedom and the pluralism of the media, and the freedom to conduct a business, a high level of consumer protection, the equality between women and men and the right to non-discrimination. Children have particular rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child (UNCRC). As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.

Amendment  4

 

Proposal for a regulation

Recital 4

 

Text proposed by the Commission

Amendment

(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.

(4) In order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, protecting consumers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated, while respecting fundamental rights.

Amendment  5

 

Proposal for a regulation

Recital 4 a (new)

 

Text proposed by the Commission

Amendment

 

(4a) Given the importance of digital services, it is essential that this Regulation provides a regulatory framework which ensures full, equal and unrestricted access to intermediary services for all recipients of services, including persons with disabilities. Therefore, it is important that accessibility requirements for intermediary services, including their user interfaces, are consistent with existing Union law, such as the European Accessibility Act and the Web Accessibility Directive, and that Union law is further developed, so that no one is left behind as a result of digital innovation.

Amendment  6

 

Proposal for a regulation

Recital 6

 

Text proposed by the Commission

Amendment

(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.

(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport of persons and goods, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.

Amendment  7

 

Proposal for a regulation

Recital 8

 

Text proposed by the Commission

Amendment

(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.

(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the directing of activities towards one or more Member States. The directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.

__________________

__________________

27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

Amendment  8

 

Proposal for a regulation

Recital 9

 

Text proposed by the Commission

Amendment

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) 2021/784 of the European Parliament and of the Council29. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures. To assist Member States and service providers, the Commission should provide guidelines as to how to interpret the interaction and complementarity between different Union legal acts and this Regulation and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. In particular, the guidelines should clarify any potential conflicts between the conditions and obligations laid down in the legal acts referred to in this Regulation, explaining which legal act should prevail.

__________________

__________________

28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance) (OJ L 95, 15.4.2010, p. 1).

28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance) (OJ L 95, 15.4.2010, p. 1).

29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation

29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).

Amendment  9

 

Proposal for a regulation

Recital 9 a (new)

 

Text proposed by the Commission

Amendment

 

(9a) In line with Article 167(4) of the Treaty on the Functioning of the European Union, cultural aspects should be taken into account, in particular in order to respect and promote cultural and linguistic diversity. It is essential that this Regulation contributes to protecting the freedom of expression and information and media freedom, and to fostering media pluralism as well as cultural and linguistic diversity.

Amendment  10

 

Proposal for a regulation

Recital 10

 

Text proposed by the Commission

Amendment

(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.

(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33, Directive (EU) 2018/1972 of the European Parliament and of the Council33a, as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive (EU) 2019/882 of the European Parliament and of the Council, Regulation (EU) 2019/1020, Directive 2001/95/EC, Directive 2013/11/EU of the European Parliament and of the Council, Regulation (EU) 2017/239437a, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union or national law on working conditions.

__________________

__________________

30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).

30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).

31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).

31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).

32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.

32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.

33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.

33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.

 

33a Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (OJ L 321, 17.12.2018, p. 36).

34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)

34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)

35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.

35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.

36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.

36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.

37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules

37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules

 

37a Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004 (OJ L 345, 27.12.2017, p. 1).

38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).

38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).

Amendment  11

 

Proposal for a regulation

Recital 11

 

Text proposed by the Commission

Amendment

(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.

(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 of the European Parliament and of the Council, which establish specific rules and procedures that should remain unaffected.

Amendment  12

 

Proposal for a regulation

Recital 12

 

Text proposed by the Commission

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

(12) In order to achieve the objective of ensuring a safe, accessible, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept of “illegal content” should be defined appropriately and should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law, or the provision of illegal services, in particular accommodation services on short-term rental platforms that do not comply with Union or national law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in conformity with Union law, including the Charter, and what the precise nature or subject matter is of the law in question.

Amendment  13

 

Proposal for a regulation

Recital 13

 

Text proposed by the Commission

Amendment

(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.

(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor or a purely ancillary feature of another service or functionality of the principal service and that feature or functionality cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature or functionality is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. For the purposes of this Regulation, cloud computing services should not be considered to be online platforms in cases where allowing the dissemination of specific content constitutes a minor or ancillary feature. Moreover, cloud computing services, when serving as infrastructure, for example, as the underlying infrastructural storage and computing services of an internet-based application or online platform, should not in themselves be seen as disseminating to the public information stored or processed at the request of a recipient of an application or online platform which they host.

Amendment  14

 

Proposal for a regulation

Recital 14

 

Text proposed by the Commission

Amendment

(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.

(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on whom to grant access. Information exchanged using interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, is not considered to have been disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.

__________________

__________________

39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (OJ L 321, 17.12.2018, p. 36).

39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (OJ L 321, 17.12.2018, p. 36).

Amendment  15

 

Proposal for a regulation

Recital 16

 

Text proposed by the Commission

Amendment

(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union.

(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity, consistency, predictability, accessibility and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.

Amendment  16

 

Proposal for a regulation

Recital 18

 

Text proposed by the Commission

Amendment

(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.

(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. The mere ranking or displaying of information in a certain order, or the use of a recommender system, should not, however, be deemed to constitute control over that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.

Amendment  17

 

Proposal for a regulation

Recital 20

 

Text proposed by the Commission

Amendment

(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.

(20) Where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities, the service should be deemed not to have been provided neutrally and the provider should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.

Amendment  18

 

Proposal for a regulation

Recital 21

 

Text proposed by the Commission

Amendment

(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.

(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved in the content of the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.

Amendment  19

 

Proposal for a regulation

Recital 22

 

Text proposed by the Commission

Amendment

(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.

(22) In order to benefit from the exemption from liability for hosting services, the provider should, after having become aware of the illegal nature of the content and thus obtaining actual knowledge or awareness, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of a high level of consumer protection and of the Charter of Fundamental Rights, including the principle of freedom of expression and the right to receive and impart information and ideas without interference by public authority. The provider can obtain actual knowledge or awareness of the illegal nature of the content through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent hosting service provider to reasonably identify, assess and where appropriate act against the allegedly illegal content. As long as providers act upon obtaining actual knowledge, they should benefit from the exemptions from liability referred to in this Regulation.

Amendment  20

 

Proposal for a regulation

Recital 23

 

Text proposed by the Commission

Amendment

(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.

(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of a consumer. Such a belief may arise, for example, where the online platform allowing distance contracts with traders fails to display clearly the identity of the trader pursuant to this Regulation, or is marketing the product or service in its own name rather than using the name of the trader who will supply it, or where the provider determines the final price of the goods or services offered by the trader.

Amendment  21

 

Proposal for a regulation

Recital 25

 

Text proposed by the Commission

Amendment

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, solely because they are carrying out voluntary own-initiative investigations, provided those activities are carried out in good faith and in a diligent manner and are accompanied by additional safeguards against over-removal of legal content. Providers of intermediary services should make best efforts to ensure that where automated tools are used for content moderation, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.

Amendment  22

 

Proposal for a regulation

Recital 26

 

Text proposed by the Commission

Amendment

(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.

(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed and open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the specific provider that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers should act where they are best placed to do so.

Amendment  23

 

Proposal for a regulation

Recital 27

 

Text proposed by the Commission

Amendment

(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.

(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be and among others, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, Virtual Private Networks, cloud infrastructure services, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.

Amendment  24

 

Proposal for a regulation

Recital 27 a (new)

 

Text proposed by the Commission

Amendment

 

(27a) A single webpage or website may include elements that qualify differently as between ‘mere conduit’, ‘caching’ or hosting services, and the rules for exemptions from liability should apply to each accordingly. For example, a search engine could act solely as a ‘caching’ service as regards information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service.

Amendment  25

 

Proposal for a regulation

Recital 28

 

Text proposed by the Commission

Amendment

(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.

(28) Providers of intermediary services should not be subject to a monitoring obligation, whether de jure or de facto, with respect to obligations of a general nature. This does not concern specific and properly identified monitoring obligations in a specific case, where set out in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implement Union legal acts, in accordance with the conditions established in this Regulation and other Union law considered as lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, Member States should not prevent providers of intermediary services from providing end-to-end encrypted services. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third-party access. Furthermore, to ensure effective digital privacy, Member States should not impose a general obligation on providers of intermediary services to limit the anonymous use of their services.

Amendment  26

 

Proposal for a regulation

Recital 29

 

Text proposed by the Commission

Amendment

(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.

(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the Charter, on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.

Amendment  27

 

Proposal for a regulation

Recital 30

 

Text proposed by the Commission

Amendment

(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.

(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the Charter and in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.

Amendment  28

 

Proposal for a regulation

Recital 31

 

Text proposed by the Commission

Amendment

(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.

(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law in conformity with Union law, including Directive 2000/31/EC and the Charter, enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. Exceptionally, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.

Amendment  29

 

Proposal for a regulation

Recital 32

 

Text proposed by the Commission

Amendment

(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.

(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. Member States should ensure full implementation of the Union legal framework on confidentiality of communications and online privacy, as well as on protection of natural persons with regard to the processing of personal data enshrined in Directive (EU) 2016/680. In particular, Member States should respect the rights of individuals and journalists and refrain from seeking information which could harm media freedom or freedom of expression.

Amendment  30

 

Proposal for a regulation

Recital 33

 

Text proposed by the Commission

Amendment

(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.

(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, as defined in Union or national law in compliance with Union law, respectively, where they are addressed to providers of intermediary services established in another Member State, they should not in principle restrict those providers’ freedom to provide their services across borders. The competent authority should transmit the orders to act against illegal content and to provide information directly to the relevant addressee by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement should notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) No 910/2014 of the European Parliament and of the Council. This Regulation should be without prejudice to the rules on the mutual recognition and enforcement of judgments, namely as regards the right to refuse recognition and enforcement of an order to act against illegal content, in particular where such an order is contrary to the public policy in the Member State where recognition or enforcement is sought.

Amendment  31

 

Proposal for a regulation

Recital 33 a (new)

 

Text proposed by the Commission

Amendment

 

(33a) This Regulation should not prevent the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law in conformity with Union law, from issuing an order to restore content, where such content was in compliance with the terms and conditions of the intermediary service provider but was erroneously considered illegal by the service provider and has been removed.

Amendment  32

 

Proposal for a regulation

Recital 33 b (new)

 

Text proposed by the Commission

Amendment

 

(33b) To ensure the effective implementation of this Regulation, orders to act against illegal content and to provide information should comply with Union law, including with the Charter. The Commission should provide an effective response to breaches of Union law through infringement proceedings.

Amendment  33

 

Proposal for a regulation

Recital 34

 

Text proposed by the Commission

Amendment

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective, predictable and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as a high level of consumer protection, the safety and trust of the recipients of the service, including minors and vulnerable users, the protection of relevant fundamental rights enshrined in the Charter, the meaningful accountability of those providers and the empowerment of recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Amendment  34

 

Proposal for a regulation

Recital 35

 

Text proposed by the Commission

Amendment

(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.

(35) In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation in relation to those services. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.

Amendment  35

 

Proposal for a regulation

Recital 36

 

Text proposed by the Commission

Amendment

(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location .

(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to designate a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. Such information should be notified to the Digital Services Coordinator in the Member State of establishment. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. It should be possible for this point of contact to be the same as the point of contact required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.

Amendment  36

 

Proposal for a regulation

Recital 36 a (new)

 

Text proposed by the Commission

Amendment

 

(36a) Providers of intermediary services should also be required to designate a single point of contact for recipients of services, which allows rapid, direct and efficient communication, in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a user communicates with chatbots. To facilitate rapid, direct and efficient communication, recipients of services should not be faced with lengthy phone menus or hidden contact information. In particular, phone menus should always include the option to speak to a human. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. This requirement should not affect the internal organisation of providers of intermediary services, including the ability to use third-party services to provide this communication system, such as external service providers and call centres.

Amendment  37

 

Proposal for a regulation

Recital 37

 

Text proposed by the Commission

Amendment

(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with.

(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. It should be possible for a legal representative to be mandated by more than one provider of intermediary services, in accordance with national law, provided that such providers qualify as micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC.

Amendment  38

 

Proposal for a regulation

Recital 38

 

Text proposed by the Commission

Amendment

(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.

(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of protecting fundamental rights, in particular freedom of expression and of information, transparency, the protection of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are drafted in clear and unambiguous language in line with applicable Union and national law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available, using, where appropriate, graphical elements, such as icons.

Amendment  39

 

Proposal for a regulation

Recital 39

 

Text proposed by the Commission

Amendment

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should draw up an annual report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40, which do not also qualify as very large online platforms.

__________________

__________________

40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).

40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
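By way of illustration only, an annual content-moderation report in a standardised and machine-readable format could resemble the following minimal Python sketch; the schema and the figures are illustrative assumptions, since the Regulation itself does not prescribe field names or values.

    import json

    # Illustrative annual content-moderation report; all figures are placeholders.
    report = {
        "provider": "Example Hosting B.V.",            # hypothetical provider
        "reporting_period": {"from": "2021-01-01", "to": "2021-12-31"},
        "notices_received": 14210,
        "actions_taken": {
            "removals": 8310,
            "disabling_of_access": 1022,
            "no_action": 4878,
        },
        "own_initiative_moderation": 5120,             # including automated means
        "complaints_received": 640,
    }

    # Machine-readable output that could accompany the narrative report.
    print(json.dumps(report, indent=2))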

Amendment  40

 

Proposal for a regulation

Recital 39 a (new)

 

Text proposed by the Commission

Amendment

 

(39a) Recipients of a service should be able to make free, autonomous and informed decisions or choices when using a service, and providers of intermediary services should not use any means, including via their interface, to distort or impair that decision-making. In particular, recipients of the service should be empowered to make such decisions, inter alia, regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, certain practices typically exploit cognitive biases and prompt recipients of the service to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark patterns’). This should include, but should not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more visual prominence to a consent option, repetitively requesting or urging the recipient to make a decision, or making the procedure of cancelling a service significantly more cumbersome than signing up to it. However, rules preventing dark patterns should not be understood as preventing providers from interacting directly with users and from offering new or additional services to them. In particular, it should be possible to approach a user again within a reasonable time, even if the user has denied consent for specific data processing purposes, in accordance with Regulation (EU) 2016/679. The Commission should be empowered to adopt a delegated act to define practices that could be considered dark patterns.

Amendment  41

 

Proposal for a regulation

Recital 40

 

Text proposed by the Commission

Amendment

(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.

(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can establish that the content in question is clearly illegal without additional legal or factual examination of the information indicated in the notice and remove or disable access to that content ('action'). Such mechanisms should include a clearly identifiable reporting mechanism, located close to the content in question, that allows the quick and easy notification of items of information considered to be illegal content under Union or national law. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. While individuals should always be able to submit notices anonymously, such notices should not give rise to actual knowledge, except in the case of information considered to involve one of the offences referred to in Directive 2011/93/EU. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
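By way of illustration only, a notice under such a mechanism might carry the following elements; the field names in this minimal Python sketch are hypothetical, and the sketch reflects the rule in this recital that anonymous notices should not, by themselves, give rise to actual knowledge (the exception for offences under Directive 2011/93/EU is not modelled here).

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Notice:
        item_urls: List[str]                 # exact locations of the allegedly illegal items
        explanation: str                     # why the notifier considers the items illegal
        legal_ground: str                    # Union or national law allegedly infringed
        notifier_name: Optional[str] = None  # None models an anonymous notice

    def gives_rise_to_actual_knowledge(notice: Notice) -> bool:
        # Anonymous notices should not, by themselves, give rise to actual knowledge.
        return notice.notifier_name is not None

    # One notice may cover multiple specific items of allegedly illegal content.
    notice = Notice(
        item_urls=["https://host.example/item/1", "https://host.example/item/2"],
        explanation="Counterfeit goods offered for sale.",
        legal_ground="National trade mark law",
    )
    print(gives_rise_to_actual_knowledge(notice))  # False: submitted anonymously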

Amendment  42

 

Proposal for a regulation

Recital 40 a (new)

 

Text proposed by the Commission

Amendment

 

(40a) Nevertheless, notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content. Hosting service providers that receive such notices without being that actor should redirect them to the particular online platform concerned and inform the Digital Services Coordinator.

Amendment  43

 

Proposal for a regulation

Recital 40 b (new)

 

Text proposed by the Commission

Amendment

 

(40b) Moreover, hosting providers should seek to act only against the items of information notified. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action.

Amendment  44

 

Proposal for a regulation

Recital 41

 

Text proposed by the Commission

Amendment

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent, objective, non-arbitrary and non-discriminatory processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Amendment  45

 

Proposal for a regulation

Recital 41 a (new)

 

Text proposed by the Commission

Amendment

 

(41a) Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should inform the individual or entity notifying the specific content of its decision without undue delay after taking a decision whether to act upon the notice or not.

Amendment  46

 

Proposal for a regulation

Recital 42

 

Text proposed by the Commission

Amendment

(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.

(42) Where a hosting service provider decides to remove, disable access to, demote or impose other measures with regard to information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and accurate, that provider should, in a clear and user-friendly manner, inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The obligation should however not apply in a number of situations, namely when the content is deceptive or part of a high volume of commercial content, or when a judicial or law enforcement authority has requested, due to an ongoing criminal investigation, that the recipient not be informed until that investigation is closed. Where a provider of hosting services does not have the information necessary to inform the recipient on a durable medium, it should not be required to do so.

Amendment  47

 

Proposal for a regulation

Recital 42 a (new)

 

Text proposed by the Commission

Amendment

 

(42a) A provider of hosting services may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the provider of hosting services is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council1. In such instances, the provider of hosting services should inform without delay the competent law enforcement authorities of such suspicion, providing, upon their request, all relevant information available to it, including, where relevant, the content in question and an explanation of its suspicion, and, unless instructed otherwise, should remove or disable access to the content. The information notified by the hosting service provider should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for the profiling of recipients of the services with a view to the possible identification of criminal offences by providers of hosting services. Providers of hosting services should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. In order to facilitate the notification of suspicions of criminal offences, Member States should notify to the Commission the list of the competent law enforcement or judicial authorities.

 

__________________

 

1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).

Amendment  48

 

Proposal for a regulation

Recital 43 a (new)

 

Text proposed by the Commission

Amendment

 

(43a) Similarly, in order to ensure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, Section 3, in whole or in part, to those providers of intermediary services that are not-for-profit or are medium-sized enterprises, but do not present any systemic risk related to illegal content and have limited exposure to illegal content. The providers should present justified reasons for why they should be issued a waiver and send their application first to their Digital Services Coordinator of establishment for a preliminary assessment. The Commission should examine such an application taking into account the preliminary assessment carried out by the Digital Services Coordinator of establishment. The preliminary assessment should be sent together with the application to the Commission. The Commission should monitor the application of the waiver and have the right to revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions.

Amendment  49

 

Proposal for a regulation

Recital 44

 

Text proposed by the Commission

Amendment

(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.

(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. This should include decisions of online platforms allowing consumers to conclude distance contracts with traders to suspend the provision of their services to traders. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory, non-arbitrary and fair outcomes within ten working days starting on the date on which the online platform received the complaint. In addition, provision should be made for the possibility of entering, in good faith, into out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.

Amendment  50

 

Proposal for a regulation

Recital 46

 

Text proposed by the Commission

Amendment

(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43

(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority and expeditiously, taking into account due process, and without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in an objective manner. Such trusted flagger status should only be awarded, for a period of two years, to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a transparent funding structure. The Digital Services Coordinator should be allowed to renew the status where the trusted flagger concerned continues to meet the requirements of this Regulation. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations, consumer organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with Article 14. Those reports should indicate information such as notices categorised by the identity of the provider of hosting services, the type of content notified, the legal provisions allegedly breached by the content in question, and the action taken by the provider. The reports should also include information about any potential conflict of interest and sources of funding, as well as the procedure put in place by the trusted flagger to retain its independence. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and that they respect exceptions and limitations to intellectual property rights. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 In order to avoid abuses of the status of trusted flagger, it should be possible to suspend such status where the Digital Services Coordinator of establishment has opened an investigation based on legitimate reasons. The suspension should not last longer than the time needed to conduct the investigation, and the trusted flagger status should be maintained where the Digital Services Coordinator of establishment concludes that the entity in question can still be considered a trusted flagger.

__________________

__________________

43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53

43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53

Amendment  51

 

Proposal for a regulation

Recital 46 a (new)

 

Text proposed by the Commission

Amendment

 

(46a) The strict application of universal design to all new technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity. It is essential to ensure that providers of online platforms, which offer services in the Union, design and provide those services in accordance with the accessibility requirements, set out in Directive (EU) 2019/882. In particular, providers of online platforms should ensure that information provided, forms provided and procedures that are in place are made available in a manner that they are easy to find, easy to understand, and accessible to persons with disabilities.

Amendment  52

 

Proposal for a regulation

Recital 47

 

Text proposed by the Commission

Amendment

(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.

(47) The misuse of services of online platforms by frequently providing illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. The misuse of services of online platforms could be established with regard to frequently provided illegal content where it is evident that that content is illegal without conducting a detailed legal or factual analysis. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the notices or complaints are unfounded. Under certain conditions, online platforms should be entitled to temporarily or, in a limited number of situations, permanently suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.

Amendment  53

 

Proposal for a regulation

Recital 48

 

Text proposed by the Commission

Amendment

(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.

deleted

___________

 

1 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).

 

Amendment  54

 

Proposal for a regulation

Recital 49

 

Text proposed by the Commission

Amendment

(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.

(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms that allow consumers to conclude distance contracts with traders should obtain additional information on the trader and on the products and services the trader intends to offer on the platform. The online platform should therefore be required to obtain information on the name, telephone number and electronic mail of the economic operator and the type of product or service the trader intends to offer on the online platform. Prior to offering its services to the trader, the online platform operator should make best efforts to assess whether the information provided by the trader is reliable. In addition, the platform should take adequate measures, such as, where applicable, random checks, to identify and prevent illegal content from appearing on its interface. The fulfilment of the obligations on the traceability of traders, products and services should make it easier for platforms that allow consumers to conclude distance contracts to comply with the obligation to inform consumers of the identity of their contracting party established under Directive 2011/83/EU of the European Parliament and of the Council, as well as with the obligations established under Regulation (EU) No 1215/2012 as regards the Member State in which consumers can pursue their consumer rights. The requirement to provide essential information should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary and no longer than six months after the end of a relationship with the trader, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a direct legitimate interest, including through the orders to provide information referred to in this Regulation.
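
Purely by way of illustration of the data points this recital enumerates, the sketch below models the trader information a marketplace might collect before onboarding, together with the six-month retention limit; all identifiers are hypothetical and carry no normative weight.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Hypothetical record of the essential trader information named in
    # recital 49: name, telephone number, electronic mail and the type of
    # product or service the trader intends to offer on the platform.
    @dataclass
    class TraderRecord:
        name: str
        telephone: str
        email: str
        offering_type: str               # e.g. "consumer electronics"
        relationship_ended: date | None = None

        def retention_deadline(self) -> date | None:
            # Recital 49: information is stored no longer than six months
            # after the end of the relationship with the trader (six months
            # approximated here as 183 days purely for illustration).
            if self.relationship_ended is None:
                return None
            return self.relationship_ended + timedelta(days=183)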

Amendment  55

 

Proposal for a regulation

Recital 50

 

Text proposed by the Commission

Amendment

(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 .

(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should, before allowing the display of products or services on their online interfaces, make reasonable efforts to assess the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the best efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 .

__________________

__________________

45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en

45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en

46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council

46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council

47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)

47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)

48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers

48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
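
As a minimal sketch of the "reasonable efforts" assessment described in recital 50 above, the code below checks a trader's VAT number against an official database; the endpoint URL, parameters and response shape are hypothetical stand-ins for whatever interface the VAT Information Exchange System or a national trade register actually exposes, and should not be read as that system's real API.

    import requests  # third-party HTTP library

    # Hypothetical endpoint standing in for an official database such as
    # the VAT Information Exchange System (VIES); the real interface
    # differs and should be consulted directly.
    REGISTER_URL = "https://example.invalid/register/lookup"

    def trader_information_is_plausible(country: str, vat_number: str,
                                        declared_name: str) -> bool:
        """Best-effort check, not a guarantee of reliability (recital 50)."""
        try:
            response = requests.get(
                REGISTER_URL,
                params={"country": country, "vat": vat_number},
                timeout=10,
            )
            response.raise_for_status()
        except requests.RequestException:
            # The recital requires no costly fact-finding; a failed lookup
            # would trigger a fallback, e.g. requesting supporting
            # documents from the trader.
            return False
        record = response.json()
        # Compare the register's record against the trader's declaration.
        return bool(record.get("valid")) and \
            record.get("name", "").strip().lower() == declared_name.strip().lower()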

Amendment  56

 

Proposal for a regulation

Recital 50 a (new)

 

Text proposed by the Commission

Amendment

 

(50a) Online platforms that allow consumers to conclude distance contracts with traders should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services, in compliance with the principle of no general monitoring obligation. Online platforms covered should inform recipients of the service when a service or product they have acquired through the platform is illegal.

Amendment  57

 

Proposal for a regulation

Recital 52

 

Text proposed by the Commission

Amendment

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. New advertising models have generated changes in the way information is presented and have created new personal data collection patterns and business models that might affect privacy, personal autonomy, democracy and quality news reporting, and that might facilitate manipulation and discrimination. Therefore, more transparency is needed in online advertising markets, and independent research needs to be carried out to assess the effectiveness of behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed, as well as the natural or legal person who finances the advertisement. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. In addition to these information obligations, online platforms should ensure that recipients of the service can refuse or withdraw their consent for targeted advertising purposes, in accordance with Regulation (EU) 2016/679, in a way that is no more difficult or time-consuming than giving their consent. Online platforms should also not use personal data for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising of minors. The online platform should not be obliged to maintain, acquire or process additional information in order to assess the age of the recipient of the service.
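
The requirement that withdrawing consent to targeted advertising be no more difficult or time-consuming than giving it can be read as a symmetry constraint on the consent interface. The sketch below, with entirely hypothetical names, makes that symmetry explicit by routing both actions through the same single-step operation; it is one possible reading, not a prescribed design.

    from datetime import datetime, timezone

    # Hypothetical one-step consent store: granting and withdrawing consent
    # for targeted advertising are deliberately the same single operation,
    # mirroring the parity the recital describes.
    class AdConsent:
        def __init__(self) -> None:
            self._states: dict[str, tuple[bool, datetime]] = {}

        def set_consent(self, user_id: str, granted: bool) -> None:
            # One call, one step, in both directions; no extra screens,
            # delays or confirmations on withdrawal.
            self._states[user_id] = (granted, datetime.now(timezone.utc))

        def is_granted(self, user_id: str) -> bool:
            # Absence of a record means no consent (no pre-ticked default),
            # consistent with Regulation (EU) 2016/679.
            state = self._states.get(user_id)
            return state[0] if state else False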

Amendment  58

 

Proposal for a regulation

Recital 52 a (new)

 

Text proposed by the Commission

Amendment

 

(52a) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients can understand how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them.

Amendment  59

 

Proposal for a regulation

Recital 53

 

Text proposed by the Commission

Amendment

(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.

(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no proportionate alternative and less restrictive measures that would effectively achieve the same result.

Amendment  60

 

Proposal for a regulation

Recital 54

 

Text proposed by the Commission

Amendment

(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.

(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. Accordingly, the number of average monthly recipients of the service should reflect the recipients actually reached by the service, either by being exposed to content or by providing content disseminated on the platform’s interface in that period of time.

Amendment  61

 

Proposal for a regulation

Recital 56

 

Text proposed by the Commission

Amendment

(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.

(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures where mitigation is possible without adversely impacting fundamental rights.

Amendment  62

 

Proposal for a regulation

Recital 57

 

Text proposed by the Commission

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

(57) Four categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination and amplification of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous and counterfeit products and illegally-traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the actual and foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, freedom of the press, human dignity, the right to private life, the right to gender equality, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. A fourth category of risks concerns any actual and foreseeable negative effects on the protection of public health, including behavioural addictions due to excessive use of a service or other serious negative effects to the person's physical, mental, social and financial well-being.

Amendment  63

 

Proposal for a regulation

Recital 58

 

Text proposed by the Commission

Amendment

(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.

(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment where mitigation is possible without adversely impacting fundamental rights. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and of content that is incompatible with their terms and conditions. They should also consider mitigation measures in case of malfunctioning or intentional manipulation and exploitation of the service, or in case of risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach of their terms and conditions or any other content having negative effects, by adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the recipients of the service. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. The decision as to the choice of measures should remain with the very large online platform. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. The Commission should evaluate the implementation and effectiveness of the mitigating measures and issue recommendations when the measures implemented are deemed inappropriate or ineffective to address the systemic risk at stake.

Amendment  64

 

Proposal for a regulation

Recital 59

 

Text proposed by the Commission

Amendment

(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.

(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations.

Amendment  65

 

Proposal for a regulation

Recital 60

 

Text proposed by the Commission

Amendment

(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.

(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through external independent auditing, for their compliance with the obligations laid down by this Regulation. In particular, audits should assess the clarity, coherence and predictable enforcement of terms of service, the completeness, methodology and consistency of the transparency reporting obligations, the accuracy, predictability and clarity of the provider's follow-up for recipients of the service and notice providers regarding notices of illegal content and terms of service violations, the accuracy of classification of removed information, the internal complaint handling mechanism, the interaction with trusted flaggers and assessment of their accuracy, the diligence with regard to the verification of the traceability of traders, the adequateness and correctness of the risk assessment, the adequateness and effectiveness of the risk mitigation measures taken and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the vetted auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Vetted auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. This guarantee should not be a means to circumvent the applicability of audit obligations in this Regulation applicable to very large online platforms. Auditors should be legally and financially independent and should not have any conflict of interest involving the very large online platform concerned or other very large online platforms, so as to be able to perform their tasks in an adequate and trustworthy manner. Additionally, vetted auditors and their employees should not have provided any service to the very large online platform audited for 12 months before the audit. They should also commit not to work for the very large online platform audited, or for a professional organisation or business association of which the platform is a member, for 12 months after their position in the auditing organisation has ended. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.

Amendment  66

 

Proposal for a regulation

Recital 61

 

Text proposed by the Commission

Amendment

(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.

(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. Where the audit opinion could not reach a conclusion for specific elements that fall within the scope of the audit, a statement of reasons for the failure to reach such a conclusion should be included in the audit opinion.

Amendment  67

 

Proposal for a regulation

Recital 62

 

Text proposed by the Commission

Amendment

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should let the recipients decide whether they want to be subject to recommender systems based on profiling and ensure that there is an option which is not based on profiling. In addition, online platforms should ensure that recipients are appropriately informed of the use of recommender systems, and that recipients can influence the information presented to them through making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible and user-friendly manner to ensure that the recipients understand how information is prioritised for them, the reason why, and how to modify the parameters used to curate the content presented for the recipients. Very large online platforms should implement appropriate technical and organisational measures for ensuring that recommender systems are designed in a consumer-friendly manner and do not influence end users’ behaviour through dark patterns.
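
To make the choice this recital describes concrete, the sketch below (all identifiers invented for illustration) offers the recipient two recommender options, with the non-profiling variant, here a simple reverse-chronological ordering, available on an equal footing; it is one possible reading, not a prescribed design.

    from typing import Callable

    # A content item is modelled as a plain dict carrying a publication
    # "timestamp" and a per-user relevance "score" assumed to be
    # precomputed by a profiling model; both keys are hypothetical.
    Item = dict

    # Two selectable orderings: one based on profiling, one not.
    # Reverse-chronological ordering is a common non-profiling baseline.
    RECOMMENDERS: dict[str, Callable[[list[Item]], list[Item]]] = {
        "profiled": lambda items: sorted(
            items, key=lambda i: i["score"], reverse=True),
        "chronological": lambda items: sorted(
            items, key=lambda i: i["timestamp"], reverse=True),
    }

    def feed_for(user_choice: str, items: list[Item]) -> list[Item]:
        # The recipient's stored choice decides which system runs;
        # defaulting to the non-profiling option is one cautious reading
        # of the recital, not a requirement it spells out.
        ranker = RECOMMENDERS.get(user_choice, RECOMMENDERS["chronological"])
        return ranker(items)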

Amendment  68

 

Proposal for a regulation

Recital 63

 

Text proposed by the Commission

Amendment

(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.

(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements, including the name of the product, service or brand and the object of the advertisement, and related data on the advertiser, and, if different, the natural or legal person who paid for the advertisement, and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
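
Read as a data-modelling requirement, the repository fields enumerated in this recital (the advertisement itself, the name of the product, service or brand, the object of the advertisement, the advertiser, the paying person where different, and delivery data) map onto a record like the hypothetical sketch below; real repositories may structure these differently.

    from dataclasses import dataclass, field

    # Hypothetical ad-repository entry collecting the fields the recital
    # enumerates; field names are illustrative only.
    @dataclass
    class AdRepositoryEntry:
        content: str                      # the advertisement itself
        product_or_brand: str             # name of product, service or brand
        ad_object: str                    # the object of the advertisement
        advertiser: str
        payer: str | None = None          # only where different from advertiser
        delivery_parameters: dict = field(default_factory=dict)  # e.g. targeting
        is_deep_fake: bool = False        # labelled where known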

Amendment  69

 

Proposal for a regulation

Recital 64

 

Text proposed by the Commission

Amendment

(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.

(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data and algorithms. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by vetted researchers and vetted not-for-profit bodies, organisations or associations into the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms for vetted researchers and vetted not-for-profit bodies, organisations or associations. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including personal data, trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. Vetted researchers, not-for-profit bodies, organisations or associations should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks.
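
As a loose, purely illustrative rendering of the access framework sketched in this recital, the hypothetical record below captures what a data-access request from a vetted researcher might carry; none of these field names or conditions are drawn from the Regulation itself.

    from dataclasses import dataclass

    # Hypothetical data-access request under the framework this recital
    # describes; field names are illustrative only.
    @dataclass
    class DataAccessRequest:
        requester: str                    # vetted researcher or not-for-profit body
        systemic_risk_studied: str        # e.g. "dissemination of illegal content"
        data_scope: list[str]             # kept proportionate to the purpose
        excludes_personal_data: bool      # protection of personal data
        confidentiality_commitment: bool  # trade secrets, security, integrity

    def is_admissible(req: DataAccessRequest) -> bool:
        # Proportionality and confidentiality safeguards as gating
        # conditions, echoing the recital's protections.
        return bool(req.data_scope) and req.confidentiality_commitment \
            and req.excludes_personal_data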

Amendment  70

 

Proposal for a regulation

Recital 66

 

Text proposed by the Commission

Amendment

(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.

(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, ensuring the interoperability of advertisement repositories, or covering terms and conditions. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. In the absence of relevant standards agreed within [24 months after the entry into force of this Regulation], the Commission should be able to establish technical specifications by means of implementing acts until a voluntary standard is agreed.
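
Since this recital points to standardised notice submission through application programming interfaces, the sketch below shows one hypothetical JSON payload a small provider might accept; the field set is invented for illustration and is not drawn from any agreed standard or technical specification.

    import json

    # Hypothetical notice-submission payload for a standardised API;
    # no such standard exists yet, and the recital leaves it to industry
    # or, failing that, to Commission technical specifications.
    notice = {
        "content_url": "https://platform.example/item/123",
        "alleged_illegality": "counterfeit product",
        "explanation": "Listing offers branded goods without authorisation.",
        "notifier_contact": "notifier@example.org",
        "good_faith_declaration": True,
    }

    def submit_notice(payload: dict) -> bytes:
        # Serialisation only; transport (e.g. an HTTPS POST to the
        # provider's endpoint) is deliberately left out of the sketch.
        return json.dumps(payload).encode("utf-8")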

Amendment  71

 

Proposal for a regulation

Recital 67

 

Text proposed by the Commission

Amendment

(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.

(67) The Commission and the Board should encourage the drawing-up of codes of conduct, as well as compliance with the provisions of those codes, to contribute to the application of this Regulation. The Commission and the Board should aim to ensure that the codes of conduct clearly define the nature of the public interest objectives being addressed, that they contain mechanisms for independent evaluation of the achievement of those objectives and that the role of competent authorities is clearly defined. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.

Amendment  72

 

Proposal for a regulation

Recital 68

 

Text proposed by the Commission

Amendment

(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.

(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation, or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure.

Amendment  73

 

Proposal for a regulation

Recital 69

 

Text proposed by the Commission

Amendment

(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.

(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. The Commission should also encourage the development of codes of conduct to facilitate compliance with obligations in areas such as the protection of minors or short-term rentals. Other areas for consideration could be promoting the diversity of information through support for high-quality journalism and fostering the credibility of information, whilst respecting the confidentiality of journalistic sources. Moreover, it is important to ensure consistency with already existing enforcement mechanisms, such as those in the area of electronic communications or media, and with independent regulatory structures in these fields as defined by Union and national law.

Amendment  74

 

Proposal for a regulation

Recital 70

 

Text proposed by the Commission

Amendment

(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.

(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. The effectiveness of the codes of conduct should be regularly assessed. Unlike legislation, codes of conduct are not subject to democratic scrutiny and their compliance with fundamental rights is not subject to judicial review. In order to enhance accountability, participation and transparency, procedural safeguards for drawing up codes of conduct are needed. Before initiating or facilitating the drawing-up or the revision of codes of conduct, the Commission may invite, where appropriate, the Fundamental Rights Agency or the European Data Protection Supervisor to express their opinion.

Amendment  75

 

Proposal for a regulation

Recital 71

 

Text proposed by the Commission

Amendment

(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.

(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.

Amendment  76

 

Proposal for a regulation

Recital 72

 

Text proposed by the Commission

Amendment

(72) The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should appoint at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure.

(72) The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should designate at least one authority with the task to apply and enforce this Regulation. Member States should however be able to entrust more than one competent authority, with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications’ regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure.

Amendment  77

 

Proposal for a regulation

Recital 73

 

Text proposed by the Commission

Amendment

(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level.

(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be designated as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level.

Amendment  78

 

Proposal for a regulation

Recital 74

 

Text proposed by the Commission

Amendment

(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.

(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities have the necessary financial and human resources to carry out their tasks under this Regulation. It is also necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.

Amendment  79

 

Proposal for a regulation

Recital 75

 

Text proposed by the Commission

Amendment

(75) Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to apply and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members.

(75) Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to supervise the application and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members.

Amendment  80

 

Proposal for a regulation

Recital 76

 

Text proposed by the Commission

Amendment

(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.

(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in this Regulation by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.

Amendment  81

 

Proposal for a regulation

Recital 77

 

Text proposed by the Commission

Amendment

(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation.

(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to adopt proportionate interim measures in case of risk of serious harm, as well as to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation.

Amendment  82

 

Proposal for a regulation

Recital 78

 

Text proposed by the Commission

Amendment

(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation.

(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure consistent and uniform application of this Regulation, the Commission should adopt guidance on the rules and procedures related to the powers of Digital Services Coordinators.

Amendment  83

 

Proposal for a regulation

Recital 79

 

Text proposed by the Commission

Amendment

(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence, and, the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.

(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence, and, the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.

Amendment  84

 

Proposal for a regulation

Recital 80

 

Text proposed by the Commission

Amendment

(80) Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, whether the provider is active in several Member States.

(80) Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, the number of recipients affected, the intentional or negligent character of the infringement and whether the provider is active in several Member States. The Commission should issue guidance to Member States concerning the criteria and conditions to impose proportionate penalties.

Amendment  85

 

Proposal for a regulation

Recital 81

 

Text proposed by the Commission

Amendment

(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation.

(81) In order to ensure effective enforcement of the obligations laid down in this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. The Digital Services Coordinator of establishment should assess the complaint in a timely manner and inform the Digital Services Coordinator of the Member State where the recipient resides or is established of how the complaint has been handled.

Amendment  86

 

Proposal for a regulation

Recital 82

 

Text proposed by the Commission

Amendment

(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services are being used by a third party to infringe an intellectual property right, are not reasonably available.

(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements of this Regulation. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services are being used by a third party to infringe an intellectual property right, are not reasonably available.

Amendment  87

 

Proposal for a regulation

Recital 83 a (new)

 

Text proposed by the Commission

Amendment

 

(83a) Without prejudice to the provisions on the exemption from liability, provided for in this Regulation as regards the information transmitted or stored at the request of a recipient of the service, providers of intermediary services should be liable for the infringement of their obligations laid down in this Regulation. Recipients of the service and organisations representing them should be entitled to have access to proportionate and effective remedies. They should in particular have the right to seek, in accordance with national or Union law, compensation from those providers of intermediary services for any direct damage or loss suffered due to an infringement by providers of intermediary services of obligations established under this Regulation.

Amendment  88

 

Proposal for a regulation

Recital 84

 

Text proposed by the Commission

Amendment

(84) The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State.

(84) The Digital Services Coordinator should regularly publish a report in a standardised and machine-readable format on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, based on the Internal Market Information system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State.

Amendment  89

 

Proposal for a regulation

Recital 86

 

Text proposed by the Commission

Amendment

(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.

(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation on the basis of an agreement between the Member States concerned, and in the absence of agreement, under the authority of the Digital Services Coordinator of the Member State of establishment. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.

Amendment  90

 

Proposal for a regulation

Recital 88

 

Text proposed by the Commission

Amendment

(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.

(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. The rules of procedure of the Board should ensure that the confidentiality of the information is respected.

Amendment  91

 

Proposal for a regulation

Recital 90

 

Text proposed by the Commission

Amendment

(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While not legally binding, the decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.

(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While not legally binding, the decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation. The Board should draw up an annual report regarding its activities.

Amendment  92

 

Proposal for a regulation

Recital 91

 

Text proposed by the Commission

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as gender equality and non-discrimination, eradication of all forms of violence against women and girls and other forms of gender-based violence, data protection, respect for intellectual property, competition, electronic communications, audiovisual services, market surveillance, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment  93

 

Proposal for a regulation

Recital 96

 

Text proposed by the Commission

Amendment

(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.

(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission should, on its own initiative or upon advice of the Board, initiate further investigation on the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform. The Commission should initiate proceedings in view of the possible adoption of decisions in respect of the relevant conduct by the very large online platform, for example where that platform is suspected of having infringed this Regulation, including where the platform has been found not to implement the operational recommendations from the independent audit that has been endorsed by the Digital Services Coordinator of establishment and where the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures.

Amendment  94

 

Proposal for a regulation

Recital 97

 

Text proposed by the Commission

Amendment

(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.

(97) Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.

Amendment  95

 

Proposal for a regulation

Recital 97 a (new)

 

Text proposed by the Commission

Amendment

 

(97a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.

Amendment  96

 

Proposal for a regulation

Recital 99

 

Text proposed by the Commission

Amendment

(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or than individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers

(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.

Amendment  97

 

Proposal for a regulation

Recital 100

 

Text proposed by the Commission

Amendment

(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods.

(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. The Commission should in particular ensure that the penalties are effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and nature of activities carried out, the number of recipients affected, the intentional or negligent character of the infringement as well as the economic capacity of the infringer.

Amendment  98

 

Proposal for a regulation

Recital 102

 

Text proposed by the Commission

Amendment

(102) In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and on its structure.

(102) The Commission should carry out a general evaluation of this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. This report should address in particular the definition of very large online platforms and the number of average monthly active recipients of the service. This report should also address the implementation of codes of conduct, as well as the obligation to designate a representative established in the Union, and assess the effect of similar obligations imposed by third countries on European service providers operating abroad. In particular, the Commission should assess any impact of the costs to European service providers of any similar requirements, including to designate a legal representative, introduced by third countries and any new barriers to non-Union market access after the adoption of this Regulation. The Commission should also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union. In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within three years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and on its structure.

Amendment  99

 

Proposal for a regulation

Article 1 – title

 

Text proposed by the Commission

Amendment

Subject matter and scope

Subject matter

Amendment  100

 

Proposal for a regulation

Article 1 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

(c) rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities.

Amendment  101

 

Proposal for a regulation

Article 1 – paragraph 2 – point b

 

Text proposed by the Commission

Amendment

(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.

(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.

Amendment  102

 

Proposal for a regulation

Article 1 – paragraph 2 – point b a (new)

 

Text proposed by the Commission

Amendment

 

(ba) promote a high level of consumer protection and contribute to increased consumer choice while facilitating innovation, support digital transition and encourage economic growth within the internal market.

Amendment  103

 

Proposal for a regulation

Article 1 – paragraph 3

 

Text proposed by the Commission

Amendment

3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.

deleted

Amendment  104

 

Proposal for a regulation

Article 1 – paragraph 4

 

Text proposed by the Commission

Amendment

4. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.

deleted

Amendment  105

 

Proposal for a regulation

Article 1 – paragraph 5

 

Text proposed by the Commission

Amendment

5. This Regulation is without prejudice to the rules laid down by the following:

deleted

(a) Directive 2000/31/EC;

 

(b) Directive 2010/13/EC;

 

(c) Union law on copyright and related rights;

 

(d) Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted];

 

(e) Regulation (EU) …./…. on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./…. laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted];

 

(f) Regulation (EU) 2019/1148;

 

(g) Regulation (EU) 2019/1150;

 

(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394;

 

(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC.

 

Amendment  106

 

Proposal for a regulation

Article 1 a (new)

 

Text proposed by the Commission

Amendment

 

Article 1a

 

Scope

 

1. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.

 

2. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.

 

3. This Regulation is without prejudice to the rules laid down by the following:

 

(a) Directive 2000/31/EC;

 

(b) Directive 2010/13/EC;

 

(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;

 

(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;

 

(e) Regulation (EU) …./…. on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./…. laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted];

 

(f) Regulation (EU) 2019/1148;

 

(g) Regulation (EU) 2019/1150;

 

(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Directive 2001/95/EC on general product safety;

 

(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC;

 

(j) Directive (EU) 2019/882;

 

(k) Directive (EU) 2018/1972;

 

(l) Directive 2013/11/EU.

 

4. By [12 months after the entry into force of this Regulation] the Commission shall publish guidelines with regard to the relationship between this Regulation and the legal acts referred to in Article 1a (3).

Amendment  107

 

Proposal for a regulation

Article 2 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) ‘information society services’ means services within the meaning of Article 1(1)(b) of Directive (EU) 2015/1535;

(a) ‘information society services’ means services as defined in Article 1(1)(b) of Directive (EU) 2015/1535;

Amendment  108

 

Proposal for a regulation

Article 2 – paragraph 1 – point b

 

Text proposed by the Commission

Amendment

(b) ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service;

(b) ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service in order to seek information or to make it accessible;

Amendment  109

 

Proposal for a regulation

Article 2 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession;

(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;

Amendment  110

 

Proposal for a regulation

Article 2 – paragraph 1 – point d – introductory part

 

Text proposed by the Commission

Amendment

(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as:

(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of a provider of information society services which has a substantial connection to the Union;

Amendment  111

 

Proposal for a regulation

Article 2 – paragraph 1 – point d – indent 1

 

Text proposed by the Commission

Amendment

 a significant number of users in one or more Member States; or

deleted

Amendment  112

 

Proposal for a regulation

Article 2 – paragraph 1 – point d – indent 2

 

Text proposed by the Commission

Amendment

 the targeting of activities towards one or more Member States.

deleted

Amendment  113

 

Proposal for a regulation

Article 2 – paragraph 1 – point d a (new)

 

Text proposed by the Commission

Amendment

 

(da) ‘substantial connection to the Union’ means the connection of a provider with one or more Member States resulting either from its establishment in the Union, or in the absence of such an establishment, from the fact that the provider directs its activities towards one or more Member States;

Amendment  114

 

Proposal for a regulation

Article 2 – paragraph 1 – point e

 

Text proposed by the Commission

Amendment

(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession;

Amendment  115

 

Proposal for a regulation

Article 2 – paragraph 1 – point f – indent 1

 

Text proposed by the Commission

Amendment

 a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;

 a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services;

Amendment  116

 

Proposal for a regulation

Article 2 – paragraph 1 – point f – indent 2

 

Text proposed by the Commission

Amendment

 a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

 a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

Amendment  117

 

Proposal for a regulation

Article 2 – paragraph 1 – point g

 

Text proposed by the Commission

Amendment

(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;

(g) ‘illegal content’ means any information or activity, including the sale of products or provision of services which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;

Amendment  118

 

Proposal for a regulation

Article 2 – paragraph 1 – point h

 

Text proposed by the Commission

Amendment

(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.

(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.

Amendment  119

 

Proposal for a regulation

Article 2 – paragraph 1 – point k

 

Text proposed by the Commission

Amendment

(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enables the recipients of the service to access and interact with the relevant intermediary service;

Amendment  120

 

Proposal for a regulation

Article 2 – paragraph 1 – point k a (new)

 

Text proposed by the Commission

Amendment

 

(ka) ‘trusted flagger’ means an entity that has been awarded such status by a Digital Services Coordinator;

Amendment  121

 

Proposal for a regulation

Article 2 – paragraph 1 – point n

 

Text proposed by the Commission

Amendment

(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information;

(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that message;

Amendment  122

 

Proposal for a regulation

Article 2 – paragraph 1 – point n a (new)

 

Text proposed by the Commission

Amendment

 

(na) 'remuneration' means economic compensation consisting of direct or indirect payment for the service provided, including where the intermediary service provider is not directly compensated by the recipient of the service or where the recipient of the service provides data to the service provider, except where such data is collected for the sole purpose of meeting legal requirements;

Amendment  123

 

Proposal for a regulation

Article 2 – paragraph 1 – point o

 

Text proposed by the Commission

Amendment

(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;

(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or curate in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
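
For illustration only, and not part of the legislative text: a minimal Python sketch of a fully automated step that, in the words of the definition, determines "the relative order or prominence of information displayed". The scoring fields and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    relevance: float   # e.g. match against a search initiated by the recipient
    recency: float     # e.g. newer items score higher

def rank(items: list[Item], w_relevance: float = 0.7,
         w_recency: float = 0.3) -> list[Item]:
    # Prioritise items by a weighted score; the resulting order is what
    # a recipient would then see in the online interface.
    return sorted(
        items,
        key=lambda i: w_relevance * i.relevance + w_recency * i.recency,
        reverse=True,
    )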

Amendment  124

 

Proposal for a regulation

Article 2 – paragraph 1 – point p

 

Text proposed by the Commission

Amendment

(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;

(p) ‘content moderation’ means the activities, whether automated or not, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;

Amendment  125

 

Proposal for a regulation

Article 2 – paragraph 1 – point q

 

Text proposed by the Commission

Amendment

(q) ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.

(q) ‘terms and conditions’ means all terms and conditions or specifications by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.

Amendment  126

 

Proposal for a regulation

Article 2 – paragraph 1 – point q a (new)

 

Text proposed by the Commission

Amendment

 

(qa) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882.

Amendment  127

 

Proposal for a regulation

Article 3 – paragraph 3

 

Text proposed by the Commission

Amendment

3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

3. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Amendment  128

 

Proposal for a regulation

Article 4 – paragraph 1 – introductory part

 

Text proposed by the Commission

Amendment

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that:

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or secure the information's onward transmission to other recipients of the service upon their request, on condition that the provider:

Amendment  129

 

Proposal for a regulation

Article 4 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) the provider does not modify the information;

(a) does not modify the information;

Amendment  130

 

Proposal for a regulation

Article 4 – paragraph 1 – point b

 

Text proposed by the Commission

Amendment

(b) the provider complies with conditions on access to the information;

(b) complies with conditions on access to the information;

Amendment  131

 

Proposal for a regulation

Article 4 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;

(c) complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;

Amendment  132

 

Proposal for a regulation

Article 4 – paragraph 1 – point d

 

Text proposed by the Commission

Amendment

(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and

(d) does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and

Amendment  133

 

Proposal for a regulation

Article 4 – paragraph 1 – point e

 

Text proposed by the Commission

Amendment

(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

(e) acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

Amendment  134

 

Proposal for a regulation

Article 4 – paragraph 2

 

Text proposed by the Commission

Amendment

2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

2. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Amendment  135

 

Proposal for a regulation

Article 5 – paragraph 3

 

Text proposed by the Commission

Amendment

3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.

3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.

Amendment  136

 

Proposal for a regulation

Article 5 – paragraph 4

 

Text proposed by the Commission

Amendment

4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

4. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Amendment  137

 

Proposal for a regulation

Article 6 – paragraph 1

 

Text proposed by the Commission

Amendment

Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.

1. Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or take measures aimed at detecting, identifying and removing, or disabling of access to, illegal content or take the necessary measures to comply with the requirements of national and Union law, including the Charter and the requirements set out in this Regulation.

Amendment  138

 

Proposal for a regulation

Article 6 – paragraph 1 a (new)

 

Text proposed by the Commission

Amendment

 

1a. Providers of intermediary services shall ensure that voluntary own-initiative investigations carried out and measures taken pursuant to paragraph 1 are effective and specific. Such own-initiative investigations and measures shall be accompanied by appropriate safeguards, such as human oversight, documentation, or any additional measure to ensure and demonstrate that those investigations and measures are accurate, non-discriminatory, proportionate, transparent and do not lead to over-removal of content. Providers of intermediary services shall make best efforts to ensure that, where automated means are used, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content.
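
For illustration only, and not part of the legislative text: a minimal Python sketch, under assumed confidence thresholds, of one way the safeguards in paragraph 1a (human oversight, documentation, avoiding over-removal) might be arranged around an automated detection step. The classifier score, the thresholds and all names are hypothetical.

from dataclasses import dataclass

@dataclass
class ModerationDecision:
    content_id: str
    score: float   # automated classifier's confidence that content is illegal
    action: str    # "remove", "human_review" or "keep"

audit_log: list[ModerationDecision] = []   # documentation safeguard

def moderate(content_id: str, score: float,
             remove_at: float = 0.95, review_at: float = 0.60) -> ModerationDecision:
    if score >= remove_at:
        action = "remove"         # high confidence: act, but keep a record
    elif score >= review_at:
        action = "human_review"   # uncertain: route to human oversight
    else:
        action = "keep"           # low confidence: avoid over-removal
    decision = ModerationDecision(content_id, score, action)
    audit_log.append(decision)
    return decision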

Amendment  139

 

Proposal for a regulation

Article 7 – paragraph 1

 

Text proposed by the Commission

Amendment

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.

1. No general obligation to monitor, whether de jure or de facto, through automated or non-automated means, the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity, nor to monitor the behaviour of natural persons, shall be imposed on those providers.

Amendment  140

 

Proposal for a regulation

Article 7 – paragraph 1 a (new)

 

Text proposed by the Commission

Amendment

 

1a. Providers of intermediary services shall not be obliged to use automated tools for content moderation or for monitoring the behaviour of natural persons.

Amendment  141

 

Proposal for a regulation

Article 7 – paragraph 1 b (new)

 

Text proposed by the Commission

Amendment

 

1b. Member States shall not prevent providers of intermediary services from offering end-to-end encrypted services.

Amendment  142

 

Proposal for a regulation

Article 7 – paragraph 1 c (new)

 

Text proposed by the Commission

Amendment

 

1c. Member States shall not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. Member States shall not oblige providers of intermediary services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or national law.

Amendment  143

 

Proposal for a regulation

Article 8 – paragraph 1

 

Text proposed by the Commission

Amendment

1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.

1. Providers of intermediary services shall, upon the receipt, via a secure communications channel, of an order to act against one or more specific items of illegal content, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the actions taken and the moment when the actions were taken.

Amendment  144

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent -1 (new)

 

Text proposed by the Commission

Amendment

 

 a reference to the legal basis for the order;

Amendment  145

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent 1

 

Text proposed by the Commission

Amendment

 a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;

 a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law in conformity with Union law;

Amendment  146

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent 1 a (new)

 

Text proposed by the Commission

Amendment

 

 identification of the issuing authority, including the date, timestamp and electronic signature of the authority, allowing the recipient to authenticate the order, and the contact details of a contact person within the said authority;
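
For illustration only, and not part of the legislative text: a minimal Python sketch of authenticating an order by its electronic signature, as this indent envisages, assuming the issuing authority signs orders with an Ed25519 key and using the third-party 'cryptography' package. The order format and the function name are hypothetical.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def authenticate_order(order_bytes: bytes, signature: bytes,
                       authority_key: Ed25519PublicKey) -> bool:
    # Returns True only if the signature matches the order contents,
    # i.e. the recipient can attribute the order to the issuing authority.
    try:
        authority_key.verify(signature, order_bytes)
        return True
    except InvalidSignature:
        return False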

Amendment  147

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent 2

 

Text proposed by the Commission

Amendment

 one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned;

 a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate or, when the exact electronic location is not precisely identifiable, additional information enabling the identification of the illegal content concerned;

Amendment  148

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent 3

 

Text proposed by the Commission

Amendment

 information about redress available to the provider of the service and to the recipient of the service who provided the content;

 easily understandable information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including the deadlines for appeal;

Amendment  149

 

Proposal for a regulation

Article 8 – paragraph 2 – point a – indent 3 a (new)

 

Text proposed by the Commission

Amendment

 

 where necessary and proportionate, the decision not to disclose information about the removal of, or disabling of access to, the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, for a period not exceeding six weeks from that decision;

Amendment  150

 

Proposal for a regulation

Article 8 – paragraph 2 – point b

 

Text proposed by the Commission

Amendment

(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective;

(b) the territorial scope of the order on the basis of the applicable rules of Union and national law in conformity with Union law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; the territorial scope of the order shall be limited to the territory of the Member State issuing the order unless the illegality of the content derives directly from Union law or the rights at stake require a wider territorial scope, in accordance with Union and international law;

Amendment  151

 

Proposal for a regulation

Article 8 – paragraph 2 – point c

 

Text proposed by the Commission

Amendment

(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.

(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10, or in one of the official languages of the Member State that issues the order against the specific item of illegal content; in such a case, the point of contact of the service provider may request the competent authority to provide a translation into the language declared by the provider;

Amendment  152

 

Proposal for a regulation

Article 8 – paragraph 2 – point c a (new)

 

Text proposed by the Commission

Amendment

 

(ca) the order is in compliance with Article 3 of Directive 2000/31/EC;

Amendment  153