REPORT on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse

16.11.2023 - (COM(2022)0209 – C9‑0174/2022 – 2022/0155(COD)) - ***I

Committee on Civil Liberties, Justice and Home Affairs
Rapporteur: Javier Zarzalejos
Rapporteurs for the opinions of associated committees pursuant to Rule 57 of the Rules of Procedure:
Alex Agius Saliba, Committee on the Internal Market and Consumer Protection

DRAFT EUROPEAN PARLIAMENT LEGISLATIVE RESOLUTION

on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse

(COM(2022)0209 – C9‑0174/2022 – 2022/0155(COD))

(Ordinary legislative procedure: first reading)

The European Parliament,

 having regard to the Commission proposal to Parliament and the Council (COM(2022)0209),

 having regard to Article 294(2) and Article 114 of the Treaty on the Functioning of the European Union, pursuant to which the Commission submitted the proposal to Parliament (C9‑0174/2022),

 having regard to Article 294(3) of the Treaty on the Functioning of the European Union,

 having regard to the reasoned opinions submitted, within the framework of Protocols No 1 and 2 to the EU Treaties, by the Spanish Parliament, the Netherlands Senate, the Irish Houses of the Oireachtas, the French Senate and the Czech Chamber of Deputies,

 having regard to the opinion of the European Economic and Social Committee of 21 September 2022[1],

 having regard to Rule 59 of its Rules of Procedure,

 having regard to the opinions of the Committee on the Internal Market and Consumer Protection, the Committee on Budgets, the Committee on Culture and Education and the Committee on Women’s Rights and Gender Equality,

 having regard to the report of the Committee on Civil Liberties, Justice and Home Affairs (A9-0364/2023),

1. Adopts its position at first reading hereinafter set out;

2. Approves its statement annexed to this resolution, which will be published in the L series of the Official Journal of the European Union together with the final legislative act;

3. Calls on the Commission to refer the matter to Parliament again if it replaces, substantially amends or intends to substantially amend its proposal;

4. Instructs its President to forward its position to the Council, the Commission and the national parliaments.

Amendment  1

 

Proposal for a regulation

Recital 1

 

Text proposed by the Commission

Amendment

(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children.

(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children. However, these services are also used by perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that often cause long-lasting negative consequences on victims and that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely in a trusted online environment, especially by children.

Amendment  2

 

Proposal for a regulation

Recital 2

 

Text proposed by the Commission

Amendment

(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services.

(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being in a unique position to prevent and combat such abuse. The measures taken should be effective, targeted, evidence-based, carefully balanced, and proportionate, and subject to constant review so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid directly or indirectly imposing any excessive burdens on the providers of the services.

Amendment  3

 

Proposal for a regulation

Recital 3

 

Text proposed by the Commission

Amendment

(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.

(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse and more generally to protect children online, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which sometimes diverge, can have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.

Amendment  4

 

Proposal for a regulation

Recital 4

 

Text proposed by the Commission

Amendment

(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper innovation.

(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform, effective, proportionate and carefully balanced rules to prevent and combat child sexual abuse in a manner that is effective, targeted and proportionate, and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so that they stimulate innovation and technological development to prevent and combat online child sexual abuse.

Amendment  5

Compromise amendment replacing Amendment(s): 310, 311

 

Proposal for a regulation

Recital 5

 

Text proposed by the Commission

Amendment

(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those service as publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.

(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number-independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of online games, image-sharing and video-hosting are also at risk of misuse for the purpose of online child sexual abuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner without lowering child protection standards.

Amendment  6

 

Proposal for a regulation

Recital 6

 

Text proposed by the Commission

Amendment

(6) Online child sexual abuse frequently involves the misuse of information society services offered in the Union by providers established in third countries. In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to all providers, irrespective of their place of establishment or residence, that offer services in the Union, as evidenced by a substantial connection to the Union.

(6) Online child sexual abuse can also involve the misuse of information society services offered in the Union by providers established in third countries. In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to all providers, irrespective of their place of establishment or residence, that offer services in the Union, as evidenced by a substantial connection to the Union.

Amendment  7

 

Proposal for a regulation

Recital 7

 

Text proposed by the Commission

Amendment

(7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93 of the European Parliament and of the Council38 , Directive 2000/31/EC of the European Parliament and of the Council39 and Regulation (EU) …/… of the European Parliament and of the Council40 [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC], Directive 2010/13/EU of the European Parliament and of the Council41 , Regulation (EU) 2016/679 of the European Parliament and of the Council42 , and Directive 2002/58/EC of the European Parliament and of the Council43 .

(7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93 of the European Parliament and of the Council38 , Directive 2000/31/EC of the European Parliament and of the Council39 and Regulation (EU) 2022/2065 of the European Parliament and of the Council40, Directive 2010/13/EU of the European Parliament and of the Council41 , Regulation (EU) 2016/679 of the European Parliament and of the Council42 , and Directive 2002/58/EC of the European Parliament and of the Council43 .

__________________

38 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).

39 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).

40 Regulation (EU) …/… of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (OJ L ….).

41 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media service (OJ L 95, 15.4.2010, p. 1).

42 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (OJ L 119, 4.5.2016, p. 1).

43 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (‘Directive on privacy and electronic communications’) (OJ L 201, 31.7.2002, p. 37).

 

Amendment  8

 

Proposal for a regulation

Recital 8

 

Text proposed by the Commission

Amendment

(8) This Regulation should be considered lex specialis in relation to the generally applicable framework set out in Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] laying down harmonised rules on the provision of certain information society services in the internal market. The rules set out in Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] apply in respect of issues that are not or not fully addressed by this Regulation.

(8) This Regulation should be considered lex specialis in relation to the generally applicable framework set out in Regulation (EU) 2022/2065 laying down harmonised rules on the provision of certain information society services in the internal market. The rules set out in Regulation (EU) 2022/2065 apply in respect of issues that are not or not fully addressed by this Regulation.

Amendment  9 

Proposal for a regulation

Recital 9 a (new)

 

Text proposed by the Commission

Amendment

 

(9a) Encryption, and especially end-to-end encryption, is an increasingly important tool to guarantee the security and confidentiality of the communications of all users, including children. Any restrictions or undermining of the end-to-end encryption can be used and abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting, weakening or undermining end-to-end encryption. Providers of information society services should under no circumstances be prevented from providing their services using the highest standards of encryption, considering that such encryption is essential for trust in and security of the digital services.

Amendment  10

 

Proposal for a regulation

Recital 10

 

Text proposed by the Commission

Amendment

(10) In the interest of clarity and consistency, the definitions provided for in this Regulation should, where possible and appropriate, be based on and aligned with the relevant definitions contained in other acts of Union law, such as Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].

(10) In the interest of clarity and consistency, the definitions provided for in this Regulation should, where possible and appropriate, be based on and aligned with the relevant definitions contained in other acts of Union law, such as Regulation (EU) 2022/2065.

Amendment  11

 

Proposal for a regulation

Recital 11

 

Text proposed by the Commission

Amendment

(11) A substantial connection to the Union should be considered to exist where the relevant information society services has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council44 . Mere technical accessibility of a website from the Union should not, alone, be considered as establishing a substantial connection to the Union.

(11) A substantial connection to the Union should be considered to exist where the relevant information society services has an establishment in the Union or, in its absence, where the number of recipients of the service in one or more Member States is significant in relation to its or their population, or on the basis of the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council44 . Mere technical accessibility of a website from the Union should not, on that ground alone, be considered as establishing a substantial connection to the Union.

__________________

__________________

44 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

44 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

Amendment  12

 

Proposal for a regulation

Recital 14

 

Text proposed by the Commission

Amendment

(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available interpersonal communications services should assess such risk for each of the services that they offer in the Union. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment is updated regularly and when needed for particular reasons.

(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number-independent interpersonal communications services should assess such risk stemming, inter alia, from the design, functioning and use of the services that they offer in the Union. That risk assessment should be specific to the services they offer and proportionate to the risk considering its severity and probability. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment is updated regularly and when needed for particular reasons.

Amendment  13

 

Proposal for a regulation

Recital 14 a (new)

 

Text proposed by the Commission

Amendment

 

(14a) The obligation to conduct a risk assessment should apply, in any case, to very large online platforms and to those providers which are substantially exposed to online child sexual abuse. Providers that qualify as small and micro enterprises as defined in Commission Recommendation 2003/361/EC should carry out a simplified risk assessment. Irrespective of their size or their substantial exposure to online child sexual abuse, providers of online games that operate a number-independent interpersonal communications service within their games, platforms primarily used for the dissemination of pornographic content and providers offering services directly targeting children should carry out a risk assessment.

Amendment  14

 

Proposal for a regulation

Recital 15

 

Text proposed by the Commission

Amendment

(15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public. For the purposes of the present Regulation, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.

(15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 with respect to information that they store and disseminate to the public. For the purposes of the present Regulation, and in order to ensure consistency and avoid unnecessary burdens and duplications, those providers may draw on such a risk assessment for the purpose of the risk assessment under this Regulation and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.

Amendment  15

 

Proposal for a regulation

Recital 16

 

Text proposed by the Commission

Amendment

(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.

(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.

Amendment  16

 

Proposal for a regulation

Recital 17

 

Text proposed by the Commission

Amendment

(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect online child sexual abuse in their services and indicate as part of the risk reporting their willingness and preparedness to eventually being issued a detection order under this Regulation, if deemed necessary by the competent national authority.

(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect and prevent online child sexual abuse in their services. Mitigation measures should aim to contribute to preventing child sexual abuse from happening in the first place, and consequently detection orders should be issued only to providers that have failed to take all reasonable and proportionate mitigation measures to address the risk identified.

Amendment  17

 

Proposal for a regulation

Recital 17 a (new)

 

Text proposed by the Commission

Amendment

 

(17a) Online platforms primarily used for the dissemination of pornographic content and providers of online games falling under the scope of this Regulation should take additional technical and organisational measures to ensure safety and security by design and by default for children.

Amendment  18

 

Proposal for a regulation

Recital 18

 

Text proposed by the Commission

Amendment

(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.

(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights, and to avoiding measures that disproportionately affect people experiencing intersectional discrimination, including on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation. Particular care should be taken to assess the impact on girls, who are at a greater risk of being subject to child sexual abuse and gender-based violence. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the ongoing effectiveness of the measures, the financial and technological capabilities and the size of the provider concerned. Therefore, mitigation measures should always be the least intrusive option possible. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission.
Clear targets, oversight, review and adaptation, led by the competent authorities, are needed to avoid measures becoming redundant, disproportionate, ineffective, counterproductive or outdated. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.

Amendment  19

 

Proposal for a regulation

Recital 18 a (new)

 

Text proposed by the Commission

Amendment

 

(18a) Parental control features and functionalities should be limited to allowing parents or guardians to prevent children from accessing platforms or services that are inappropriate for their age or fall under an age-restriction applicable under national law, or to help prevent them from being exposed to content that is inappropriate. Those measures should be in accordance with Regulation (EU) 2016/679 and the Convention on the Rights of the Child, in particular General Comment 25 (2021) on children’s rights in relation to the digital environment, respect the integrity and safety of the device and not allow unauthorised access or control by third parties.

Amendment  20

 

Proposal for a regulation

Recital 18 b (new)

 

Text proposed by the Commission

Amendment

 

(18b) Providers should have to establish and operate an accessible, age-appropriate, child-friendly and user-friendly reporting mechanism that allows any user or entity to flag to them, or notify them of, the presence of potential online child sexual abuse on their services, including self-generated material.

Amendment  21

 

Proposal for a regulation

Recital 18 c (new)

 

Text proposed by the Commission

Amendment

 

(18c) Providers that have identified a risk of use of their services for the purpose of the solicitation of children should be able to take age verification measures. The implementation of technical procedures to verify the age of users is likely to result in the processing of personal data. Such processing is particularly sensitive in view of its purpose and is subject to Regulation (EU) 2016/679. Age verification systems should strictly comply with the principle of data minimisation. In addition, the requirement to set up an age verification system for the legitimate purpose of protecting minors provided for in this Regulation does not justify a general obligation to identify oneself prior to consulting any site offering content. Being able, in principle, to benefit from online public communication services without having to identify oneself, or by using pseudonyms, contributes to the freedom to inform oneself and to the protection of users' privacy. This is an essential element in the exercise of these freedoms on the Internet. Providers should use systems that provide proof of age without revealing the identity of the user as foreseen in Regulation .../... amending Regulation (EU) No 910/2014 as regards establishing a framework for a European Digital Identity. Such services could, for example, be based on a trusted third-party organisation, which would have to incorporate a double anonymity mechanism preventing the trusted third party from identifying the site or application at the origin of a verification request, on the one hand, and preventing the transmission of identifying data relating to the user to the site or application, on the other. The means of proof should therefore be in the hands of its bearer and limited to a single age attribute.
The trusted third-party organisation should also incorporate all personal data protection guarantees, and in particular inform the person concerned, in simple terms and adapted to each audience, of the risks and rights associated with the processing of his or her data.

Amendment  22

 

Proposal for a regulation

Recital 19

 

Text proposed by the Commission

Amendment

(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures to assess and mitigate that risk. The providers should make that assessment in a diligent manner, making efforts that are reasonable under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size, and cooperating with the providers of the services offered through the software application where possible.

(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores considered as gatekeepers under Regulation (EU) 2022/1925 should be made subject to obligations to take certain reasonable measures to assess and mitigate that risk, specifically preventing children from accessing the software applications in relation to which the provider of the software application has explicitly stated that it does not permit its use by children or when it has an age rating model in place. The providers should make that assessment in a diligent manner, making efforts that are reasonable under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size, and cooperating with the providers of the services offered through the software application where possible.

Amendment  23

 

Proposal for a regulation

Recital 20

 

Text proposed by the Commission

Amendment

(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.

(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when the provider refuses to cooperate by putting in place the mitigating measures aimed at limiting the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request, as a measure of last resort, the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available number-independent interpersonal communications services, it should only be possible to address detection orders to providers of such services. As a matter of principle, detection orders should be addressed to the service provider acting as a controller. However, in some circumstances, determining whether a service provider has the role of controller or processor can prove particularly challenging or addressing the controller may be detrimental to an ongoing investigation. Consequently, as an exception, it should be possible to address a detection order directly to the service provider that stores or otherwise processes the data.

Amendment  24

 

Proposal for a regulation

Recital 21

 

Text proposed by the Commission

Amendment

(21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should vary so as to account of the different characteristics of the various types of online child sexual abuse at stake and of the different characteristics of the services used to engage in such abuse, as well as the related different degree of intrusiveness of the measures to be taken to execute the detection order.

(21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued by a judicial authority and only after a diligent and objective assessment leading to the finding of reasonable grounds of suspicion of a link, at least an indirect one, between child sexual abuse material and the misuse of the service concerned by individual users, or a specific group of users, either as such or as subscribers to a specific channel of communication. Reasonable grounds are those resulting from any reliable and legally acquired information that suggests that individual users, or a specific group of users, either as such or as subscribers to a specific channel of communication, might have a link, even an indirect or remote one, with child sexual abuse material. A link with child sexual abuse material should be deemed to exist where, on the basis of objective evidence, there is a reasonable suspicion that such material will be detected in the use of a service by a user. Where a channel is operated specifically for the purpose of distributing child sexual abuse material, the subscribers to that channel should be considered linked to child sexual abuse material. Conduct which is legal according to Directive 2011/93/EU or national law transposing it should not be deemed a reasonable ground of suspicion. In order to conduct such an assessment, a continuous dialogue needs to be established between the Coordinating Authority and the provider. With a view to achieving that aim, it should be possible for the Coordinating Authority to request additional information from the EU Centre, the competent data protection authorities or any other public authority or entity.

Amendment  25

 

Proposal for a regulation

Recital 21 a (new)

 

Text proposed by the Commission

Amendment

 

(21a) The definition of child sexual abuse material provided in Article 2 has to be interpreted taking into account Directive 2011/93/EU. Therefore, personal communication between consenting peers, as well as between children over the age of sexual consent and their partners, is out of the scope of the definition insofar as those images do not involve any abuse or exploitation or any payment or remuneration for a pornographic performance and the images have not been disseminated without the consent of the parties involved. Likewise, images produced for medical or scientific purposes, strictly verifiable as such, should remain out of the scope of the definition of child sexual abuse material.

Amendment  26

 

Proposal for a regulation

Recital 22

 

Text proposed by the Commission

Amendment

(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority having objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.

(22) It should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority having objectively and diligently assessed, identified and weighted, on a case-by-case basis, the likelihood and seriousness of any potential negative consequences for other parties affected, including the users of the service. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.

Amendment  27

 

Proposal for a regulation

Recital 23

 

Text proposed by the Commission

Amendment

(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.

(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is limited in time so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to individual users, or a specific group of users, either as such or as subscribers to a specific channel of communication in respect of whom there are reasonable grounds of suspicion for a link, even an indirect one, with child sexual abuse material as defined in Article 2 as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.

Amendment  28

 

Proposal for a regulation

Recital 24

 

Text proposed by the Commission

Amendment

(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion well within that time period in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.

(24) The competent judicial authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so without undue delay, having regard to the important public policy objective at stake and the need to act swiftly to protect children. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion well within that time period in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.

Amendment  29

 

Proposal for a regulation

Recital 25

 

Text proposed by the Commission

Amendment

(25) Where new services are concerned, that is, services not previously offered in the Union, the evidence available on the potential misuse of the service in the last 12 months is normally non-existent. Taking this into account, and to ensure the effectiveness of this Regulation, the Coordinating Authority should be able to draw on evidence stemming from comparable services when assessing whether to request the issuance of a detection order in respect of such a new service. A service should be considered comparable where it provides a functional equivalent to the service in question, having regard to all relevant facts and circumstances, in particular its main characteristics and functionalities, the manner in which it is offered and used, the user base, the applicable terms and conditions and risk mitigation measures, as well as the overall remaining risk profile.

deleted

Amendment  30

 

Proposal for a regulation

Recital 26

 

Text proposed by the Commission

Amendment

(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

(26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users, while ensuring the effective detection of child sexual abuse material and the balance of all the fundamental rights at stake. In that regard, providers should ensure effective internal procedures and safeguards to prevent general monitoring. Detection orders should not apply to end-to-end encrypted communications.

Amendment  31

 

Proposal for a regulation

Recital 27

 

Text proposed by the Commission

Amendment

(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board should be consulted on those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board should be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation.

(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board must be consulted on the use of those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board should be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers should not be limited to operating the technologies made available by the EU Centre or by others but should always be allowed to use technologies that they have developed themselves, as long as they meet the requirements of this Regulation and other applicable Union law, such as Regulation (EU) 2016/679. Those technologies should be independently audited as regards their performance and reliability.

Amendment  32

 

Proposal for a regulation

Recital 27 a (new)

 

Text proposed by the Commission

Amendment

 

(27a) Since the Commission consultations to the EDPB regarding several aspects of this Regulation will entail more work for the EDPB, its budget and staffing should be adapted accordingly. The situation of national authorities, who likewise will be regularly consulted by service providers, should also reflect their increased responsibilities.

Amendment  33

 

Proposal for a regulation

Recital 28

 

Text proposed by the Commission

Amendment

(28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable, as well as to identify false positives and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.

(28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently accurate and reliable, as well as to identify false positives and false negatives and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure adequate human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and false positives generated by the technologies, based on an analysis of anonymised representative data samples. Providers should ensure that staff carrying out such tasks are adequately trained.

Amendment  34

 

Proposal for a regulation

Recital 29

 

Text proposed by the Commission

Amendment

(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Such awareness could, for example, be obtained through the execution of detection orders, information flagged by users or organisations acting in the public interest against child sexual abuse, or activities conducted on the providers’ own initiative. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them.

(29) Providers of hosting services and providers of publicly available number-independent interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, upon obtaining actual knowledge of potential online child sexual abuse on their services, they should act expeditiously to remove or to disable access to that content and to report it to the EU Centre in accordance with this Regulation. The removal or disabling of access should respect the fundamental rights of the recipients of the service, including the right to freedom of expression and of information.

 

In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Providers can obtain such actual knowledge of potential online child sexual abuse on their services, for example, through their own-initiative investigations, through the execution of detection orders, through notifications made by the Coordinating Authorities, as well as through information flagged by users, self-reported by victims or flagged by organisations, such as hotlines, acting in the public interest against child sexual abuse. To this end, it is important that providers, regardless of their size, have the obligation to put in place mechanisms that facilitate the flagging or notification of online child sexual abuse.

 

Those reports should contain a minimum of information, as specified in this Regulation, and providers should ensure the quality of the information submitted so that the EU Centre can conduct its assessment and competent law enforcement authorities can focus on reports that are most likely to lead to the recovery of a child, the arrest of an offender, or both.

Amendment  35

 

Proposal for a regulation

Recital 30

 

Text proposed by the Commission

Amendment

(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.

(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection and in order to stop or limit its dissemination, Coordinating Authorities of establishment should have the power to request competent judicial authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should, without undue delay, inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions, established for a limited time period, needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences. As a matter of principle, removal orders should be addressed to the service provider acting as a controller. However, in some circumstances, determining whether a service provider has the role of controller or processor can prove particularly challenging or addressing the controller could be detrimental to an ongoing investigation. Consequently, by way of derogation, it should be possible to address a removal order directly to the service provider that stores or otherwise processes the data.

Amendment  36

 

Proposal for a regulation

Recital 31

 

Text proposed by the Commission

Amendment

(31) The rules of this Regulation should not be understood as affecting the requirements regarding removal orders set out in Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].

(31) The rules of this Regulation should not be understood as affecting the requirements regarding removal orders set out in Regulation (EU) 2022/2065.

Amendment  37

 

Proposal for a regulation

Recital 34

 

Text proposed by the Commission

Amendment

(34) Considering that acquiring, possessing, knowingly obtaining access and transmitting child sexual abuse material constitute criminal offences under Directive 2011/93/EU, it is necessary to exempt providers of relevant information society services from criminal liability when they are involved in such activities, insofar as their activities remain strictly limited to what is needed for the purpose of complying with their obligations under this Regulation and they act in good faith.

(34) Considering that acquiring, possessing, knowingly obtaining access and transmitting child sexual abuse material constitute criminal offences under Directive 2011/93/EU, it is necessary to exempt providers of relevant information society services from criminal liability when they are involved in such activities, including when carrying out voluntary own-initiative investigations, or taking other measures, insofar as their activities remain strictly limited to what is needed for the purpose of complying with their obligations under Union law, including this Regulation, and they act in good faith and in a diligent manner.

Amendment  38

 

Proposal for a regulation

Recital 35

 

Text proposed by the Commission

Amendment

(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation.

(35) Each act of dissemination of child sexual abuse material, including the non-consensual dissemination of self-generated material, is a criminal offence that affects the rights of the victims depicted, of whom the vast majority are girls. Repeated dissemination of child sexual abuse material constitutes a form of revictimisation which could cause long-lasting negative consequences for the victim, and may reach extreme levels in cases of so-called ‘highly traded’ material. Victims or their parents and guardians or legal representatives acting on their behalf should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available number-independent interpersonal communications services in accordance with this Regulation. In dealing with such requests in cases of highly traded child sexual abuse material, particular care should be taken by the EU Centre and Coordinating Authorities to ensure the safeguarding of the victims concerned. For that purpose, staff dealing with such cases should be specifically trained to interact with victims of serious abuse.

 

This information should be provided, within a reasonable period of time, in the language indicated by the victim, in a confidential, age-appropriate, accessible, understandable and gender-sensitive manner, and tailored to the specific vulnerabilities of the victims, such as a disability. It should also include information regarding victims’ rights, support and assistance.

Amendment  39

 

Proposal for a regulation

Recital 36

 

Text proposed by the Commission

Amendment

(36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.

(36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims or their parents and guardians or legal representatives who request the removal or disabling of access to the material in question in a timely manner, in order to minimise the impact that such offences have on the physical and mental health of the victim. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by, and receive adequate support from, specifically trained staff of the EU Centre in this regard, via the Coordinating Authorities.

Amendment  40

 

Proposal for a regulation

Recital 37

 

Text proposed by the Commission

Amendment

(37) To ensure the efficient management of such victim support functions, victims should be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre.

(37) To ensure the efficient management of such victim support functions, victims should be informed about the existence of such functions and be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre.

Amendment  41

 

Proposal for a regulation

Recital 38 a (new)

 

Text proposed by the Commission

Amendment

 

(38a) The Union budget should provide complementary funding to ensure a high level of support and protection for victims, including through sufficient resources in dedicated funding programmes, and through the promotion of innovative solutions to improve the quality and accessibility of the needed services. The relevant programmes under the next Multiannual Financial Framework should contain sufficient financial and human resources to ensure an adequate Union contribution to the proper implementation of this Regulation.

Amendment  42

 

Proposal for a regulation

Recital 40

 

Text proposed by the Commission

Amendment

(40) In order to facilitate smooth and efficient communications by electronic means, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of relevant information society services should be required to designate a single point of contact and to publish relevant information relating to that point of contact, including the languages to be used in such communications. In contrast to the provider’s legal representative, the point of contact should serve operational purposes and should not be required to have a physical location. Suitable conditions should be set in relation to the languages of communication to be specified, so as to ensure that smooth communication is not unreasonably complicated. For providers subject to the obligation to establish a compliance function and nominate compliance officers in accordance with Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC], one of these compliance officers may be designated as the point of contact under this Regulation, in order to facilitate coherent implementation of the obligations arising from both frameworks.

(40) In order to facilitate smooth and efficient communications by electronic means, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of relevant information society services should be required to designate a single point of contact and to publish relevant information relating to that point of contact, including the languages to be used in such communications. In contrast to the provider’s legal representative, the point of contact should serve operational purposes and should not be required to have a physical location. Suitable conditions should be set in relation to the languages of communication to be specified, so as to ensure that smooth communication is not unreasonably complicated. For providers subject to the obligation to establish a compliance function and nominate compliance officers in accordance with Regulation (EU) 2022/2065, one of these compliance officers may be designated as the point of contact under this Regulation, in order to facilitate coherent implementation of the obligations arising from both frameworks.

Amendment  43

 

Proposal for a regulation

Recital 42

 

Text proposed by the Commission

Amendment

(42) Where relevant and convenient, subject to the choice of the provider of relevant information society services and the need to meet the applicable legal requirements in this respect, it should be possible for those providers to designate a single point of contact and a single legal representative for the purposes of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and this Regulation.

(42) Where relevant and convenient, subject to the choice of the provider of relevant information society services and the need to meet the applicable legal requirements in this respect, it should be possible for those providers to designate a single point of contact and a single legal representative for the purposes of Regulation (EU) 2022/2065, and this Regulation.

Amendment  44

 

Proposal for a regulation

Recital 44

 

Text proposed by the Commission

Amendment

(44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation, without prejudice to the enforcement powers of other national authorities.

(44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation, including issues related to prevention and combating child sexual abuse and assistance to victims, without prejudice to the enforcement powers of other national authorities.

Amendment  45

 

Proposal for a regulation

Recital 47

 

Text proposed by the Commission

Amendment

(47) The Coordinating Authority, as well as other competent authorities, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities have not only the necessary investigatory and enforcement powers, but also the necessary financial, human, technological and other resources to adequately carry out their tasks under this Regulation. In particular, given the variety of providers of relevant information society services and their use of advanced technology in offering their services, it is essential that the Coordinating Authority, as well as other competent authorities, are equipped with the necessary number of staff, including experts with specialised skills. The resources of Coordinating Authorities should be determined taking into account the size, complexity and potential societal impact of the providers of relevant information society services under the jurisdiction of the designating Member State, as well as the reach of their services across the Union.

(47) The Coordinating Authority, as well as other competent authorities, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities have not only the necessary investigatory and enforcement powers, but also all necessary resources, including sufficient financial, human, technological and other resources to adequately carry out their tasks under this Regulation. In particular, given the variety of providers of relevant information society services and their use of advanced technology in offering their services, it is essential that the Coordinating Authority, as well as other competent authorities, are equipped with the necessary number of staff, including experts with specialised skills. The resources of Coordinating Authorities should be determined taking into account the size, complexity and potential societal impact of the providers of relevant information society services under the jurisdiction of the designating Member State, as well as the reach of their services across the Union.

Amendment  46

 

Proposal for a regulation

Recital 48

 

Text proposed by the Commission

Amendment

(48) Given the need to ensure the effectiveness of the obligations imposed, Coordinating Authorities should be granted enforcement powers to address infringements of this Regulation. These powers should include the power to temporarily restrict access of users of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider on which the infringement takes place. In light of the high level of interference with the rights of the service providers that such a power entails, the latter should only be exercised when certain conditions are met. Those conditions should include the condition that the infringement results in the regular and structural facilitation of child sexual abuse offences, which should be understood as referring to a situation in which it is apparent from all available evidence that such facilitation has occurred on a large scale and over an extended period of time.

(48) Given the need to ensure the effectiveness of the obligations imposed, Coordinating Authorities should be granted enforcement powers to address infringements of this Regulation. These powers should include the power to request the competent judicial authority of the Member State that designated them to temporarily restrict access of users of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider on which the infringement takes place. In light of the high level of interference with the rights of the users and of the service providers that such a power entails, the latter should only be exercised when certain conditions are met. Those conditions should include the condition that the infringement results in the regular and structural facilitation of child sexual abuse offences, which should be understood as referring to a situation in which it is apparent from all available evidence that such facilitation has occurred on a large scale and over an extended period of time.

Amendment  47

 

Proposal for a regulation

Recital 49

 

Text proposed by the Commission

Amendment

(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.

(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.

Amendment  48

 

Proposal for a regulation

Recital 50

 

Text proposed by the Commission

Amendment

(50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to afford them an opportunity to take expeditious action to remove or disable access on a voluntary basis, Coordinating Authorities of establishment should be able to notify those providers of the presence of known child sexual abuse material on their services and requesting removal or disabling of access thereof, for the providers’ voluntary consideration. Such notifying activities should be clearly distinguished from the Coordinating Authorities’ powers under this Regulation to request the issuance of removal orders, which impose on the provider concerned a binding legal obligation to remove or disable access to the material in question within a set time period.

(50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to afford them an opportunity to take expeditious action to remove or disable access, Coordinating Authorities of establishment should be able to notify those providers of the presence of known child sexual abuse material on their services and to request the removal or disabling of access thereof. Such notifying activities should be clearly distinguished from the Coordinating Authorities’ powers under this Regulation to request the competent judicial authority of the Member State that designated them to issue removal orders.

Amendment  49

 

Proposal for a regulation

Recital 53

 

Text proposed by the Commission

Amendment

(53) Member States should ensure that for infringements of the obligations laid down in this Regulation there are penalties that are effective, proportionate and dissuasive, taking into account elements such as the nature, gravity, recurrence and duration of the infringement, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the provider of relevant information society services concerned.

(53) Member States should ensure that for infringements of the obligations laid down in this Regulation there are penalties, which can be of an administrative or criminal nature, as well as, where appropriate, fining guidelines, that are effective, proportionate and dissuasive, taking into account elements such as the nature, gravity, recurrence and duration of the infringement, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the provider of relevant information society services concerned. Particularly severe penalties should be imposed in the event that the providers of relevant information society services concerned systematically or persistently fail to comply with the obligations set out in this Regulation. Member States should ensure that those penalties do not encourage the over-reporting or the removal of material which does not constitute child sexual abuse material.

Amendment  50

 

Proposal for a regulation

Recital 55

 

Text proposed by the Commission

Amendment

(55) It is essential for the proper functioning of the system of mandatory detection and blocking of online child sexual abuse set up by this Regulation that the EU Centre receives, via the Coordinating Authorities, material identified as constituting child sexual abuse material or transcripts of conversations identified as constituting the solicitation of children, such as may have been found for example during criminal investigations, so that that material or conversations can serve as an accurate and reliable basis for the EU Centre to generate indicators of such abuses. In order to achieve that result, the identification should be made after a diligent assessment, conducted in the context of a procedure that guarantees a fair and objective outcome, either by the Coordinating Authorities themselves or by a court or another independent administrative authority than the Coordinating Authority. Whilst the swift assessment, identification and submission of such material is important also in other contexts, it is crucial in connection to new child sexual abuse material and the solicitation of children reported under this Regulation, considering that this material can lead to the identification of ongoing or imminent abuse and the rescuing of victims. Therefore, specific time limits should be set in connection to such reporting.

(55) It is essential for the proper functioning of the system of mandatory detection and blocking of online child sexual abuse set up by this Regulation that the EU Centre receives, via the Coordinating Authorities, material identified as constituting child sexual abuse material or transcripts of conversations identified as constituting the solicitation of children, such as may have been found for example during criminal investigations, so that that material or conversations can serve as an accurate and reliable basis for the EU Centre to generate indicators of such abuses. In order to achieve that result, the identification should be made after a diligent assessment, conducted in the context of a procedure that guarantees a fair and objective outcome, either by the Coordinating Authorities themselves or by a court or an independent administrative authority other than the Coordinating Authority, which must be subject to judicial validation. Whilst the swift assessment, identification and submission of such material is important also in other contexts, it is crucial in connection to new child sexual abuse material and the solicitation of children reported under this Regulation, considering that this material can lead to the identification of ongoing or imminent abuse and the rescuing of victims. Therefore, specific time limits should be set in connection to such reporting.

Amendment  51

 

Proposal for a regulation

Recital 58

 

Text proposed by the Commission

Amendment

(58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’) and national authorities to build on existing systems and best practices, where relevant.

(58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary secure information-sharing systems, such as, once available, the software provided by eu-LISA pursuant to Regulation (EU) 2023/9691a. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’) and national authorities to build on existing systems and best practices, where relevant.

 

__________________

 

1a Regulation (EU) 2023/969 establishing a collaboration platform to support the functioning of joint investigation teams and amending Regulation (EU) 2018/1726

Amendment  52

 

Proposal for a regulation

Recital 59

 

Text proposed by the Commission

Amendment

(59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities, as well as a governance structure ensuring the effective, efficient and coherent performance of its different tasks, and legal personality to be able to interact effectively with all relevant stakeholders. Therefore, it should be established as a decentralised Union agency.

(59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities, a governance structure ensuring the effective, efficient and coherent performance of its different tasks, legal personality to be able to interact effectively with all relevant stakeholders and an autonomous budget. Therefore, it should be established as a decentralised Union agency, and provided with the necessary human and financial resources to fulfil the objectives, tasks and responsibilities assigned to it under this Regulation, including expenditure related to the making available of technologies and the costs related to the analysis of data samples undertaken for micro, small and medium enterprises. It should be mainly financed by a contribution from the general budget of the Union, with the necessary appropriations drawn exclusively from unallocated margins under the relevant heading of the Multiannual Financial Framework and/or through the mobilisation of the relevant special instruments. In order to ensure that the Agency can respond flexibly to human resource needs, it is in particular appropriate that it has autonomy regarding the recruitment of contract agents.

Amendment  53

 

Proposal for a regulation

Recital 59 a (new)

 

Text proposed by the Commission

Amendment

 

(59a) Taking into consideration the central role of the EU Centre in the implementation of this Regulation and in view of the date of expiry of the interim Regulation on 3 August 2024, the EU Centre’s activities should start as soon as possible. The Commission should allocate an adequate level of resources for the quick establishment and initial operation of the EU Centre and provide commensurate assistance, including by seconding staff, to help the EU Centre reach cruising speed in due time and no later than three years after the adoption of this Regulation.

Amendment  54

 

Proposal for a regulation

Recital 59 b (new)

 

Text proposed by the Commission

Amendment

 

(59b) The arrangements concerning the seat of the EU Centre should be laid down in a headquarters agreement between the EU Centre and the host Member State. The headquarters agreement should stipulate the conditions of establishment of the seat and the advantages conferred by the Member State on the EU Centre and its staff. In line with point 9 of the Common Approach of 19 July 2012 on the location of the seats of decentralised agencies, the EU Centre should conclude a headquarters agreement with the host Member State in a timely manner before it starts its operational phase. In light of the case-law of the Court of Justice, the choice of the location of the seat should be made in accordance with the ordinary legislative procedure and should comply with the criteria laid down in this Regulation.

Amendment  55

 

Proposal for a regulation

Recital 59 c (new)

 

Text proposed by the Commission

Amendment

 

(59c) The selection procedure for the location of the seat of the EU Centre should respect the following steps: (i) Parliament’s mandate for the interinstitutional negotiations would provide criteria for the selection of the host city; (ii) Parliament would negotiate those criteria with the Council; (iii) those criteria would constitute the basis for an interinstitutional call for applications made jointly by Parliament and the Council; (iv) the candidates would be invited to joint hearings before Parliament and the Council; (v) Parliament’s negotiating team would draw up a short-list of candidates; and (vi) that short-list would be negotiated against the Council’s short-list before an agreement among the co-legislators on the host city is reached and before the plenary approves the outcome of the interinstitutional negotiations.

Amendment  56

 

Proposal for a regulation

Recital 60

 

Text proposed by the Commission

Amendment

(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge and expertise related to online child sexual abuse.

(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available number-independent interpersonal communications services and providers of internet access services. The EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the proactive conduct, on its own initiative, of searches of publicly accessible content on hosting services for known child sexual abuse material. The EU Centre should facilitate the generation and sharing of knowledge, best practices and expertise related to online child sexual abuse, supporting the development of awareness-raising and prevention campaigns, educational and intervention programmes, tools and materials in order to increase digital skills, while integrating a child rights perspective and ensuring a gender-sensitive and age-appropriate approach. The EU Centre should promote and ensure appropriate support and assistance to victims.

Amendment  57

 

Proposal for a regulation

Recital 61

 

Text proposed by the Commission

Amendment

(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.

(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable hashes and indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.

Amendment  58

 

Proposal for a regulation

Recital 62

 

Text proposed by the Commission

Amendment

(62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each of those three types of online child sexual abuse, and with maintaining and operating those databases. For accountability purposes and to allow for corrections where needed, it should keep records of the submissions and the process used for the generation of the indicators.

(62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for known child sexual abuse material, new child sexual abuse material and the solicitation of children, and with maintaining, keeping up to date and operating those databases. For accountability purposes and to allow for corrections where needed, it should keep records of the submissions and the process used for the generation of the indicators.

Amendment  59

 

Proposal for a regulation

Recital 63

 

Text proposed by the Commission

Amendment

(63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same victim or victims.

(63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available number-independent interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same victim or victims.

Amendment  60

 

Proposal for a regulation

Recital 64

 

Text proposed by the Commission

Amendment

(64) Given the sensitivity of the data concerned and with a view to avoiding any errors and possible misuse, it is necessary to lay down strict rules on the access to those databases of indicators and databases of reports, on the data contained therein and on their security. In particular, the data concerned should not be stored for longer than is strictly necessary. For the above reasons, access to the database of indicators should be given only to the parties and for the purposes specified in this Regulation, subject to the controls by the EU Centre, and be limited in time and in scope to what is strictly necessary for those purposes.

(64) Given the sensitivity of the data concerned and with a view to avoiding any errors and possible misuse, it is necessary to lay down strict rules on the access to those databases of indicators and databases of reports, on the data contained therein and on their security. In particular, the data concerned should not be stored for longer than is strictly necessary. For the above reasons, access to the database of indicators should be given only upon request to the parties and for the purposes specified in this Regulation, subject to the controls by the EU Centre, and be limited in time and in scope to what is strictly necessary for those purposes.

Amendment  61

 

Proposal for a regulation

Recital 65

 

Text proposed by the Commission

Amendment

(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should assess those reports in order to identify those that are manifestly unfounded, that is, where it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse. Where the report is manifestly unfounded, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. As every report could be an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse, reports should be processed as quickly as possible.

(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre and should be thoroughly assessed in a timely manner, so as to ensure that a decision on the criminal relevance of the reported material is made as early as possible and to limit the retention of irrelevant data as far as possible. A report should be considered unfounded where it is evident that the reported activities do not constitute online child sexual abuse. In those cases, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available number-independent interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. Where the EU Centre considers that a report is not unfounded, it should forward the report to the competent law enforcement authority or authorities of the Member State likely to have jurisdiction to investigate or prosecute the potential child sexual abuse to which the report relates, or to Europol where that competent law enforcement authority or those competent law enforcement authorities cannot be determined with sufficient certainty. Even where the competent national law enforcement authority has been identified, the EU Centre should forward all reports that are not unfounded to Europol in accordance with Union law. As what constitutes an actionable report may differ from one Member State to another, owing to differences in national legislation, every report could serve as an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse.

Amendment  62

 

Proposal for a regulation

Recital 66

 

Text proposed by the Commission

Amendment

(66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis.

(66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, as soon as possible, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis. The EU Centre should be able to proactively, on its own initiative, analyse publicly accessible content for known child sexual abuse and to follow publicly accessible uniform resource locators.

Amendment  63

 

Proposal for a regulation

Recital 67

 

Text proposed by the Commission

Amendment

(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.

(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, for best practices, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. Where the EU Centre makes technologies available for providers of hosting services and providers of number-independent interpersonal communications services to install and operate in order to execute detection orders, it should also make publicly available relevant information, such as the detailed licensing conditions, including licensing fees, under which the EU Centre is permitted, or has obtained permission, to make such technologies available. Such information should cover all details regarding the procurement of such technologies, as well as their development over time, where relevant.

Amendment  64

 

Proposal for a regulation

Recital 68

 

Text proposed by the Commission

Amendment

(68) Processing and storing certain personal data is necessary for the performance of the EU Centre’s tasks under this Regulation. In order to ensure that such personal data is adequately protected, the EU Centre should only process and store personal data if strictly necessary for the purposes detailed in this Regulation. It should do so in a secure manner and limit storage to what is strictly necessary for the performance of the relevant tasks.

(68) Processing and storing certain personal data is necessary for the performance of the EU Centre’s tasks under this Regulation. In order to ensure that such personal data is adequately protected, the EU Centre should only process and store personal data if strictly necessary for the purposes detailed in this Regulation. It should do so in a secure and supervised manner and limit storage to what is strictly necessary for the performance of the relevant tasks.

Amendment  65

 

Proposal for a regulation

Recital 69

 

Text proposed by the Commission

Amendment

(69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities, Europol and relevant partner organisations, such as the US National Centre for Missing and Exploited Children or the International Association of Internet Hotlines (‘INHOPE’) network of hotlines for reporting child sexual abuse material, within the limits set by this Regulation and other legal instruments regulating their respective activities. To facilitate such cooperation, the necessary arrangements should be made, including the designation of contact officers by Coordinating Authorities and the conclusion of memoranda of understanding with Europol and, where appropriate, with one or more of the relevant partner organisations.

(69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities, Europol and relevant partner organisations, such as the US National Centre for Missing and Exploited Children or the International Association of Internet Hotlines (‘INHOPE’) network of hotlines for reporting child sexual abuse material, within the limits set by this Regulation and other legal instruments regulating their respective activities. To facilitate such cooperation, the necessary arrangements should be made, including the designation of contact officers by Coordinating Authorities and the conclusion of publicly accessible memoranda of understanding with Europol and, where appropriate, with one or more of the relevant partner organisations.

Amendment  66

 

Proposal for a regulation

Recital 70

 

Text proposed by the Commission

Amendment

(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.

(70) Hotlines play a very important role in the fight against child sexual abuse online, namely with regard to the reporting, detection and rapid removal of child sexual abuse material. Helplines are also essential in providing support for children in need. Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage them to cooperate and coordinate effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.

Amendment  67

 

Proposal for a regulation

Recital 72

 

Text proposed by the Commission

Amendment

(72) Considering the need for the EU Centre to cooperate intensively with Europol, the EU Centre’s headquarters should be located alongside Europol’s, which is located in The Hague, the Netherlands. The highly sensitive nature of the reports shared with Europol by the EU Centre and the technical requirements, such as on secure data connections, both benefit from a shared location between the EU Centre and Europol. It would also allow the EU Centre, while being an independent entity, to rely on the support services of Europol, notably those regarding human resources management, information technology (IT), including cybersecurity, the building and communications. Sharing such support services is more cost-efficient and ensures a more professional service than duplicating them by creating them anew.

deleted

Amendment  68

 

Proposal for a regulation

Recital 74

 

Text proposed by the Commission

Amendment

(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.

(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection and prevention of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards, data protection and safeguards in detection technology.

Amendment  69

 

Proposal for a regulation

Recital 74 a (new)

 

Text proposed by the Commission

Amendment

 

(74a) One of the pillars of this Regulation is the assistance and support of victims and survivors of child sexual abuse. In order to better understand and address victims’ individual needs, it is essential to create a forum where victims’ organisations are heard and the EU Centre can learn from their experience, expertise and knowledge. The Victims’ Rights and Survivors Consultative Forum should play a key role in advising the EU Centre in its approach to all victim-related issues. Its members should be appointed mainly from among victims or their parents, guardians or legal representatives, as well as from representatives of organisations acting in the public interest against child sexual abuse and promoting victims’ and survivors’ rights, but could also include members from other organisations, such as organisations promoting the rights of children belonging to vulnerable groups and organisations promoting children’s rights, including children’s digital rights.

Amendment  70

 

Proposal for a regulation

Recital 75

 

Text proposed by the Commission

Amendment

(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.

(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available number-independent interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse gender- and age-disaggregated data and information, based on anonymised gathering of non-personal data, and to publish in a machine-readable format annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.

Amendment  71

 

Proposal for a regulation

Recital 78

 

Text proposed by the Commission

Amendment

(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides for a temporary solution in respect of the use of technologies by certain providers of publicly available interpersonal communications services for the purpose of combating online child sexual abuse, pending the preparation and adoption of a long-term legal framework. This Regulation provides that long-term legal framework. Regulation (EU) 2021/1232 should therefore be repealed.

(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides for a temporary solution in respect of the voluntary use of technologies by certain providers of publicly available interpersonal communications services for the purpose of combating online child sexual abuse. This Regulation, which provides for a clear and uniform long-term legal framework and establishes a mandatory regime for certain providers, should replace that temporary and voluntary one. However, until the date of effective application of this Regulation, and in order to ensure that online child sexual abuse can be effectively and lawfully combated without interruption and that there is a smooth transition between the voluntary and the mandatory regime, Regulation (EU) 2021/1232 should continue to apply for a limited period of nine months after the entry into force of this Regulation.

__________________

__________________

45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).

45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).

Amendment  72

 

Proposal for a regulation

Recital 82

 

Text proposed by the Commission

Amendment

(82) In order to allow all affected parties sufficient time to take the necessary measures to comply with this Regulation, provision should be made for an appropriate time period between the date of its entry into force and that of its application.

(82) In order to allow all affected parties sufficient time to take the necessary measures to comply with this Regulation, and in particular the establishment of the EU Centre, provision should be made for an appropriate time period between the date of its entry into force and that of its application.

Amendment  73

 

Proposal for a regulation

Recital 84

 

Text proposed by the Commission

Amendment

(84) The European Data Protection Supervisor and the European Data Protection Board were consulted in accordance with Article 42(2) of Regulation (EU) 2018/1725 of the European Parliament and of the Council48 and delivered their opinion on […].

(84) The European Data Protection Supervisor and the European Data Protection Board were consulted in accordance with Article 42(2) of Regulation (EU) 2018/1725 of the European Parliament and of the Council48 and delivered their joint opinion on 28 July 2022.

__________________

__________________

48 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No 1247/2002/EC (OJ L 295, 21.11.2018, p. 39).

48 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No 1247/2002/EC (OJ L 295, 21.11.2018, p. 39).

Amendment  74

 

Proposal for a regulation

Article 1 – paragraph 1 – subparagraph 1

 

Text proposed by the Commission

Amendment

This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market.

This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter are effectively protected.

Amendment  75

 

Proposal for a regulation

Article 1 – paragraph 1 – subparagraph 2 – point b

 

Text proposed by the Commission

Amendment

(b) obligations on providers of hosting services and providers of interpersonal communication services to detect and report online child sexual abuse;

(b) obligations on providers of hosting services and providers of number-independent interpersonal communication services to detect and report online child sexual abuse;

Amendment  76

 

Proposal for a regulation

Article 1 – paragraph 1 – subparagraph 2 – point d a (new)

 

Text proposed by the Commission

Amendment

 

(da) obligations on providers of online games; and

Amendment  77

 

Proposal for a regulation

Article 1 – paragraph 1 – subparagraph 2 – point e

 

Text proposed by the Commission

Amendment

(e) rules on the implementation and enforcement of this Regulation, including as regards the designation and functioning of the competent authorities of the Member States, the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’) and cooperation and transparency.

(e) rules on the implementation and enforcement of this Regulation, including as regards the designation and functioning of the competent authorities of the Member States;

Amendment  78

 

Proposal for a regulation

Article 1 – paragraph 1 – subparagraph 2 – point e a (new)

 

Text proposed by the Commission

Amendment

 

(ea) rules on the establishment, functioning, cooperation, transparency and powers of the EU Centre for Child Protection established in Article 40 (‘EU Centre’);

Amendment  79

 

Proposal for a regulation

Article 1 – paragraph 2 a (new)

 

Text proposed by the Commission

Amendment

 

2a. This Regulation shall not apply to audio communications.

Amendment  80

 

Proposal for a regulation

Article 1 – paragraph 3 – point b

 

Text proposed by the Commission

Amendment

(b) Directive 2000/31/EC and Regulation (EU) /… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(b) Directive 2000/31/EC and Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC;

Amendment  81

 

Proposal for a regulation

Article 1 – paragraph 3 – point d a (new)

 

Text proposed by the Commission

Amendment

 

(da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972 and repealing Directive (EU) 2016/1148 (NIS 2 Directive); and

Amendment  82

 

Proposal for a regulation

Article 1 – paragraph 3 – point d b (new)

 

Text proposed by the Commission

Amendment

 

(db) Regulation (EU) .../... on Artificial Intelligence (Artificial Intelligence Act);

Amendment  83

 

Proposal for a regulation

Article 1 – paragraph 3 a (new)

 

Text proposed by the Commission

Amendment

 

3a. Nothing in this Regulation shall be interpreted as prohibiting, weakening or undermining end-to-end encryption. In particular, providers shall not be prohibited from offering end-to-end encrypted services.

Amendment  84

 

Proposal for a regulation

Article 1 – paragraph 3 b (new)

 

Text proposed by the Commission

Amendment

 

3b. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.

Amendment  85

 

Proposal for a regulation

Article 1 – paragraph 4

 

Text proposed by the Commission

Amendment

4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC insofar as necessary for the execution of the detection orders issued in accordance with Section 2 of Chapter 1 of this Regulation.

4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling relevant information society services to use specific technologies for the processing of personal and other data to the extent strictly necessary to detect and report online child sexual abuse and remove child sexual abuse material from their services for the execution of the detection orders issued in accordance with Section 2 of Chapter 1 of this Regulation.

Amendment  86

 

Proposal for a regulation

Article 2 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) ‘hosting service’ means an information society service as defined in Article 2, point (f), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(a) ‘hosting service’ means an information society service as defined in Article 3, point (g), third indent, of Regulation (EU) 2022/2065;

Amendment  87

 

Proposal for a regulation

Article 2 – paragraph 1 – point b a (new)

 

Text proposed by the Commission

Amendment

 

(ba) ‘number-independent interpersonal communications service’ means an interpersonal communications service as defined in Article 2, point (7), of Directive (EU) 2018/1972;

Amendment  88

 

Proposal for a regulation

Article 2 – paragraph 1 – point b b (new)

 

Text proposed by the Commission

Amendment

 

(bb) ‘number-independent interpersonal communications service within games’ means any service as defined in Article 2, point (7), of Directive (EU) 2018/1972 which is part of a game;

Amendment  89

 

Proposal for a regulation

Article 2 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) ‘software application’ means a digital product or service as defined in Article 2, point 13, of Regulation (EU) …/… [on contestable and fair markets in the digital sector (Digital Markets Act)];

(c) ‘software application’ means a digital product or service as defined in Article 2, point (15), of Regulation (EU) 2022/1925;

Amendment  90

 

Proposal for a regulation

Article 2 – paragraph 1 – point d

 

Text proposed by the Commission

Amendment

(d) ‘software application store’ means a service as defined in Article 2, point 12, of Regulation (EU) …/… [on contestable and fair markets in the digital sector (Digital Markets Act)];

(d) ‘software application store’ means a service as defined in Article 2, point (14), of Regulation (EU) 2022/1925;

Amendment  91

 

Proposal for a regulation

Article 2 – paragraph 1 – point f – point ii

 

Text proposed by the Commission

Amendment

(ii) an interpersonal communications service;

(ii) a number-independent interpersonal communications service;

Amendment  92

 

Proposal for a regulation

Article 2 – paragraph 1 – point f – point iv a (new)

 

Text proposed by the Commission

Amendment

 

(iva) a number-independent interpersonal communication service within online games.

Amendment  93

 

Proposal for a regulation

Article 2 – paragraph 1 – point g

 

Text proposed by the Commission

Amendment

(g) ‘to offer services in the Union’ means to offer services in the Union as defined in Article 2, point (d), of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(g) ‘to offer services in the Union’ means to offer services in the Union as defined in Article 3, point (d), of Regulation (EU) 2022/2065;

Amendment  94

 

Proposal for a regulation

Article 2 – paragraph 1 – point j

 

Text proposed by the Commission

Amendment

(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years;

deleted

Amendment  95

 

Proposal for a regulation

Article 2 – paragraph 1 – point m

 

Text proposed by the Commission

Amendment

(m) ‘known child sexual abuse material’ means potential child sexual abuse material detected using the indicators contained in the database of indicators referred to in Article 44(1), point (a);

(m) ‘known child sexual abuse material’ means child sexual abuse material detected using the indicators contained in the database of indicators referred to in Article 44(1), point (a);

Amendment  96

 

Proposal for a regulation

Article 2 – paragraph 1 – point q a (new)

 

Text proposed by the Commission

Amendment

 

(qa) ‘victim’ means a person who, while under the age of 18, suffered child sexual abuse offences and/or whose child sexual abuse material is hosted or disseminated in the Union;

Amendment  97

 

Proposal for a regulation

Article 2 – paragraph 1 – point r

 

Text proposed by the Commission

Amendment

(r) ‘recommender system’ means the system as defined in Article 2, point (o), of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(r) ‘recommender system’ means the system as defined in Article 2, point (o), of Regulation (EU) 2022/2065;

Amendment  98

 

Proposal for a regulation

Article 2 – paragraph 1 – point s

 

Text proposed by the Commission

Amendment

(s) ‘content data’ means data as defined in Article 2, point 10, of Regulation (EU) … [on European Production and Preservation Orders for electronic evidence in criminal matters (…/… e-evidence Regulation)];

(s) ‘content data’ means texts, videos and images;

Amendment  99

 

Proposal for a regulation

Article 2 – paragraph 1 – point t

 

Text proposed by the Commission

Amendment

(t) ‘content moderation’ means the activities as defined in Article 2, point (p), of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(t) ‘content moderation’ means the activities as defined in Article 2, point (t), of Regulation (EU) 2022/2065;

Amendment  100

 

Proposal for a regulation

Article 2 – paragraph 1 – point v

 

Text proposed by the Commission

Amendment

(v) ‘terms and conditions’ means terms and conditions as defined in Article 2, point (q), of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];

(v) ‘terms and conditions’ means terms and conditions as defined in Article 2, point (u), of Regulation (EU) 2022/2065;

Amendment  101

 

Proposal for a regulation

Article 2 – paragraph 1 – point w a (new)

 

Text proposed by the Commission

Amendment

 

(wa) ‘hotline’ means an organisation officially recognised by its Member State of establishment that provides a mechanism, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged online child sexual abuse;

Amendment  102

 

Proposal for a regulation

Article 2 – paragraph 1 – point w b (new)

 

Text proposed by the Commission

Amendment

 

(wb) ‘help-line’ means an organisation, officially recognised by its Member State of establishment, that provides services for children in need;

Amendment  103

 

Proposal for a regulation

Article 3 – paragraph 1

 

Text proposed by the Commission

Amendment

1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse.

1. Providers of hosting services and providers of number-independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the significant risk stemming, inter alia, from the design, functioning and use of their services for the purpose of online child sexual abuse. That risk assessment shall be specific to the services they offer and proportionate to the risk, considering its severity and probability. To that end, providers subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 may draw on that risk assessment and complement it with a more specific assessment of the risks of the use of their services for the purpose of online child sexual abuse.

Amendment  104

 

Proposal for a regulation

Article 3 – paragraph 1 a (new)

 

Text proposed by the Commission

Amendment

 

1a. Providers which are not substantially exposed to online child sexual abuse and which are not very large online platforms pursuant to Article 33 of Regulation (EU) 2022/2065 are exempted from the obligations provided for in this Article and in Article 4.

 

A hosting service provider or a number-independent interpersonal communication service provider is substantially exposed to online child sexual abuse and therefore subject to the obligation to conduct a risk assessment in accordance with this Article:

 

(a) if it has received two removal orders in the previous 12 months;

 

(b) from the moment the provider becomes aware of any information indicating potential online child sexual abuse on its services and submits, in accordance with Article 12, a report to the EU Centre; or

 

(c) from the moment the provider is notified by the national competent authority or by the EU Centre, in accordance with Article 49, of the presence of one or more specific items of known child sexual abuse material on its services.

Amendment  105

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – introductory part

 

Text proposed by the Commission

Amendment

(b) the existence and implementation by the provider of a policy and the availability of functionalities to address the risk referred to in paragraph 1, including through the following:

(b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities and protocols to prevent and address the risk referred to in paragraph 1, including through the following:

Amendment  106

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 2

 

Text proposed by the Commission

Amendment

 measures taken to enforce such prohibitions and restrictions;

 measures taken to enforce such prohibitions and restrictions, and the amount of human and financial resources dedicated to addressing child sexual abuse material;

Amendment  107

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 2 a (new)

 

Text proposed by the Commission

Amendment

 

 information and awareness campaigns educating and warning users of the risk of online child sexual abuse;

Amendment  108

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 3 a (new)

 

Text proposed by the Commission

Amendment

 

 functionalities enabling meaningful and proportionate age-appropriate parental controls;

Amendment  109

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 3 b (new)

 

Text proposed by the Commission

Amendment

 

 functionalities, according to Article 12(3), enabling users to flag or notify potential online child sexual abuse to the provider;

Amendment  110

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 3 c (new)

 

Text proposed by the Commission

Amendment

 

 the capacity of the provider, having regard to the state of the art, to meaningfully deal with those reports and notifications in a timely manner;

Amendment  111

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 3 d (new)

 

Text proposed by the Commission

Amendment

 

 systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national hotlines, help-lines or national law enforcement;

Amendment  112

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 3 e (new)

 

Text proposed by the Commission

Amendment

 

 functionalities allowing the detection of suspicious links, including links coming from the darknet.

Amendment  113

 

Proposal for a regulation

Article 3 – paragraph 2 – point b – indent 4

 

Text proposed by the Commission

Amendment

 functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate;

deleted

Amendment  114

 

Proposal for a regulation

Article 3 – paragraph 2 – point d

 

Text proposed by the Commission

Amendment

(d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, and the impact thereof on that risk;

(d) the manner in which the provider designed and operates the service, including the design of their recommender systems and any relevant algorithmic systems, the business model, governance, type of users targeted and relevant systems and processes, and the impact thereof on that risk;

Amendment  115

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point i

 

Text proposed by the Commission

Amendment

(i) the extent to which the service is used or is likely to be used by children;

(i) the extent to which the service is used or is likely to be used by children and the extent to which the service is directly targeting children;

Amendment  116

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point ii

 

Text proposed by the Commission

Amendment

(ii) where the service is used by children, the different age groups of the child users and the risk of solicitation of children in relation to those age groups;

(ii) where the service is used or is likely to be used by children or directly targeting children, the different age groups or likely age groups of children and the risk of solicitation of children in relation to those age groups;

Amendment  117

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point iii – indent 1

 

Text proposed by the Commission

Amendment

 enabling users to search for other users and, in particular, for adult users to search for child users;

 enabling users to search for other users, including through search engines external to the service and, in particular, for adult users to search for children;

Amendment  118

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point iii – indent 2

 

Text proposed by the Commission

Amendment

 enabling users to establish contact with other users directly, in particular through private communications;

 enabling users to initiate unsolicited contact with other users, including children, directly, in particular through private communications;

Amendment  119

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point iii – indent 3

 

Text proposed by the Commission

Amendment

 enabling users to share images or videos with other users, in particular through private communications.

 enabling users to share unsolicited images or videos with other users, in particular through private communications;

Amendment  120

 

Proposal for a regulation

Article 3 – paragraph 2 – point e – point iii – indent 3 a (new)

 

Text proposed by the Commission

Amendment

 

 enabling users to indicate personal data in their usernames.

Amendment  121

 

Proposal for a regulation

Article 3 – paragraph 2 – point e a (new)

 

Text proposed by the Commission

Amendment

 

(ea) When carrying out a risk assessment, the provider may take into account any other functionality in accordance with the state of the art to address child sexual abuse.

Amendment  122

 

Proposal for a regulation

Article 3 – paragraph 3 – subparagraph 1

 

Text proposed by the Commission

Amendment

The provider may request the EU Centre to perform an analysis of representative, anonymized data samples to identify potential online child sexual abuse, to support the risk assessment.

The provider may request the EU Centre to perform an analysis of methodology for risk assessment, including, where appropriate, to perform a test on anonymized data samples made available to the EU Centre, to support the risk assessment.

Amendment  123

 

Proposal for a regulation

Article 3 – paragraph 3 – subparagraph 1 a (new)

 

Text proposed by the Commission

Amendment

 

Neither the request referred to in the first subparagraph, nor the subsequent analysis that the EU Centre may perform thereunder, shall exempt the provider from its obligation to conduct the risk assessment in accordance with paragraphs 1 and 2 of this Article and to comply with any other obligations set out in this Regulation.

Amendment  124

 

Proposal for a regulation

Article 3 – paragraph 3 – subparagraph 2

 

Text proposed by the Commission

Amendment

The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise, provided the request is reasonably necessary to support the risk assessment.

The costs incurred by the EU Centre for the support of the risk assessment shall be borne by the requesting provider. However, the EU Centre may bear those costs where the provider is a micro, small or medium-sized enterprise. The EU Centre may reject the request where it is not reasonably necessary to support the risk assessment or where it is incompatible with the available budgetary resources. The EU Centre shall provide this support in a timely manner.

Amendment  125

 

Proposal for a regulation

Article 3 – paragraph 5

 

Text proposed by the Commission

Amendment

5. The risk assessment shall include an assessment of any potential remaining risk that, after taking the mitigation measures pursuant to Article 4, the service is used for the purpose of online child sexual abuse.

deleted

Amendment  126

 

Proposal for a regulation

Article 3 – paragraph 6

 

Text proposed by the Commission

Amendment

6. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

6. The Commission, in cooperation with Coordinating Authorities and the EU Centre, and after having consulted the European Data Protection Board and having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

Amendment  127

 

Proposal for a regulation

Article 3 – paragraph 6 a (new)

 

Text proposed by the Commission

Amendment

 

6a. Providers that qualify as small and micro enterprises as defined in Commission Recommendation 2003/361/EC shall carry out a simplified risk assessment by [date of application of this Regulation + 6 months] or, where the provider did not offer the service in the Union by [date of application of this Regulation], by six months from the date at which the provider started offering the service in the Union.

 

The Commission shall be empowered to adopt delegated acts in accordance with Article 86 of this Regulation in order to provide practical support for micro and small enterprises in carrying out the simplified risk assessment.

Amendment  128

 

Proposal for a regulation

Article 3 – paragraph 6 b (new)

 

Text proposed by the Commission

Amendment

 

6b. Irrespective of their size or their substantial exposure to online child sexual abuse, providers of online games that operate number-independent interpersonal communications services within their games, platforms primarily used for the dissemination of pornographic content and providers offering services directly targeting children shall carry out a risk assessment in accordance with Article 3(1) to (4).

Amendment  129

 

Proposal for a regulation

Article 4 – paragraph 1 – introductory part

 

Text proposed by the Commission

Amendment

1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:

1. Providers of hosting services and providers of number-independent interpersonal communications services shall put in place reasonable, proportionate, targeted and effective mitigation measures, tailored to their specific services and the risk identified pursuant to Article 3. The decision as to the choice of mitigation measures shall remain with the provider. Such measures shall include some or all of the following:

Amendment  130

 

Proposal for a regulation

Article 4 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions;

(a) testing and adapting, through appropriate state-of-the-art technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, including the speed, quality and effectiveness of processing notices and reports of alleged online child sexual abuse and, where appropriate, the expeditious removal of the child sexual abuse material;

Amendment  131

 

Proposal for a regulation

Article 4 – paragraph 1 – point a a (new)

 

Text proposed by the Commission

Amendment

 

(aa) adapting the design, features and functions of their services in order to ensure the highest level of privacy, safety, and security by design and by default.

 

In particular, when the service is directly targeting children, providers shall include all of the following mitigation measures unless they are not technically feasible for the service:

 

i. limiting users, by default, to establish unsolicited contact with other users directly, in particular through private communications, by asking for user confirmation before allowing an unknown user to communicate and before displaying their communications;

 

ii. limiting users, by default, to share unsolicited content directly with other users, in particular through private communications;

 

iii. limiting users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via pattern-based matching;

 

iv. providing meaningful and proportionate age-appropriate user-device-based parental control tools which allow parents or guardians to exercise appropriate control over children while respecting the fundamental rights and the confidentiality of communications of the child;

 

v. encouraging children, prior to registering for the service, to consult their parents about how the service works and what parental control tools are available;

 

vi. providing readily accessible mechanisms for users to block or mute other users;

 

vii. providing human moderation of publicly accessible chats, based on random checks and on specific channels at high risk of online child sexual abuse;

 

viii. limiting users, by default, to create screenshots or recordings within the service;

 

ix. optionally or by default using purely on-device functionality under full user control, asking for user confirmation and offering guidance before displaying or sharing certain content such as nudity;

 

x. using purely on-device functionality under full user control, displaying warnings and advice to users at risk of offending or victimisation;

 

xi. ensuring that, by default, profiles on social networks are not publicly visible.

 

Services that are not directly targeting children below thirteen years of age and that take the measures outlined in this point may allow users to reverse such measures on an individual basis.

Amendment  132

 

Proposal for a regulation

Article 4 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .

(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, hotlines, helplines, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 22 of Regulation (EU) 2022/2065;

Amendment  133

 

Proposal for a regulation

Article 4 – paragraph 1 – point c a (new)

 

Text proposed by the Commission

Amendment

 

(ca) informing and reminding users and non-users, such as parents, about the risks related to the use of their services, the nature of the service and the functionalities offered, what constitutes online child sexual abuse and what is typical offender behaviour;

Amendment  134

 

Proposal for a regulation

Article 4 – paragraph 1 – point c b (new)

 

Text proposed by the Commission

Amendment

 

(cb) enabling users, in accordance with Article 12, to flag or notify potential online child sexual abuse to the provider;

Amendment  135

 

Proposal for a regulation

Article 4 – paragraph 1 – point c c (new)

 

Text proposed by the Commission

Amendment

 

(cc) reinforcing awareness-raising measures and adapting their online interface in order to give user- and child-friendly information about the risk of online child sexual abuse on their services;

Amendment  136

 

Proposal for a regulation

Article 4 – paragraph 1 – point c d (new)

 

Text proposed by the Commission

Amendment

 

(cd) including clearly visible and identifiable information on the minimum age for using the service;

Amendment  137

 

Proposal for a regulation

Article 4 – paragraph 1 – point c e (new)

 

Text proposed by the Commission

Amendment

 

(ce) setting up mechanisms to raise awareness among users of any potential infringement by them of this Regulation.

Amendment  138

 

Proposal for a regulation

Article 4 – paragraph 2 – introductory part

 

Text proposed by the Commission

Amendment

2. The mitigation measures shall be:

2. The mitigation measures shall meet all of the following requirements:

Amendment  139

 

Proposal for a regulation

Article 4 – paragraph 2 – point a

 

Text proposed by the Commission

Amendment

(a) effective in mitigating the identified risk;

(a) they shall be effective and proportionate in mitigating the identified risk, taking into account the characteristics of the service provided and the manner in which that service is used;

Amendment  140

 

Proposal for a regulation

Article 4 – paragraph 2 – point b

 

Text proposed by the Commission

Amendment

(b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological capabilities and the number of users;

(b) they shall be targeted and proportionate in relation to that risk, taking into account the provider’s financial strength, technological and operational capabilities, the number of users and the amount of content that they provide;

Amendment  141

 

Proposal for a regulation

Article 4 – paragraph 2 – point c

 

Text proposed by the Commission

Amendment

(c) applied in a diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected;

(c) they shall be applied in a diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected;

Amendment  142

 

Proposal for a regulation

Article 4 – paragraph 2 – point d

 

Text proposed by the Commission

Amendment

(d) introduced, reviewed, discontinued or expanded, as appropriate, each time the risk assessment is conducted or updated pursuant to Article 3(4), within three months from the date referred to therein.

(d) they shall be introduced, reviewed in light of their effectiveness and adapted in accordance with the state of the art, discontinued or expanded, as appropriate, each time the risk assessment is conducted or updated pursuant to Article 3(4), as soon as possible and in any case within three months from the date referred to therein;

Amendment  143

 

Proposal for a regulation

Article 4 – paragraph 2 – point d a (new)

 

Text proposed by the Commission

Amendment

 

(da) they shall respect the principles of data protection by design and by default, as well as of data minimisation; and

Amendment  144

 

Proposal for a regulation

Article 4 – paragraph 2 – point d b (new)

 

Text proposed by the Commission

Amendment

 

(db) they shall not restrict the possibility to use a service anonymously.

Amendment  145

 

Proposal for a regulation

Article 4 – paragraph 3

 

Text proposed by the Commission

Amendment

3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.

3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, may take the necessary and proportionate age verification measures to reliably identify children on their services, enabling them to take the mitigation measures.

Amendment  146

 

Proposal for a regulation

Article 4 – paragraph 3 a (new)

 

Text proposed by the Commission

Amendment

 

3a. Where providers put in place age verification systems, those systems shall meet the following criteria:

 

(a) Protect the privacy of users and do not disclose or process data gathered for the purposes of age verification for any other purpose;

 

(b) Not collect any data other than the age of the user for the purposes of age verification;

 

(c) Not retain personal data on the age verification process after its completion;

 

(d) Be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse;

 

(e) Provide appropriate remedies and redress mechanisms for users whose age is wrongly identified;

 

(f) Allow selective disclosure of attributes;

 

(g) Use a zero-knowledge protocol;

 

(h) Allow users to use anonymous accounts;

 

(i) Not require the identification of each user of a service;

 

(j) Not require the processing of biometric data.

Amendment  147

 

Proposal for a regulation

Article 4 – paragraph 4

 

Text proposed by the Commission

Amendment

4. Providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures.

4. Providers of hosting services and providers of number-independent interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures.

Amendment  148

 

Proposal for a regulation

Article 4 – paragraph 5

 

Text proposed by the Commission

Amendment

5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board and having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

Amendment  149

 

Proposal for a regulation

Article 4 – paragraph 5 a (new)

 

Text proposed by the Commission

Amendment

 

5a. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board, shall, by [date - 12 months from the date of entry into force of this Regulation], issue guidelines on how providers may implement age verification or age assessment measures in application of paragraph 3a, based on selective disclosure of attributes and a zero-knowledge protocol.

Amendment  150

 

Proposal for a regulation

Article 4 a (new)

 

Text proposed by the Commission

Amendment

 

Article 4a

 

Mitigation measures for platforms primarily used for the dissemination of pornographic content

 

Where an online platform is primarily used for the dissemination of pornographic content, the platform shall take the necessary technical and organisational measures to ensure:

 

a. functionalities according to Article 12(3) enabling users to flag or notify potential online child sexual abuse;

 

b. adequate professional human content moderation to rapidly process notices of potential child sexual abuse material;

 

c. automatic mechanisms and interface design elements to inform users about external resources in the user’s region on preventing child sexual abuse, counselling by specialist helplines, victim support and educational resources by hotlines and child protection organisations;

 

d. automatic detection of searches for child sexual abuse material, warning and advice alerts displayed to users doing such searches, and flagging of the search and the user for human moderation;

 

e. functionalities enabling age verification that meet the criteria of Article 4(3a) of this Regulation.

Amendment  151

 

Proposal for a regulation

Article 4 b (new)

 

Text proposed by the Commission

Amendment

 

Article 4b

 

Mitigation measures for number-independent interpersonal communications services within games

 

Providers of online games that operate a number-independent interpersonal communications service within their games shall take all of the following mitigation measures in addition to the requirements referred to in Articles 3 and 4:

 

1. prevent users from initiating unsolicited contact with other users;

 

2. facilitate functionalities according to Article 12(3) enabling users to flag or notify potential online child sexual abuse;

 

3. provide technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety and that are set to the most private and secure levels by default;

 

4. provide tools in a prominent way on their platform that allow users or their guardians or legal representatives and potential victims to seek help from their local helpline.

Amendment  152

 

Proposal for a regulation

Article 5 – paragraph 1 – introductory part

 

Text proposed by the Commission

Amendment

1. Providers of hosting services and providers of interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:

1. Providers of hosting services and providers of number-independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:

Amendment  153

 

Proposal for a regulation

Article 5 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining risk referred to in Article 3(5);

(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3;

Amendment  154

 

Proposal for a regulation

Article 5 – paragraph 3 – subparagraph 1

 

Text proposed by the Commission

Amendment

Where necessary for that assessment, that Coordinating Authority may require further information from the provider, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than two weeks.

Where necessary for that assessment, that Coordinating Authority may:

 

(a) carry out any consultations with the provider that it deems necessary to determine whether the requirements of Articles 3 and 4 have been met;

 

(b) require further information and clarification from the provider within a reasonable time period set by that Coordinating Authority which shall not be longer than two weeks;

 

(c) request the EU Centre, the competent data protection authorities, another national public authority or relevant experts or entities to provide the necessary additional information.

Amendment  155

 

Proposal for a regulation

Article 5 – paragraph 3 – subparagraph 2

 

Text proposed by the Commission

Amendment

The time period referred to in the first subparagraph shall be suspended until that additional information is provided.

deleted

Amendment  156

 

Proposal for a regulation

Article 5 – paragraph 4

 

Text proposed by the Commission

Amendment

4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to re-conduct or update the risk assessment or to introduce, review, discontinue or expand, as applicable, the mitigation measures, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than one month.

4. Without prejudice to Articles 7 and 27 to 29, where the Coordinating Authority of establishment considers that the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall have the power to address a reasoned decision to the provider requiring it to re-conduct or update the risk assessment or to take the necessary mitigation measures so as to ensure that Articles 3 and 4 are complied with, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than one month.

Amendment  157

 

Proposal for a regulation

Article 5 – paragraph 4 a (new)

 

Text proposed by the Commission

Amendment

 

4a. The provider may, at any time, request the Coordinating Authority of establishment to review and, where appropriate, amend or revoke a decision as referred to in paragraph 4. The Coordinating Authority shall, within three months of receipt of the request, adopt a reasoned decision on the request based on objective factors and notify the provider of that decision.

Amendment  158

 

Proposal for a regulation

Article 5 – paragraph 4 b (new)

 

Text proposed by the Commission

Amendment

 

4b. Where the requirements of Articles 3 and 4 are met, the Coordinating Authority shall issue a positive opinion, which shall be transmitted to the EU Centre and taken into account prior to any decision pursuant to Article 7.

Amendment  159

 

Proposal for a regulation

Article 5 – paragraph 5

 

Text proposed by the Commission

Amendment

5. Providers shall, when transmitting the report to the Coordinating Authority of establishment in accordance with paragraph 1, transmit the report also to the EU Centre.

5. The Coordinating Authority of establishment shall transmit the report referred to in paragraph 1 to the EU Centre, as well as any further information resulting from paragraph 3 and, where applicable, the positive opinion issued in accordance with paragraph 4b.

Amendment  160

 

Proposal for a regulation

Article 5 – paragraph 6

 

Text proposed by the Commission

Amendment

6. Providers shall, upon request, transmit the report to the providers of software application stores, insofar as necessary for the assessment referred to in Article 6(2). Where necessary, they may remove confidential information from the reports.

6. Providers shall, upon request, transmit the report to the providers of software application stores, insofar as necessary for the compliance with the obligations set out in Article 6. Where necessary, they may remove confidential information from the reports.

Amendment  161

 

Proposal for a regulation

Article 6 – paragraph 1 – introductory part

 

Text proposed by the Commission

Amendment

1. Providers of software application stores shall:

1. Providers of software application stores considered as gatekeepers under Regulation (EU) 2022/1925 shall, based on the information provided by the providers of software applications:

Amendment  162

 

Proposal for a regulation

Article 6 – paragraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;

(a) indicate that the provider of the software application does not permit its use by children or that the software application has an age rating model in place;

Amendment  163

 

Proposal for a regulation

Article 6 – paragraph 1 – point b

 

Text proposed by the Commission

Amendment

(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;

(b) when, according to Union law, parental consent is required for children to access the software application, make reasonable efforts to verify that the consent is given or authorised by the holder of parental responsibility over the child, taking into consideration the available technology.

Amendment  164

 

Proposal for a regulation

Article 6 – paragraph 1 – point c

 

Text proposed by the Commission

Amendment

(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).

deleted

Amendment  165

 

Proposal for a regulation

Article 6 – paragraph 2

 

Text proposed by the Commission

Amendment

2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.

2. Providers of software application stores considered as gatekeepers under Regulation (EU) 2022/1925 may, when the provider of a software application has indicated to the provider of the software application store that it does not permit its use by children, take additional measures to implement those restrictions on children, including reasonable measures to prevent children from accessing those software applications. When putting in place age verification systems, providers of software application stores shall meet the criteria set out in Article 4(3a) of this Regulation.

Amendment  166

 

Proposal for a regulation

Article 6 – paragraph 3

 

Text proposed by the Commission

Amendment

3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.

3. Where software application stores take measures under this Article, those software application stores shall not be exempted from the obligations set out in this Regulation.

Amendment  167

 

Proposal for a regulation

Article 6 – paragraph 4

 

Text proposed by the Commission

Amendment

4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 and 2, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.

Amendment  168

 

Proposal for a regulation

Article 7 – paragraph 1

 

Text proposed by the Commission

Amendment

1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.

1. The Coordinating Authority of establishment shall have the power, as a last resort after all the measures in Articles 3, 4 and 5 have been exhausted, to request the competent judicial authority of the Member State that designated it to issue a detection order requiring a provider of hosting services or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect child sexual abuse material on a specific service.

 

The detection order shall be targeted, specified and limited to individual users or a specific group of users, either as such or as subscribers to a specific channel of communication, in respect of whom there are reasonable grounds of suspicion of a link, even an indirect one, with child sexual abuse material as defined in Article 2.

 

Interpersonal communications to which end-to-end encryption is, has been or will be applied shall not be subject to the measures specified in Article 10.

 

Detection orders shall be addressed to the service provider acting as controller in accordance with Regulation (EU) 2016/679. By way of exception, the detection order may be directly addressed to the service provider that stores or otherwise processes the data on behalf of the controller, where:

 

(a) the controller cannot be identified despite reasonable efforts on the part of the issuing authority; or

 

(b) addressing the controller might be detrimental to an ongoing investigation.

Amendment  169

 

Proposal for a regulation

Article 7 – paragraph 2 – subparagraph 1

 

Text proposed by the Commission

Amendment

The Coordinating Authority of establishment shall, before requesting the issuance of a detection order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met.

Based on a reasoned justification, the Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority shall issue the detection order where it considers that all of the following conditions are simultaneously met:

 

(a) there are reasonable grounds of suspicion concerning individual users, or a specific group of users, either as such or as subscribers to a specific channel of communication, in respect of whom there is a link, even an indirect one, with child sexual abuse material as defined in Article 2. Reasonable grounds of suspicion are those resulting from any reliable and legally acquired information suggesting that individual users, or a specific group of users, either as such or as subscribers to a specific channel of communication, might have a link, even an indirect or remote one, with online child sexual abuse material.

 

(b) the mitigation measures put in place by the provider have insufficient material impact on limiting the identified risk or the service provider fails to put in place reasonable and proportionate mitigation measures set out in this Regulation.

 

(c) issuing the detection order is necessary and proportionate and outweighs negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties, and without jeopardising the security of communications.

Amendment  170

 

Proposal for a regulation

Article 7 – paragraph 2 – subparagraph 2

 

Text proposed by the Commission

Amendment

To that end, it may, where appropriate, require the provider to submit the necessary information, additional to the report and the further information referred to in Article 5(1) and (3), respectively, within a reasonable time period set by that Coordinating Authority, or request the EU Centre, another public authority or relevant experts or entities to provide the necessary additional information.

deleted

Amendment  171

 

Proposal for a regulation

Article 7 – paragraph 3 – subparagraph 1 – introductory part

 

Text proposed by the Commission

Amendment

Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met, it shall:

Where the Coordinating Authority of establishment takes the view that all the conditions of paragraph 2 have been met, it shall:

Amendment  172

 

Proposal for a regulation

Article 7 – paragraph 3 – subparagraph 1 – point a

 

Text proposed by the Commission

Amendment

(a) establish a draft request for the issuance of a detection order, specifying the main elements of the content of the detection order it intends to request and the reasons for requesting it;

(a) establish a draft request to the competent judicial authority of the Member State that designated it for the issuance of a detection order, specifying the factual and legal grounds upon which the request is based and the duration of the order, as well as the main elements of the content of the detection order it intends to request and the reasons for requesting it;

Amendment  173

 

Proposal for a regulation

Article 7 – paragraph 3 – subparagraph 1 – point d

 

Text proposed by the Commission

Amendment

(d) invite the EU Centre to provide its opinion on the draft request, within a time period of four weeks from the date of receiving the draft request.

(d) invite the EU Centre, and in particular its Technology Committee, to provide its opinion on the draft request, within a time period of four weeks from the date of receiving the draft request.

Amendment  174

 

Proposal for a regulation

Article 7 – paragraph 3 – subparagraph 2 – introductory part