How the EU is fighting child sexual abuse online

The European Parliament wants to establish effective rules to prevent and combat online child sexual abuse while protecting people’s privacy.

The volume of online material depicting children engaging or appearing to engage in sexual acts continues to grow, particularly material depicting younger children. In 2023, there were over 36.2 million reports of suspected online child sexual abuse, marking a historic high.

Developing EU legislation on child sexual abuse

The EU has adopted a strategy on combating child sexual abuse. As part of this commitment, the Commission aims to build on the existing rules from 2011. In November 2023, Parliament’s civil liberties committee adopted a report on a proposal for a regulation aiming to prevent and combat child sexual abuse.

Provisional rules from 2021 allow digital companies to scan content posted on their platforms for child sexual abuse material. The rules provide a temporary exemption from certain EU e-Privacy rules. The proposal that Parliament is working on seeks to establish permanent rules on how companies can detect child sexual abuse material online.

A separate proposal for a directive put forward by the European Commission in 2024 addresses emerging threats linked to technological developments such as live-streaming and the use of artificial intelligence in creating child sexual abuse material.

Safeguarding privacy


The European Parliament wants to strike a balance between safeguarding children in the digital sphere and upholding fundamental rights such as the right to privacy. MEPs’ position on the regulation on combating child sexual abuse does not endorse widespread web scanning, blanket monitoring of private communications or the creation of backdoors in apps to weaken encryption.

Providers’ duties: risk assessment and mitigation

According to the proposed regulation, providers of hosting or interpersonal communication services would be obliged to perform a risk assessment of the potential presence of sexual content involving children on their services. Once the providers have identified the level of risk, they must implement mitigation measures to address it.


The regulation would provide an extensive list of potential mitigation measures that providers can opt to implement. These include the principle of safety by design (developing products or services in a way that avoids potential harm), mandatory parental controls, the establishment of user reporting mechanisms, and the use of age verification systems when there is a risk of child solicitation.


The regulation would also introduce specific mandatory mitigation measures for services directly targeting children, platforms primarily used for the dissemination of pornographic content, and certain chat services within games.


Service providers would have the autonomy to choose the technologies they would use to fulfil their detection obligations. The rules foresee a simplified procedure for smaller businesses.

Detection orders as a measure of last resort

If providers fail to meet their obligations, a judicial authority would be able to issue a detection order only as a last resort. This order would compel the provider to employ certain technologies to detect known and new child sexual abuse material.


Detection orders would be used only if there is reasonable suspicion that individual users or groups are linked to child sexual abuse material. The orders would be time-limited, with end-to-end encrypted communication and text messages excluded from their scope. This approach aims to ensure that the privacy and security of users of digital services are maintained.

Support for victims and survivors

The proposal for a regulation on combating child sexual abuse includes the establishment of an EU Centre for Child Protection. The centre would receive, filter, assess, and forward reports of child sexual abuse content to competent national authorities and Europol. It would also support national authorities, conduct investigations and issue fines.


The Commission’s proposal includes specific rights for victims to request information on online material depicting them and the right to request the removal of this content. Parliament expands these rights to include the right to receive support and assistance from the EU Centre for Child Protection as well as authorities at the national level.

Extension of temporary rules

Parliament approved its negotiation mandate in late November 2023 and is ready to enter into negotiations with the EU countries to determine the final text of the law.


The temporary rules exempting digital companies from e-privacy rules when they look for child sexual abuse material were set to expire in August 2024. To avoid a legal vacuum, Parliament and Council agreed in February 2024 to extend the derogation until April 2026. This provisional agreement was formally adopted by Parliament in April 2024.


At the same time, the co-legislators aim to reach an agreement on the long-term legal framework and avoid further extensions to the temporary derogation.

EU response to new technological risks

To address the evolving threat of online child sexual abuse, Parliament is also advancing legislation targeting the misuse of new and emerging technologies in such crimes.


In June 2025, the European Parliament adopted its position on the proposal that would explicitly criminalise the use of artificial intelligence systems developed or adapted primarily for child sexual abuse, as well as the live-streaming and online dissemination of child sexual abuse material.


The proposal would also strengthen law enforcement capabilities by permitting undercover operations and the use of covert surveillance techniques, such as digital “honeypots”, which are decoy online environments or profiles used by law enforcement to attract and identify offenders.


These measures aim to equip authorities to respond more effectively to increasingly sophisticated forms of online abuse. The next step is for Parliament to begin negotiations with the Council to finalise the legislation.