Parliament has approved new rules enabling online providers to continue to voluntarily detect, remove and report child sexual abuse material online.
According to Europol, the Covid-19 pandemic has led to a considerable increase in child sexual abuse online, which was already at high levels.
Online child abuse and cyber-grooming during the pandemic
As a result of the lockdown measures, children have been spending more time online, often unsupervised, making them more vulnerable to exploitation. Sexual abuse offenders have taken advantage of the situation to access potential victims. There has also been a rise in sextortion incidents and cyber-grooming, which involves befriending a child online with the aim of committing sexual abuse.
Enabled by digital technologies, offenders can reach children via webcams, connected devices and chat rooms in social media and video games, while remaining anonymous thanks to technologies like cloud computing and the dark web. The use of such technologies by offenders has made it more difficult for law enforcement authorities to detect, investigate and prosecute child sexual abuse online.
According to the Internet Watch Foundation’s annual report, internet service providers in Europe have become the largest hosts of child sexual abuse material in the world.
Tackling online child abuse, while protecting privacy
On 6 July Parliament backed temporary rules allowing the providers of web-based email, chat and messaging services to detect, remove and report child sexual abuse material online on a voluntary basis, as well as to use scanning technologies to detect cyber-grooming.
Online material linked to child sexual abuse could be detected through so-called hashing technologies that scan content, such as images and videos, while artificial intelligence could be used to analyse text or traffic data and detect online grooming. Audio communications are excluded from the rules.
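The hash-matching approach described above can be illustrated with a minimal sketch. This is not the actual technology used by providers (real deployments rely on robust perceptual hashing such as Microsoft's PhotoDNA, which matches visually similar images rather than only byte-identical files); the hypothetical `KNOWN_HASHES` set and helper functions below simply show the principle that content is compared against a catalogue of digests without the service interpreting the content itself.

```python
import hashlib


def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical catalogue of digests of previously identified material
# (illustrative placeholder value only).
KNOWN_HASHES = {sha256_of(b"example flagged content")}


def matches_known_material(data: bytes) -> bool:
    """Flag content whose digest appears in the known-hash catalogue.

    The check never inspects what the content depicts; it only tests
    whether the digest matches a previously catalogued item, which is
    why such scanning is described as detecting patterns rather than
    understanding substance.
    """
    return sha256_of(data) in KNOWN_HASHES
```

Because a plain cryptographic hash changes completely if even one byte of the file changes, production systems use perceptual hashes so that resized or re-encoded copies of known material still match.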
According to the report, the material will have to be processed using technologies that are the least intrusive to privacy and will not be able to understand the substance of the content but only to detect patterns. Interactions that are covered by professional secrecy, such as between doctors and their patients, will not be interfered with.
In addition, when no online child sexual abuse has been detected, all data will have to be erased immediately after processing, and all data will be permanently deleted within three months.
The rules’ approval follows an informal agreement with the Council on 29 April 2021. The legislation will apply for a maximum of three years. In July 2020, the Commission announced that it would propose a more permanent solution to combat child sexual abuse online in the course of 2021.