- Terrorist content must be removed within one hour, with additional safeguards
- Clarification of the scope and exceptions for educational and journalistic purposes
- No general obligation to monitor or filter content
- Service providers exposed to terrorist content to take specific measures
On Monday, Civil Liberties Committee MEPs confirmed their agreement to a new instrument to address the dissemination of terrorist content online, with 54 votes in favour, 13 against and 1 abstention.
The new legal act will establish a uniform definition of terrorist content, aligned with the definitions in the Directive on combating terrorism. It will target material such as texts, images, sound recordings or videos, including live transmissions, that incite, solicit or contribute to terrorist offences, provide instructions for such offences, or solicit people to participate in a terrorist group. It also aims to combat content that provides guidance on how to make and use explosives, firearms and other weapons for terrorist purposes.
Rule to remove terrorist content within one hour
Internet platforms will have to remove terrorist content or disable access to it in all member states as soon as possible, and in any event within one hour of receiving a removal order from the competent authorities. The competent authority in any member state will be able to issue a removal order to any hosting service provider offering services within the EU. The competent authorities in the member state where the service provider has its main establishment will have the right to scrutinise the removal order and block its execution if they consider that it seriously or manifestly violates the regulation itself or breaches fundamental rights as enshrined in the Charter. Member states will adopt rules on penalties for breaches of these obligations, taking into account the nature of the breach and the size of the company.
Exceptions for educational and journalistic purposes
If material is disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes to prevent or counter terrorism, it will not be considered terrorist content. This also includes content expressing polemic or controversial views in a public debate on sensitive political questions.
No general obligation to monitor or filter content
Internet platforms will have no general obligation to monitor or filter content. However, if they are exposed to terrorist content, they will have to take specific measures to protect their services against its dissemination. The service provider will decide on those measures and there will be no obligation to use automated tools. Service providers will also need to publish annual transparency reports on action taken against the dissemination of terrorist content.
Both the Council and the Parliament still have to complete the formal legislative procedure for adoption at second reading before this regulation can enter into force. The regulation will apply 12 months after its entry into force.