Answer given by Mr Breton on behalf of the European Commission
The Digital Services Act (DSA) introduces several safeguards to protect user rights and to ensure that intermediary services, notably online platforms, act responsibly when moderating content.
A new due diligence obligation requires providers of all intermediary services to apply any restrictions included in their terms and conditions in a diligent, objective and proportionate manner, with due regard to the rights and legitimate interests of all parties involved, including freedom of expression.
While the Commission is not in a position to comment on individual cases, the discriminatory or disproportionate application of restrictions would be incompatible with the DSA.
Very large online platforms will be subject to a risk mitigation mechanism to address systemic risks stemming from their services, starting four months after their designation as a very large online platform by a Commission decision.
If Meta is designated, it will also have to comply with these enhanced obligations and, where it identifies risks, may need to take appropriate mitigation measures, such as adapting its terms and conditions and their enforcement, as well as its content moderation processes, including any relevant decision-making processes and dedicated resources.
The DSA will require all providers of hosting services to give users a clear statement of reasons when imposing certain restrictions, such as restricting the visibility of content or suspending the service, on the grounds that the content provided is illegal or incompatible with their terms and conditions.
This will allow users to make meaningful use of the free-of-charge internal complaint-handling mechanism that the DSA requires, and they may initiate out-of-court dispute settlement or turn to the courts if they disagree with a platform's decision.
-  COM(2020) 825 final.