Frances Haugen to MEPs: EU digital rules can be a game changer for the world 

Press release
 
 


  • Facebook has put its immense profits before people, damaging the health and safety of users, and threatening our democracies  
  • The EU has a historic opportunity to set global standards and inspire other countries  
  • More transparency needed to scrutinise online platforms and make informed choices  

The EU’s future Digital Services Act can set global standards for transparency, oversight and enforcement, Facebook whistleblower Frances Haugen told MEPs.

The Digital Services Act (DSA) has the potential to be a “global gold standard” and inspire other countries to “pursue new rules that would safeguard our democracies”, Ms Haugen stressed in today’s hearing. She warned, however, that the rules need to be strong on transparency, oversight and enforcement, otherwise “we will lose this once-in-a-generation opportunity to align the future of technology and democracy”.

Protecting users’ rights and increasing accountability

Ms Haugen’s revelations about Facebook’s practices and their impact on users and their fundamental rights were troubling to MEPs. They expressed concerns about, among other issues, the exploitation of children’s and teenagers’ mental health and about micro-targeting, including for political purposes. Questions focused on how to make platforms more accountable and on whether the risk assessment and risk mitigation provisions in the proposed Digital Services Act (DSA) are strong enough to prevent abuses and polarisation and to address risks to democracy.

Members also asked Ms Haugen for her views on regulating not only illegal but also harmful content, on content moderation tools, and on whether targeted advertising should be banned. They also wanted to know what safeguards she would like to see included in EU digital laws, and whether the package currently on the table is sufficient. Other issues addressed at the hearing included enforcement tools to make sure the DSA has teeth, the transparency of algorithms, and giving academic researchers, NGOs and investigative journalists access to platforms’ data.

Disclose data and make algorithms safer

In her replies, Ms Haugen emphasised the importance of requiring companies like Facebook to publicly disclose their data and how they collect it (on content ranking, advertising and scoring parameters, for example), so that people can make informed choices, and of prohibiting “dark patterns” online. Individuals in these companies, not committees, should be held personally accountable for the decisions they make, she added.

On countering disinformation and demoting harmful content, Ms Haugen stressed that Facebook is substantially less transparent than other platforms and could do much more to make its algorithms safer: by limiting how many times content can be reshared, supporting more languages, carrying out transparent risk assessments, making platforms more human-scaled, and finding ways for users to moderate each other rather than being moderated by artificial intelligence. She commended lawmakers for their content-neutral approach, but warned against possible loopholes and exemptions for media organisations and trade secrets.

During her presentation, Ms Haugen also mentioned how crucial it is for governments to protect tech whistleblowers, as their testimonies will be key to protecting people from harm caused by digital technologies in the future.

The video recording of the hearing will be available here.

The hearing was organised by the European Parliament’s Internal Market and Consumer Protection Committee, in association with other committees: the Industry, Legal Affairs, and Civil Liberties committees, and the special committees on Disinformation and Artificial Intelligence.

Work on regulating platforms is under way in Parliament

The Internal Market and Consumer Protection Committee is currently discussing how the proposal on the Digital Services Act, presented by the European Commission in December 2020, should be amended and improved. Ms Haugen’s presentation will feed into the work of the committee on the DSA, ahead of the vote (date to be decided soon). This legislation is Europe’s chance to shape the digital economy at the EU level as well as to become a global standard-setter on digital regulation.

Background

Ms Frances Haugen is a former Facebook employee specialised in computer engineering and, specifically, in algorithmic product management. At Facebook, she worked as Lead Product Manager on the Civic Misinformation team, which looked at election interference around the world and worked on issues related to democracy and misinformation. Facebook disbanded the team after the 2020 U.S. election, and Ms Haugen contacted the Wall Street Journal shortly afterwards.

Ms Haugen disclosed thousands of internal documents that she collected while working for Facebook. Among the most striking findings backed by the leaked documents is that the use of Instagram seriously damages teenagers’ mental health, particularly by fostering eating and body image disorders. More generally, the leaked documents show that Facebook’s public claims on a variety of topics - including, beyond mental health, its work on hate speech and freedom of speech - often contradict its internal research. Overall, Ms Haugen claims that Facebook (which also owns other widely used social media platforms such as Instagram) intentionally does not make its platforms safer for users because doing so would affect its profits.