
Facebook HQ
1 Hacker Way
Menlo Park, CA 94025

Dear Mr Zuckerberg and Mr Clegg,

We the undersigned are civil society organisations and individuals deeply concerned about Meta’s failure to stem the flood of online hate and incitement to violence in the Ethiopian war. This war is the world’s deadliest - with up to 600,000 dead - and time and again your services have helped to spread messages of hate and incite violence. By failing to invest in and deploy adequate safety improvements to your software or employ sufficient content moderators, Meta is fanning the flames of hatred, and contributing to thousands of deaths in Ethiopia.

We stand in solidarity with Fisseha Tekle of Amnesty International, Abrham Meareg, an Ethiopian academic, and the Katiba Institute, who are taking Facebook to court in Kenya – Facebook’s East African HQ – to say ‘enough’, and to demand urgent action to save lives. Abrham’s father was doxed in a racist attack on Facebook – then murdered shortly afterwards. Abrham tried desperately to get Facebook to take the posts down – to no avail. We stand with them and support their demands for justice.

Such hatred spreading on Facebook is not new. Your own documents suggest Facebook has long known it was fuelling conflict in Ethiopia. Frances Haugen warned, in testimony to US Senators, that Facebook is “literally fanning ethnic violence” in Ethiopia. It is over a year since your own Oversight Board recommended that Meta commission “an independent human rights due diligence assessment on how Facebook and Instagram have been used to spread hate speech and unverified rumours that heighten the risk of violence in Ethiopia”. None of these warnings, nor the evidence behind them, has spurred you to act. Facebook’s failures follow similar, well-documented failures in other countries, including Myanmar, India and Sri Lanka. They form part of a pattern of behaviour in which Meta persistently fails to protect users in non-English-speaking countries.

Facebook’s algorithmic systems have been repeatedly shown to disproportionately spread and amplify the most harmful forms of content, including advocacy of hatred and incitement to violence – with deadly consequences in conflict-affected settings such as Ethiopia. You have repeatedly failed to detect hate speech, and your moderation capabilities are woefully inadequate. At this crucial time, when you should be investing in your moderation capacity, we are alarmed to see that your largest provider of content moderation services in East Africa has reduced its staff capacity even further.

There is no need for Meta to await judgement in the court case to prevent further harm in Ethiopia. We call on you immediately to:

  1. Unplug the hate machine. Facebook’s News Feed and algorithmic systems, designed to promote engagement above all else, have been repeatedly shown to spur hatred and violence. Facebook must take immediate steps to redesign its algorithms to make them safe, embedding human rights principles in their design and operation.
  2. Action “Break the Glass” and ‘friction’ measures. During the US Capitol riots in January 2021, Facebook took specific “break the glass” measures to make the platform safer. Only some of these have been taken in Ethiopia – but the crisis warrants a full suite of emergency measures. Moreover, measures aimed at mitigating human rights risks should be embedded as the norm, not applied only as an exception once severe damage has already been done. These include ‘friction’ measures which studies have shown to be effective at improving human rights outcomes, e.g., limits on resharing, message forwarding, and group sizes.
  3. Employ enough content moderators to serve all language markets moderated out of the Nairobi hub, and globally, and treat those vital workers with the respect and care they deserve.
  4. Be fully transparent with regulators and the public. Facebook must set out exactly what measures it is taking in Ethiopia - including publishing any internal risk assessments and harm prevention measures.
  5. Establish a reparation fund for victims of the Ethiopian conflict who have suffered harms to which Meta contributed, and formally apologise for Facebook’s role in fuelling the war.

To be clear: pulling the plug on Facebook in Ethiopia is not the answer. Nor do we accept a false binary choice between a Facebook without proper protections and no Facebook at all. Instead, we call on you to stop profiting from the spread of hate, and to take responsible decisions that provide users with social media platforms which connect us, rather than divide us.

We await your public response.


Individual Signatories

Daniel Motaung, former Facebook content moderator

Frances Haugen, Facebook whistleblower

Ron Deibert (O.C., O.Ont), Professor of Political Science and Director of the Citizen Lab at the University of Toronto’s Munk School of Global Affairs & Public Policy

Dr. Sarah T. Roberts, co-director of the Center for Critical Internet Inquiry (C2i2) and Assistant Professor of Information Studies at the University of California, Los Angeles (UCLA)

Phumzile van Damme, former Member of the National Assembly of South Africa, expert in counter-electoral disinformation, digital rights and platform accountability in Africa

Dr. Safiya Noble, internet studies scholar and Professor of Gender Studies and African American Studies at the University of California, Los Angeles (UCLA)

Dr. Susie Alegre, international human rights lawyer and author

Dr. Cory Doctorow (h.c.), Special Advisor to the Electronic Frontier Foundation (US); Co-founder of the Open Rights Group (UK); Visiting Professor of Computer Science, Open University (UK); Visiting Professor of Practice in Library and Information Science, University of North Carolina (US); Research Affiliate, MIT Media Lab (US)