In a press conference today, a coalition of activists from Myanmar, Syria, and six other countries called on Facebook to take a more consistent and transparent approach to moderation. Facebook has come under fire for its role in fueling a genocide in Myanmar as well as enabling broader political manipulation around the world.
“Many of the countries here have been engaged with Facebook for years to try to receive justice in our communities,” said Thenmozhi Soundararajan, executive director of Equality Labs. “And what we’re finding is that Facebook has different standards for different markets.”
The group also includes activists from Bangladesh, Vietnam, Sri Lanka, India, the Philippines, and Ethiopia, all countries where Facebook has sought to expand its user base. “We are the next billion users of the internet, so we’re going to wield our power, and it really begins today,” said Soundararajan. “We’re not on Facebook’s timeline anymore. We’re on our timeline.”
The new coalition comes less than two months after activist groups in Myanmar sent a public letter to Mark Zuckerberg calling for greater transparency and local engagement. Zuckerberg has publicly acknowledged the situation, telling Congress, “What’s happening in Myanmar is a terrible tragedy, and we need to do more.”
The coalition is particularly notable because it pushes back against Facebook’s usual approach of tackling moderation issues on a country-by-country basis. “The point of siloing moderation out across different countries is to set different benchmarks as to what they’re willing to change,” Soundararajan said.
When asked if the private country-by-country conversations were productive, she replied, “Have you ever had a productive conversation with Facebook?”
The coalition set out three specific demands, calling for sustained transparency, an independent and worldwide public audit, and a public commitment to equal enforcement of standards across all the countries Facebook operates in. In particular, the coalition asked for localized publication of moderation guidelines similar to recent efforts in the United States.
The issue is a particular concern for African activists, given upcoming elections in Togo and the Democratic Republic of Congo. Berhan Taye, an activist and researcher in Ethiopia, said she hasn’t heard any outreach from Facebook on the topic. “We know that Facebook is going to be weaponized,” Taye said. “We haven’t had any clear answer on Facebook on what happened in the Kenyan election and what will happen in the upcoming elections.”
In Sri Lanka, activists argued that the lack of local moderators — specifically moderators fluent in the Sinhalese language spoken by the country’s Buddhist majority — had allowed hate speech to run wild on the platform. The group also called on Facebook to make public its list of slurs deemed locally unacceptable on the platform, out of concern that the list might be skewed against the country’s Muslim minority.
“We’d like to know who is moderating the content,” said a Sri Lankan activist. “If it’s someone aligned with these extremist groups, then that is a problem because this content is going to stay online.”