
Facebook’s Ethiopian quandary

As relatives of victims of Ethiopia’s recent civil war accuse Meta in a Kenyan court of failing to adequately police incendiary speech on Facebook during the armed conflict, much greater effort from the company is warranted, write William Davison, Jane Esberg and Alessandro Accorsi, though they caution that Meta’s task is hardly straightforward

IN December 2022, the son of slain Ethiopian Professor Meareg Amare was among those who filed a lawsuit against Meta, the owner of Facebook. The constitutional petition, which was filed in Kenya – home to a Meta moderating hub – alleged that the company had evinced a “woeful failure to address violence on the platform” and held it responsible for the November 2021 murder of Meareg, an ethnic Tigrayan living in Bahir Dar, the capital of Amhara region.


His son, Abraham, said users of Facebook, whose parent company was renamed Meta in October 2021, had incited violence against his father, leading to his murder. Abraham and his fellow plaintiffs want Meta to apologise for failing to remove offending posts; contribute around $2.4 billion to funds for victims of hate on Facebook; alter its algorithm so the platform does not spread harmful content; and make its content moderation equitable across languages. Meta said in response that it had invested heavily in content moderation.

Meareg is just one of hundreds of thousands of Ethiopians who over the past few years have lost their lives to the country’s sectarian violence. The bulk of this carnage can be traced to the civil war centred on the Tigray region in the country’s north, which began in November 2020, when a constitutional dispute and power struggle between Tigray’s leaders and the federal government spiralled into conflict.

A November 2022 peace deal between the federal government and the Tigray People’s Liberation Front (TPLF) has brought a welcome respite from the violence in and around Tigray. But peace in the north remains fragile, and Ethiopia continues to face violence and instability elsewhere – particularly in Oromia, where the government is trying to crush an ethnonationalist insurgency.

Whatever the merits of the family’s legal claims, Meareg’s story highlights concerns about the complicated role that Facebook and other social media platforms have played in this violence. While these platforms can be useful in conflict situations – for example, by helping share information that can protect civilians and document human rights abuses – they can also amplify and accelerate existing tensions and serve as a space for targeted attacks on marginalised groups, including women and girls.

Meta has a history of underinvesting in moderating content in multilingual Ethiopia, just as it did in Myanmar, where the platform was criticised for the role it played prior to the state’s expulsion of Rohingya Muslims in 2017. To be fair, particularly in countries buffeted by conflict, there are issues that no algorithm or investment can fully address – including the challenges of short-order fact-checking in a complex ethno-political landscape.

So, what should be done? Meta can and should do more to address incendiary content on the platform, including through improved language capacities, larger moderation teams, more research transparency and algorithmic changes that do more to demote potentially inflammatory content. Ultimately, however, critics of Meta need to think more broadly in terms of solutions.

Ethiopia needs a healthy, independent media – something successive Ethiopian governments have impeded. Without it, Meta will struggle to separate fact from misinformation.

Of course, the absence of a healthy press also deprives Ethiopians of reliable information about their communities and government; having reliable news sources would better enable them to hold their leaders to account. For these reasons, building up a capacity for rigorous independent reporting should be part of any strategy for managing social media risks in Ethiopia – and Meta should contribute to that effort.

Given Ethiopia’s large and active online community and contentious politics, it seems clear that Meta has not adequately invested in its detection and moderation infrastructure there. Part of the problem may be the way that the company approaches new markets. While it operates in more than 110 languages, Meta offers content moderation in only 70 of them, usually adding new languages under crisis conditions.

In her testimony before the U.S. Senate in October 2021, Facebook whistle-blower Frances Haugen claimed that the company’s lack of investment in Ethiopia allowed for problematic content to be widely disseminated. According to Facebook documents that were leaked by Haugen, the company’s hate speech algorithm – responsible for detecting 97 per cent of the hate speech removed from the platform – could not adequately detect harmful posts in Amharic or Oromo, Ethiopia’s most widely used languages.

Facebook has since made changes, though their extent and value are still unclear.

Haugen’s leak showed the company makes investments in specific countries according to a tiering system that assesses the potential for violence and inflammatory content. When the alarm bell goes off that a country has crossed a certain risk threshold, Meta puts together a quick response team.

These teams, called Integrity Product Operations Centres, can be operational for up to a few months – which, of course, is not a long-term solution. As a source familiar with Meta’s approach put it to Crisis Group, the company’s strategy runs the risk that the mobilisation of resources only happens when it is already too late. There is also a question about whether its internal resources are adequate to deal with a complex, multilingual and polarised conflict scenario at any stage of development.

Greater transparency with respect to the company’s tiering system and other internal decision-making would help outsiders assess its approach to these issues. It is only thanks to Haugen’s leaks that researchers know Ethiopia ranked high as of early 2020. They do not know where Meta forecasts the next crisis is looming.

The limits of Meta’s tiering approach can be seen in how it has operated in Ethiopia. Meta only started working on improving the algorithm’s ability to detect problematic content in Amharic and the Oromo language at the end of 2020 – around 15 years after first becoming operational in the country.

It now supports Amharic, Oromo, Tigrinya and Somali. According to sources close to Meta who spoke to Crisis Group, human content moderators familiar with at least two Ethiopian languages were only hired at some point in 2022 – two years after the outbreak of a brutal civil war that had by then claimed tens of thousands of lives.

This is not to say that Meta is facing an easy task in Ethiopia. The challenges of responsibly operating a social media platform amid Ethiopia’s conflict are enormous.

The hurdles include the difficulty of real-time fact-checking and the complexity of Ethiopia’s ethno-political landscape. With fast-moving conflict dynamics and few sources that can provide verified information, it is particularly hard to gauge the difference between unproven information and false information that incites violence and should therefore be removed from the website.

Meta has also understandably struggled with decisions about whom to de-platform in Ethiopia and under what circumstances. The company identifies three categories of “dangerous organisations” whose accounts it may remove.

The first comprises organisations and their supporters “organising or advocating for violence against civilians, repeatedly dehumanising or advocating for harm against people based on protected characteristics”. The second is for “violent non-state actors” who generally do not target civilians. The third is for groups that routinely use hate speech and are deemed likely to use violence.

This is tricky terrain in Ethiopia. The country is home to many identity-based political constituencies that frequently claim to be victims of violence at the hands of various groups. The prevalence of such claims complicates efforts by an outside party like Meta to adjudicate which groups are in fact “dangerous”. Decision-making invariably involves a degree of subjectivity.

Further complicating Meta’s situation in Ethiopia is the weakness of the country’s media landscape. Decades of authoritarian rule made it difficult for a free and rigorous press to emerge.

After a diverse coalition of insurgents overthrew the military dictatorship in 1991, a TPLF-led government took charge and established a system of ethnic federalism. A plethora of communally divided media outlets emerged in the aftermath.

These outlets operate in different languages, furthering the divergence in perspectives. More recently, the proliferation of partisan satellite television, Telegram and YouTube channels has only exacerbated this dynamic.

This points to a broader problem plaguing Meta and other social media companies in countries without a healthy media. There are, of course, partisan outlets around the globe. In more established democracies, however, there are generally also professional news organisations with high verification standards, and these entities help define the bounds of what is accepted as accurate.

Without a basis for knowing the facts, Meta will struggle – especially when under time pressure – to identify whether unverified rumours are providing important factual information or are fabrications aimed at heightening tensions. A few fact-checking initiatives, such as the aptly named Ethiopia Check, have sprung up.

What is most needed, however, is a strong, autonomous domestic press that strives for accuracy and objectivity. Of course, Ethiopia’s longstanding authoritarian political culture, which often punishes dissent and discourages debate, is an impediment to the emergence of independent media. Meta therefore faces a particular challenge in places like Ethiopia, where the absence of an independent press makes people rely heavily on social media for information.

Preserving social media’s crucial role in providing information about conflict in Ethiopia while improving the quality of that information will require changes to happen online and offline. In the online sphere, Meta should continue to strengthen content moderation and reduce the reach of potentially inflammatory content.

Meta should expand its language capacities, both by increasing the number of moderators and improving its hate speech algorithm so that inflammatory posts are more readily demoted. Meta should also expand consultations with, and where needed increase support to, organisations that can help define standards for hate speech, including content that targets women and girls.

But while the company says it will maintain its content moderation operations, for now there are reasons for concern about how that pledge will hold up. The third-party contractor that operated the Nairobi hub shut it in January 2023, and Meta has shed jobs globally amid reduced revenues.

Transparency is also important. Two years ago, leaks from inside the company revealed that Ethiopia ranked high in Meta’s tiering system. To this day, nobody outside Meta knows what steps it took to reduce the online circulation of problematic content as a result of that determination.

The Oversight Board was told that improvements were made to the algorithm, that human moderators were hired and that language classifiers were developed. But it cannot confirm the figures, the underlying data or even the number of takedowns. As a result, the extent of these investments, their timeline and their impact cannot be assessed independently.

At the very least, the company should give researchers and the Oversight Board greater access to data and information that would allow them to evaluate whether the measures Facebook has taken are effective, and offer recommendations for how they could be more so. Such moves would also help increase trust in Meta.

Still, neither content moderation nor algorithmic adjustments nor other changes to community standards will by themselves address the many ways in which online content can contribute to violence. Responsible management of those risks requires offline changes as well. Strengthening traditional local media, including through developing less partisan news sources, is crucial for the flow of reliable on-the-ground information – including on Facebook.

While the development of such media depends mostly on local factors, Meta has a supporting role to play in this effort. In the US, Meta’s promotion of “trusted” and local news sources – or “high-quality reporting” – has been central to its goals of reducing polarising content. This is, of course, highly dependent on independent sources being identifiable and available, and the Ethiopian government’s repressive policies over the course of decades have discouraged their proliferation and growth.

That said, Meta can promote and, where appropriate, support local reporters, fact-checkers and citizen journalists who report in a relatively unbiased way. Boosting their accounts and providing technical and material aid can improve the quality of news online.

Identifying these individuals will require consultations with civil society groups across the country. In Ethiopia, government support for a free press, rather than continued criminalisation of dissent, will be an essential element of improving coverage.

Some may see Ethiopia’s sweeping 2020 hate speech law as a means of addressing online incitement, but it has failed so far to have a noticeable impact. In fact, it may well also discourage legitimate dissent.

Sadly, there is no silver bullet for Ethiopia’s deep-seated political violence, and that also applies to social media’s role in it. But efforts to reduce the harm caused by online content combined with broader efforts to bolster the traditional media may create an improved online environment that does more to curb inflammatory discourse.

Crisis Group previously was a partner of Meta, and in that capacity was in contact with the company regarding misinformation on Facebook that could provoke deadly violence. Crisis Group reached out to the company to comment on this piece. It responded that it has strict rules outlining what is and is not allowed on Facebook and Instagram.

It went on to say that hate speech and incitement to violence are against these rules and that it invests heavily in teams and technology to help find and remove this content. In Ethiopia, specifically, it said it receives feedback from local civil society organisations and international institutions and employs staff with local expertise who focus on the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya.
