In 2004, in a small dorm room at Harvard University, Facebook was born. A platform initially designed for American college students to network and meet each other has grown to host nearly 3 billion global monthly users in a little over 16 years. With such exponential growth, unforeseen issues are inevitable. It can be hard to attribute a lack of foresight to anything more than naïveté, but nonetheless, hate speech has been one of the most dogged issues Facebook has tried to address. Hate speech is a serious problem in the United States, but it poses an even greater threat globally, where Facebook has proven unable to moderate its platform. In Myanmar as of August 2018, Facebook did not have a “single employee in the country of some 50 million people.” Instead, its content moderation is outsourced to a separate firm through a covert operation called ‘Project Honey Badger.’ Project Honey Badger is responsible for content moderation across many Asian countries, and in Myanmar, it reported just 60 people (only a handful of whom actually spoke Burmese) reviewing hate speech reports for more than 18 million active users. As a consequence, hate speech wreaks havoc in Myanmar and in many countries around the world. This blog post will argue that, in countries with deep ethnic divisions and weak political institutions, Facebook drives ethnic violence by providing a nearly unmoderated platform for the spread of hate speech, and that this violence directly contributes to democratic backsliding. It will review three case studies, Sri Lanka, Myanmar, and Ethiopia, to contextualize how Facebook drives ethnic division and violence, and then employ the work of several scholars to connect ethnic violence with democratic backsliding.
In Sri Lanka in 2018, a false rumor, originating on Facebook, claimed that the Muslim minority was planning to distribute sterilization pills to wipe out the Sinhalese majority. In Ampara, a customer at a Muslim-owned pharmacy began yelling at and harassing the shop owner about something in his food, to which the pharmacist replied in broken Sinhalese: “I don’t know, yes, we put?” This interaction was video recorded and posted to Facebook as supposed proof that Muslims were indeed distributing sterilization pills to wipe out the Sinhalese. The claim was, of course, false, but the shop owner was subsequently beaten and killed, his shop destroyed, and the local mosque burned. Critically, “Facebook’s newsfeed played a central role in nearly every step from rumor to killing.”
In Myanmar from the mid-2010s through 2018, Facebook was a key instrument in an ethnic cleansing and genocide perpetrated against the Rohingya Muslim minority group. Importantly, this campaign was largely driven by military officials, who set up seemingly innocuous pages devoted to “Burmese pop stars, models, and other celebrities” and eventually began spreading toxic disinformation through them. The main goal of the posts was to “generate widespread feelings of vulnerability and fear that could be salved only by the military’s protection.” Facebook’s own commission found that the platform was instrumental in “foment[ing] division and incit[ing] offline violence.” This carefully sculpted corporate admission really means that Facebook’s content moderation policies, or lack thereof, fueled a violent ethnic cleansing on a massive scale.
In Ethiopia, a popular singer, Hachalu Hundessa, was assassinated in June of 2020 after a disinformation campaign on Facebook alleged that Hundessa had “abandoned his Oromo roots in siding with Prime Minister Abiy Ahmed.” The assassination sparked days of violence that left hundreds dead, with ethnic minorities suffering the most damage. As in the previous cases, “the bloodshed was supercharged by the almost-instant and widespread sharing of hate speech and incitement to violence by Facebook…Mobs destroyed and burned property. They lynched, beheaded, and dismembered their victims.” As in Myanmar, the Ethiopian government was also involved in the dissemination of hate speech.
David Waldner and Ellen Lust engage substantively with the driving forces behind democratic backsliding in their article “Unwelcome Change: Coming to Terms with Backsliding.” In it, they outline six key theories to explain backsliding, one of which is the theory of social structure and political coalitions. This theory treats the actual divides between ethnic groups as a source of democratic instability. The basic idea is that in pluralistic societies, ethnic group loyalty can trump national loyalty, which leads politicians to appeal directly to members of their own ethnic group in a process known as outbidding. This process leads directly to “increased ethnic chauvinism, ethnic polarization, the breakdown of democratic institutions, and possibly interethnic political violence.”
The backsliding that political coalitions drive is only exacerbated by the presence of a platform that pushes deeper wedges between groups. Facebook is particularly ill-equipped to address these problems in many of the countries that suffer from the dynamics the political coalition theory describes. It is a mutually reinforcing cycle: the governments are often unable to engage with Facebook to prevent the spread of hate speech, but even if they were, Facebook is incapable of keeping up with the content produced in languages unfamiliar to its employees. The population suffers as a result, as Waldner and Lust’s thesis highlights. Ethnic wedges are driven deeper by tensions on Facebook, and, as a result, both ordinary people and those in power with a political incentive to stoke tension can sow real-world damage and drive democratic erosion.
When Facebook grows in countries that have relatively weak institutions and in-fighting among ethnic groups, the problem of democratic backsliding is compounded. Daron Acemoglu and James Robinson, in their book Economic Origins of Dictatorship and Democracy, contend that political institutions are absolutely critical for democracy: not only do they protect current democratic outcomes, but they can be used by the polity to ensure the allocation of future power. The implications of this argument for a nation stricken with ethnic violence fueled by Facebook are dire. Central to Acemoglu and Robinson’s thesis is the ability of the polity to organize and shape the design of democratic institutions to serve its benefit. So not only is the polity divided along ethnic lines, but its organizational capacity is severely limited. When consumed by violence, it is impossible for a country to come together and redefine institutions that, in many cases, are already failing to serve their purpose, and democratic erosion follows.
Determining a viable solution to this problem is nearly impossible. The astronomical growth of Facebook and the initial naïveté of its team are partially to blame. A platform barely over 16 years old, initially developed for college students to network, now drives genocide around the world, and it has denied culpability at almost every step of the way. Expecting the platform to evolve and step up to the challenge has proven futile. In the United States and European countries, citizens have turned to their governments for action, also to little avail. In countries where the government is itself invested in the ethnic violence, there is little hope of state intervention. So, barring the creation of some third-party regulatory entity (highly unlikely), the problem will likely go unsolved unless sufficient pressure is placed on either governments or Facebook to step up. Until then, Facebook will continue to contribute to the erosion of democracy around the world, especially in the most vulnerable countries.