Facebook says that it provides an essential ingredient for healthy democracies: an open, unbiased forum for public deliberation, the proverbial town square. It promises to strengthen democracy by serving as a neutral court for political debate, never to “pick up a racket and start playing.”
However, it has become abundantly clear that Facebook has a symbiotic relationship with populism. Populists—defined by Jan-Werner Müller as leaders who make the moral, emotional claim of directly representing the “people” in opposition to the elite—use social media to speak directly to millions of their supporters. They also benefit from heightened emotion in politics, a phenomenon fueled both by algorithms that push users toward ever more extreme content to keep them hooked and by the echo chambers that form as users follow only those who share their political views. Social media platforms, in turn, benefit from the political activation that populism and polarization generate: the heightened stakes and emotion of populist, polarized politics can translate into more online traffic and more advertising revenue. Platforms may also see dividends if a populist government, recognizing this symbiotic relationship, pushes friendlier, laxer regulation of the internet. Thus populists benefit from social media, and social media benefits from populists. Meanwhile, democracy suffers.
This article will turn away from the United States to examine the role of Facebook in another large, diverse democracy struggling with ethnic tensions and populism: India. Over the past year, we have gained a clearer picture of the mutually beneficial relationship between Facebook and the ruling Hindu-nationalist Bharatiya Janata Party (BJP), as well as evidence of its deleterious effects on Indian democracy. Facebook’s actions in India speak to a broader pattern of abuse and poor oversight of the platform in countries in the Global South.
With Facebook shut out of China, India has become the company’s most important market: India is home to the largest number of Facebook users in the world, and with some 600 million Indians only beginning to come online, Facebook still has ample room to grow. Betting further on the Indian internet boom, Facebook made its largest-ever foreign investment in Jio Platforms, the Mumbai-based digital arm of Reliance Industries, in April 2020.
As Facebook is heavily invested in India, so is the ruling BJP heavily reliant on Facebook. Prime Minister Narendra Modi’s page has 45 million likes—by far the most of any world leader—and the BJP’s page has nearly three times more followers than that of its largest opposition party. Facebook gives Modi and the BJP a channel for direct communication with supporters—something populist leaders need in order to form a direct, emotional connection with the “people.” With its direct style and mass audience, Facebook provides Modi a platform comparable to Donald Trump’s Twitter account or Hugo Chávez’s weekly one-to-eight-hour television show, “Aló Presidente.”
Facebook has repeatedly shown bias toward the BJP when policing hate speech and misinformation. In the weeks leading up to the 2019 elections, the platform took down misleading pages funded by the Pakistani military and by the BJP’s rival Congress Party, yet it took no action when the BJP violated transparency rules by spending hundreds of thousands of dollars on ads submitted through secondary organizations the BJP had created solely for the purpose of posting them. BJP politician Raja Singh had hundreds of thousands of followers on Instagram and Facebook when he posted deeply hateful, dangerous rhetoric calling for Rohingya Muslims to be shot and mosques to be razed. Under its hate-speech policy, Facebook was set to remove him from all of its platforms until Ankhi Das, its head of public policy for India, allegedly intervened on his behalf. Only after public and international outcry was Singh permanently banned from Facebook, and Das—who had once posted that India’s Muslims were a “degenerate community” and boasted that “we lit a fire to [Modi’s] social media campaign and the rest is of course history”—quit in October 2020.
An unregulated or unevenly regulated Facebook enables and amplifies the “weaponized rhetoric” of dangerous demagogues. In terms consistent with Müller’s definition of populism, Jennifer Mercieca defines a dangerous demagogue as “a political agitator who appeals to the passions and prejudices of the mob in order to obtain power or further his own interests.” This weaponized rhetoric can erode democracy: as Levitsky and Ziblatt detail, dangerous demagogues use it to reject the democratic rules of the game, deny the legitimacy of political opponents, violate the civil liberties of opponents and the media, and encourage violence.
The BJP’s weaponized rhetoric on Facebook can be linked to each of the four processes identified by Levitsky and Ziblatt; the most alarming case, however, is the role of social media in inciting targeted anti-Muslim violence during the February 2020 Delhi riots. Facebook has been summoned to testify before a New Delhi government commission about the platform’s role in this outbreak of communal violence, which killed 24 people. Some of these killings were organized over Facebook’s messaging service, WhatsApp. Furthermore, hours before the rioting began, BJP politician Kapil Mishra uploaded a video to Facebook warning that if protestors did not leave the streets immediately, his supporters would clear them off by force: a clear-cut case of weaponized rhetoric used to encourage violence. Facebook took the post down only in June, after public comment. Disturbingly, and in a further demonstration of the symbiotic relationship between social media usage and ethno-nationalist populism, within two months of the video being posted, monthly visits to Mishra’s page grew from a few hundred thousand to over 2.5 million.
Moving forward, India—and the United States—will need to introduce regulation to close “the gap between responsibility and liability” for content posted on social media platforms. This has proven possible and effective elsewhere: in Germany, Facebook has successfully complied with stricter, more comprehensive hate-speech laws than exist anywhere else; in Singapore, it adds a “correction notice” to stories flagged as false by a government agency. What is missing is the political will to introduce and enforce hate-speech regulation that would hold Facebook liable for what circulates on its platforms. Facebook’s role in inciting violence against Rohingya Muslims in Myanmar has shown, tragically and horrifically, what can happen when the company is lenient toward, or “unaware” of, the content on its site—as it may realistically be in a non-Western country without strong regulation or oversight.
To counter the dangerous, symbiotic relationship between Facebook and the BJP’s ethno-nationalist tendencies, the Indian parliament must do what is best for democracy and the stability of the country and enact legislation holding social media platforms liable for misinformation and hate speech. Too much damage has already been done.
Müller, Jan-Werner. 2016. What Is Populism? Philadelphia: University of Pennsylvania Press.
Mercieca, Jennifer R. 2019. “Dangerous Demagogues and Weaponized Communication.” Rhetoric Society Quarterly 49(3): 264–279.
Levitsky, Steven, and Daniel Ziblatt. 2018. How Democracies Die. New York: Crown. Chapter 1.
Interesting blog post, Caroline! It is fascinating (and alarming!) to find out how closely connected real-life political events are to the information circulating on social media platforms. It is also surprising to see how much influence these platforms have over the occurrence of certain events, and I agree something should be done about it. Nonetheless, regulating or prohibiting speech is a highly controversial subject, especially in the US, as you probably know, and regulation may not be fully accurate in targeting real hate speech and misinformation. So if we were to regulate misinformation, how would you ensure that the third parties in charge of regulation aren’t politically biased and aren’t engaging in censorship of free speech? You would most certainly need transparent, clear rules on what constitutes misinformation, and a fair application of them. What those rules are, though, and how we determine them remains the entire question…
Also, don’t you think this would exacerbate the existing socio-political divisions and the partisan polarisation we see in the US today? During the pandemic, we’ve seen how many Americans resisted wearing masks and took it as a violation of their freedom and autonomy. So, any ideas on how to deal with all the social unrest such regulation would cause?
This is a really interesting piece. I wonder if this applies to social media within all countries in which populist leaders exist or if this is more specific to India. I am particularly wondering about your claim that “Social media companies may also see dividends if a populist government—realizing their symbiotic relationship with social media—pushes more friendly, lax regulation of the internet” and whether this is true in the United States. Are there any clear-cut cases of social media and the ruling government working together to further a populist movement? Later in this article you mention the example of Trump using Twitter to interact with and mobilize his base, but in this scenario most people see the relationship between the President and Twitter as contentious, with various attempts by Twitter to put disclaimers on what the President is saying on its platform. There are even questions about whether Twitter will be stricter regarding Trump’s tweets once he leaves the office of President and loses his protections as a public leader. While it might be easy to use social media to mobilize one’s base, I feel that in the US the relationship between the company and the populist is not one of direct, or at least not transparent, collaboration. In my opinion, social media serves more as a campaign tool that can be abused (with echo-chamber algorithms and the sharing of fake articles) than as an agent rewarded for its help in elections. An example of a somewhat more symbiotic relationship that comes to mind is that between a populist leader and media groups: when a populist president is elected, media groups gain viewers from the base of whichever leader endorses them. I do agree that in the future, social media companies could verge on a more symbiotic relationship with populist leaders in the US, in a way similar to traditional media, and it is imperative that regulations be made to stop this possibility.