Over the past few years, Facebook has faced considerable backlash. The tech giant, run by Mark Zuckerberg, has garnered notoriety for its questionable corporate practices, including the Cambridge Analytica data scandal, the Internet Research Agency's (IRA's) use of the platform to interfere in U.S. politics, and, most recently, its permissiveness toward false information. Given the platform's size and widespread influence, governments around the world, including the United States, have attempted to regulate the kind of information allowed on it in order to curb misinformation and violent outbreaks. This article explores Facebook's laissez-faire attitude toward information, the possible ramifications of that attitude, and ways the company can promote the truth on its platform moving forward.
A recent congressional subcommittee hearing on antitrust was held to explore the vast collective market shares of the Big Tech giants. Among those in attendance was tech mogul Mark Zuckerberg, who found himself in the hot seat once again, this time over past business acquisitions that committee leaders labeled monopolistic power grabs. When questioned about Facebook's purchase of Instagram, NPR reports, Rep. Jerry Nadler quoted an email in which Zuckerberg discussed "the need to 'neutralize a competitor.'"
Given the company's track record, this is not surprising news. Zuckerberg has been at the forefront of numerous controversies, and among other recent problems, the company has come under scrutiny for the unregulated spread of information on its platform. Though Facebook is not directly responsible for the dissemination of false information, it has done little to stop it. In fact, until recently, Zuckerberg took a strong stance against removing political ads containing disinformation, stating, "At the end of the day, I just think that in a democracy, people should be able to see for themselves what politicians are saying." This is especially alarming given the proximity of the 2020 presidential election.
More and more people use Facebook to get political information; the Pew Research Center reports that more than four in ten U.S. adults get news from the platform. This makes the credibility of sources on the site all the more consequential. While users should evaluate the integrity of political sources themselves, it is especially important that Facebook limit the incentives for spreading false information. Effective debate and political participation in a democracy hinge on several necessary conditions, including verifiably true information to fuel productive discourse. In her article "On Democratic Backsliding," Nancy Bermeo explores the ways democracies degrade, a process known as democratic backsliding. One of the three contemporary forms she identifies is the strategic manipulation of elections, particularly by "hampering media access." While her article focuses on direct election manipulation by incumbent parties, the spread of false and polarizing content by groups such as the IRA mirrors that strategy: these groups set out to undermine American democracy by tampering with the information that shapes the electorate's political decisions. In effect, Facebook has become indirectly responsible for democratic backsliding in America.
To be fair, Facebook has made strides in its efforts to curtail the spread of false information. The company banned two Russian disinformation networks, as well as a troll farm found tampering with the 2016 U.S. election. It also said it would bar new political ads from the site in the week before the 2020 election. Likewise, Facebook flagged multiple coronavirus posts as false after the campaign group Avaaz found that over 40 percent of coronavirus-related misinformation appeared on Facebook. However, Facebook's lax approach to fact-checking remains problematic.
A common thread linking these developments is Facebook's lack of initiative. This raises the question of how much the company cares about protecting the end user, and how much these actions are a ploy to save face. Right now, its main focus appears to lie squarely on self-preservation. Future accountability therefore cannot be guaranteed, and users should take extra precautions to verify the information they see on the site. Zuckerberg's reluctance to prohibit falsehoods should be evidence enough that consumers must remain vigilant.
In the meantime, there are a few ways to improve this predicament. First, Facebook can set higher review standards for political advertisements and evaluate each one before it appears on the site. Second, given its gargantuan size and monopolistic tendencies, Facebook could be broken up, increasing competition by allowing new companies to enter the social media market; the resulting diversity of options could indirectly spark innovation in fact-checking. Third, the company could provide users with credibility scores for websites and sources, based on current and reliable metrics. Finally, Facebook should increase its transparency with consumers. This would help the company repair its tarnished relationship with its user base and make it easier for the general public to hold it accountable for its actions.
Bermeo, N. (2016). On Democratic Backsliding. Journal of Democracy, 27(1), 5–19. https://doi.org/10.1353/jod.2016.0012