Despite their pivotal function as distributors of news, politics, and commentary to millions of users, technology companies have recently faced increased scrutiny and criticism for operating platforms that foster misinformation and conspiracy theories. Notably, the CEOs of Facebook, Twitter, and Google recently testified before the Senate Judiciary Committee and were met with calls for enhanced government regulation from members of both the Republican and Democratic Parties, including a repeal of Section 230 of the 1996 Communications Decency Act. The law shields these technology giants from liability for content that users upload, allowing them to refrain from substantively moderating false and misleading content on their platforms.
Demands for regulation have never been stronger than in the wake of the 2020 presidential election, which featured claims of widespread election fraud across several technology platforms. In fact, a recent Pew Research Center study found that 47% of Americans say major technology companies should be regulated by the government more than they are now. Rather than removing posts alleging widespread election fraud during the 2020 election, Twitter and Facebook merely added warning labels and advisories to many misleading posts. Platforms that allow these falsehoods to flourish weaken democracy by fostering distrust in both U.S. democratic institutions and mainstream news media. I contend that misinformation oversight, achieved through increased government regulation of the technology industry, can strengthen democracy by safeguarding democratic norms and institutions.
First, it is important to understand how misinformation weakens democracy by delegitimizing democratic norms and institutions. The media, including technology companies, play a critical role in maintaining a politically informed citizenry. This matters because an informed citizenry is fundamental to several longstanding conceptions of a healthy democracy. As Dahl argues, a key characteristic of democracy is the continuing responsiveness of the government to the preferences of citizens. This responsiveness allows citizens to signify their preferences to their fellow citizens and government through individual and collective action. When citizens lack consistent access to factual information, they may instead advocate for policies that run against their self-interests and spread misinformation about institutions or longstanding norms. Notably, a research study conducted by Gunther et al. found that intensely partisan misinformation during the 2016 election cycle may have given then-candidate Trump decisive advantages in battleground states. Specifically, the researchers found a statistical association between belief in fake news stories and vote choice, and that fabricated news stories led some former Obama voters to abandon the Democratic candidate. In this way, technology giants have played a specific role in eroding democracy by providing a platform for baseless claims and conspiracy theories to develop and gain traction among impressionable voting groups.
On the other hand, increased oversight and regulatory policies can be leveraged by the government to hold technology companies accountable and safeguard democracy. Through a repeal of Section 230, the government could require technology giants to establish comprehensive internal misinformation oversight committees that continuously filter and remove misleading content from their platforms. If this type of internal oversight prevents millions of impressionable users from accessing inaccurate and falsified information, those users may be more likely to view and engage with credible mainstream media sources, and public faith in the technology companies themselves may be restored. This information shift may in turn produce a more informed citizenry that values democratic norms and institutions, and one that holds its elected leaders accountable for undemocratic practices.
However, it is conceivable that such government regulation may have the unintended adverse effect of increasing the monopolizing power of large technology companies. Because internal oversight requires significant funding and resources to implement, a blanket oversight mandate across all technology companies may drive out of the market smaller firms that cannot afford to dedicate workforces to content moderation. Instead, the federal government would be best served by imposing strict regulatory standards on large technology companies while subjecting smaller companies to less scrutiny and regulation. In essence, establishing a size threshold for regulation may remedy the problems that would arise from a universal regulatory mandate.
Overall, despite differing regulatory interests between the Democratic and Republican Party bases, it is evident that technology companies have a unique opportunity to safeguard democracy through content moderation. Unless actionable steps are taken to combat sources of misinformation, the very trust and integrity of democratic institutions are at stake. Recently, a poll conducted by US News found that fewer than 50% of Republicans believe that President-elect Biden won the election legitimately, and that 68% of Republicans believe the election was rigged in his favor. This misinformation, spread through technology companies' social media platforms, is deeply troubling, and it will take a multipronged effort to begin to combat it.
 Robert Dahl, Polyarchy: Participation and Opposition, 1971, chapter 1.
 Gunther et al., "Fake News Did Have a Significant Impact on the Vote in the 2016 Election," working paper, Ohio State University, n.d.