The Informational Crisis & Political Polarization in American Society

By: Sara Ibrahim

Edited by: Allison Rhee and Hannah Cheves

The spread of misinformation, amplified by powerful platforms such as Facebook, Twitter, and Google, is one of the most important legal issues affecting political polarization and discourse today. Under the legal concept of a systemic duty of care, content moderation should be obligatory in order to mitigate the detrimental effects of misinformation. Additionally, to hold platforms accountable for effective content moderation, Congress should consider serious revisions to Section 230 of the Communications Decency Act.

Americans are currently suffering from an informational crisis in which misinformation[1] spreads at an alarming rate, profoundly polarizing those of different political affiliations. Misinformation has been found to spread faster than true information: according to an MIT Sloan School of Management research project, “falsehoods are 70% more likely to be retweeted on Twitter than the truth, and reach their first 1,500 people six times faster.”[2] Additionally, according to a Pew Research study, false information “may actually be accelerating the process of polarization” by “driving consumers to drop some outlets, to simply consume less information overall, and even to cut out social relationships.”[3] Political polarization results in the demonization of opposing political parties, which greatly exacerbates legislative inaction. Compromise and political discourse are increasingly difficult when each side does not respect the other enough to collaborate.

The effects of polarization extend far beyond politics. Polarization segregates people, leading them to cling to their own social and political groups and refuse to engage with others, further dividing American citizens. Because misinformation spreads faster than true information, it has become an epidemic affecting many facets of society.

Search engines contribute to the polarization of American society by manipulating how their users see facts. Search engines are first and foremost businesses, and consumer attention is the product. In the documentary The Social Dilemma, whistleblowers from Google and Facebook explain how search engines limit our worldviews. They assert that many factors, including where you live and your personal search history, affect your search results. For example, if you search “climate change is” on Google, the suggested completions may range from “a hoax” to “disrupting the planet,” depending on what Google knows about your search history, preferences, and location. Nor are these suggestions always scientifically accurate: the algorithm is programmed not to surface true and accurate information but to keep consumers’ attention. This spreads misinformation and polarizes thought, because each person now relies on a different set of facts. When we each have different facts, meaningful discourse surrounding controversial issues becomes difficult and compromise unlikely.

Through content moderation rules and regulations consistent with a systemic duty of care, platforms’ systems should be modified to prevent online harm.[4][5] This is nuanced because there is no clear definition of what qualifies as “harmful.” Taking down posts that are illegal is one thing; “harmful,” however, is an arbitrary and subjective standard. Many argue that removing “harmful” content breaches freedom of expression and that content moderators should not have such authority. This argument ignores, however, that free speech is not absolute: it has limitations, such as libel and slander, that are widely considered “harmful.” Platforms should therefore have the authority to moderate their content, and the law should foster harm-reductive content moderation.

Section 230 of the Communications Decency Act is one of the most important existing laws governing platform regulation and is crucial to any discussion of online misinformation. It grants platforms immunity from liability for third-party posts while simultaneously allowing them to moderate posted content. Section 230 has been interpreted broadly and gives platforms no incentive to take down harmful content; it should be revised to apply only to platforms that enact effective content moderation.[6] Generally, conservatives want to repeal Section 230, arguing that it has been used to silence conservative voices. Liberals argue the opposite: that Section 230 allows platforms to ignore slanderous and harmful content, enabling hateful speech. However, Section 230 mentions neither neutrality nor impartiality. It merely allows companies to set their own content-moderation rules, which is harmful because, as noted earlier, companies are in the business of keeping consumers’ attention, not protecting the truth.

Ultimately, the spread of misinformation results in political polarization and the perpetuation of echo chambers, affecting all aspects of American life and political discourse. Without agreement on what constitutes fact, it becomes very difficult to solve other issues or to hold meaningful conversations, which in turn forecloses meaningful political action. To keep discourse and the hope of unity alive, Section 230 must be modified, as it has the power to hold platforms accountable and help curb the spread of misinformation. This legal issue is pivotal, underlying every other issue: if each person has their own facts, divisions are created that threaten societal stability and peace. Content moderation should therefore be enacted under the legal frameworks of a systemic duty of care and a revised Section 230 in order to mitigate political polarization and keep meaningful discourse alive.

 

NOTES:

  1. False information that is disseminated regardless of intent.

  2. Sara Brown, “MIT Sloan Research about Social Media, Misinformation, and Elections,” MIT Sloan School of Management. October 5, 2020, https://mitsloan.mit.edu/ideas-made-to-matter/mit-sloan-research-about-social-media-misinformation-and-elections.

  3. David A. Graham, “Some Real News About Fake News,” Atlantic (Atlantic Media Company, June 12, 2019), https://www.theatlantic.com/ideas/archive/2019/06/fake-news-republicans-democrats/591211/.

  4. A systemic duty of care is a legal standard holding that “platforms are dependent on their users’ social connections and, thus, are obliged to reduce online harms to those users.”

  5. Alex Engler, “How Biden Can Take the High Road on Misinformation,” Lawfare, 2020, https://www.lawfareblog.com/how-biden-can-take-high-road-misinformation.

  6. Olivier Sylvain, “Section 230's Challenge to Civil Rights and Civil Liberties,” Knight First Amendment Institute at Columbia University, April 6, 2018, https://knightcolumbia.org/content/section-230s-challenge-civil-rights-and-civil-liberties.

BIBLIOGRAPHY:

Brown, Sara. “MIT Sloan Research about Social Media, Misinformation, and Elections.” MIT Sloan School of Management, October 5, 2020. https://mitsloan.mit.edu/ideas-made-to-matter/mit-sloan-research-about-social-media-misinformation-and-elections.

Engler, Alex. “How Biden Can Take the High Road on Misinformation.” Lawfare, 2020. https://www.lawfareblog.com/how-biden-can-take-high-road-misinformation.

Graham, David A. “Some Real News About Fake News.” The Atlantic, June 12, 2019. https://www.theatlantic.com/ideas/archive/2019/06/fake-news-republicans-democrats/591211/.

Sylvain, Olivier. “Section 230's Challenge to Civil Rights and Civil Liberties.” Knight First Amendment Institute at Columbia University, April 6, 2018. https://knightcolumbia.org/content/section-230s-challenge-civil-rights-and-civil-liberties.

Author's Social Media Usernames: Twitter (@ibrahimrsara) and Instagram (@ibrahimm.s)