Abigail Sanchez
Opinions Editor

On Oct. 1, the Senate Commerce Committee voted to subpoena the heads of Google, Facebook, and Twitter to testify about concerns regarding Section 230 of the Communications Decency Act. According to CNBC, Section 230 “allows online platforms to be protected from liability for their users’ posts and their moderation practices” if they take down the posts once notified. This would not be the first time any of these companies have been called to testify before Congress, but the subpoena revives an important question that has surrounded social media platforms for years: just how effective are they at preventing the spread of misinformation?

With the pandemic and the election both underway, it is important for the public to have factual information about each topic, and the spread of misinformation can be catastrophic: false information about COVID-19 could lead to more cases and deaths, and false information about the election could amount to voter suppression. This makes it especially important for social media platforms such as Facebook and Twitter to prevent the spread of false information on these two highly important topics, among others. Facebook has already tried to put policies and regulations in place to stop misinformation about COVID-19.

In regard to the election, Facebook also plans to ban any new political ads starting Oct. 27, the week before the election, and all political ads on Nov. 3 after all polling places close. However, according to Facebook’s policies, politicians are exempt from its rules on speech, and any posts or advertisements made by politicians will not go through a fact-checking system. In other words, the very people we elect to represent us in creating policies and laws can easily spread lies on the popular online platform about their own or their opponents’ political agendas. Facebook defended its position in 2019 by stating: “We don’t believe that it’s an appropriate role for us to referee political debates. Nor do we think it would be appropriate to prevent a politician’s speech from reaching its audience and being subject to public debate and scrutiny.” If a politician is spreading outright lies or making claims without evidence, then what exactly is there to debate? Either they are telling the truth, and nothing but the truth, or they’re not.

Last October, Mark Zuckerberg appeared before the U.S. House of Representatives Financial Services Committee, where he was grilled by New York Rep. Alexandria Ocasio-Cortez regarding Facebook’s politician exemption policy. When pressed about whether he would be willing to take down political ads that contain flat-out lies, Zuckerberg echoed his company’s statement: “In most cases, in a democracy, I believe that people should be able to see for themselves what politicians that they may or may not vote for are saying and judge their character for themselves.” If Zuckerberg does not want to take down lies made by politicians, then the least he could do is flag the post or ad, or add a disclaimer stating that the information it contains is not completely truthful.

Misinformation can be easily spread through social media, and now politicians have the power to stoke that fire to even greater lengths. In October 2019, President Trump ran an ad that accused, without evidence, former Vice President Joe Biden of paying “$1 billion to Ukraine to bail out his son Hunter Biden from potential prosecution,” according to The Hill, yet Facebook refused to remove the ad. In response, and as an experiment, Sen. Elizabeth Warren ran an ad falsely claiming that Mark Zuckerberg had endorsed President Trump for the 2020 election, in order to highlight Facebook’s lack of action in preventing misinformation from spreading on its platform. The thing is, these types of ads reflect on the person they are about, not the person behind them, which makes it easy for people to believe such false claims about a person or a topic.

If Facebook wants to keep these ads in place so the public can ‘debate’ them and see for ourselves the character of the people we are electing, then fine, but it should follow in Twitter’s footsteps and add a label to them. On June 27, 2019, Twitter announced that it would begin “labeling tweets from influential government officials who break its rules against bullying and abusive behavior,” according to CNBC. While the tweets won’t be taken down, steps will be taken to ensure that such posts are harder to find in search and less likely to appear in one’s feed. TikTok, another social media platform, has taken it a step further by stating that it will not host any political advertisements on its platform. It is important for us as citizens to hold our politicians accountable. We elect them to represent our interests and create policies that aim to benefit the people and the country. Allowing them to get away with spreading false and misleading claims to the public should not be permissible under any circumstances.

When asked whether she believes social media platforms are effective in preventing the spread of misinformation, second-year Nancee Rodriguez stated, “I don’t think so because, on Twitter, misinformation can be retweeted very quickly. Both [Facebook and Twitter] . . . are going to have people that don’t know what they are talking about, and it [might] blow up.” This is seen in the case of the QAnon conspiracy theory, whose believers have grown steadily on social media since it began three years ago. It is a conspiracy theory about a “secret cabal of liberal, Satan-worshipping elites who are running a child sex trafficking ring that President Trump will [supposedly] soon uncover,” according to Recode. It has inspired some people to commit or attempt violent crimes; in one instance, a woman was arrested after posting threatening messages online saying she would “take out” former Vice President Biden. Before her arrest, she drove to New York from Illinois armed with about a dozen knives. A growing number of Republican politicians have shown support for QAnon, and President Trump himself has signaled support for this radical conspiracy theory, stating, “I don’t know much about the movement other than I understand they like me very much, which I appreciate.”

Unfortunately, social media platforms have been slow to address this problem. Just recently, Facebook decided to ban any pages, groups, and Instagram accounts representing QAnon, one year after the FBI identified QAnon as a domestic terrorist threat, according to CNN Business. But at this point, there are already so many followers that it will be hard to completely remove QAnon from the platform and stop it from spreading further false claims and accusations. The main issue is that Facebook is trying to crack down on QAnon but won’t do anything about the politicians supporting the conspiracy theory. Another issue, one that made Facebook and Twitter hesitant to stop QAnon from the start, is respecting the free speech of their users. While I agree that free speech is important and essential to our democracy, when it is used to spread hate and misinformation and to incite violence, social media platforms have a duty to prevent such rhetoric from spreading and threatening public safety. There is a difference between “I believe that anyone who engages in looting and rioting behavior should be arrested” and “when the looting starts, the shooting starts.”

Republicans, especially, are concerned about the suppression of conservative views on social media platforms. Perhaps if some of them were not so blatant in their support of extremist views such as QAnon, it would not be such a problem. While it is understandable that social media platforms cannot regulate every post and stop misinformation from being reposted or retweeted, they should make more of an effort to stop false and misleading information from spreading on their platforms, especially when the misinformation comes from politicians who are meant to represent our best interests.

Featured Photo: Emerson Little / Quaker Campus

Abigail Sanchez has been writing for the Quaker Campus since fall 2019 and is currently the Opinions Editor of the Quaker Campus. She is also a freelance writer and has written for two feminist media platforms. She enjoys writing about political and social issues that affect the country and her community. In her spare time, Abigail likes to listen to music, read books, and write fictional stories.