Number of years in The Coat of Arms: 2
Favorite aspect of journalism: it gives my illustrations a good platform to relay a greater message.
Interests outside of school: playing volleyball, drawing and watching cartoons.
Class of 2023
January 5, 2022
In this new Coat of Arms column, News Editor Alex Levitt and Opinions Editor Penelope Stinson will discuss a controversial issue relevant to the Menlo community or the world as a whole. In this edition, the discussion centers around censorship by social media companies, and the extent to which free speech should be allowed on these platforms.
To what extent should social media companies (e.g. Twitter, Instagram, Facebook) censor posts and possible misinformation and/or hate?
Alex:
When posts actively encourage violence, social media platforms can and should take them down. However, social media companies have taken on an extremely broad stance on the definition of “misinformation,” leading to a lot of reasonable voices being shut out from the public. For example, Facebook considered the theory that COVID-19 escaped from a lab “misinformation” for months, until President Biden acknowledged that it was a possibility. Often, it seems that social media companies jump to the conclusion that non-mainstream theories are automatically incorrect, which leads to unnecessary censorship.
Penelope:
I think social media companies such as Twitter, Instagram and Facebook should be censoring posts with possible misinformation. When I say censor, I mean content warnings, such as “this post might include misinformation.” I think direct censoring, such as actually taking down the post, needs to be saved for things such as defamatory hate speech or speech that promotes or glorifies violence. However, although the definition of misinformation might be broad, it is the responsibility of social media companies to ensure that their users aren’t just consuming misinformation daily with no warning.
Alex rebuttal:
I agree that providing content warning labels is a better step than fully removing posts. However, these warnings are disproportionately applied to views that Facebook deems “extremist,” according to Facebook spokesperson Andy Stone, even when there’s nothing factually incorrect about them. Posts or accounts deemed extremist often fall to the right of center, although their beliefs may be less fringe than you might expect. For example, an April 2021 Instagram post by the conservative account @freedomfights was marked as having “missing context,” when the post was simply a quote by Black Senator Tim Scott that “America is not a racist country.” Whether or not one agrees with this idea isn’t important; what is wrong is that debatable opinions from one side are being unnecessarily censored, providing a biased view of the world to young consumers.
Penelope rebuttal:
While it’s important that posts coming from different views and different topics aren’t censored or labeled as misinformation, it’s better for those posts to be mistakenly labeled than for posts that actually are misinformation to go unlabeled. Posts that concern organizations such as QAnon, which grew and developed on Facebook, need to be shut down immediately. Shutting that misinformation down is more important than worrying that posts from one or two senators or Congress members might accidentally get labeled as misinformation here and there. Those issues can be worked out and readjusted as Facebook and other companies continue to fight misinformation and grow and develop their misinformation teams. However, in the beginning, we must allow for those kinks in the system to ensure that the actual misinformation is censored.
Is there a point at which free speech becomes dangerous?
Alex:
I don’t believe that there’s a point at which social media allowing people free speech becomes dangerous. After all, it’s free speech that allows for democracy, and conversely, it’s censorship that allows for authoritarian rulers or political parties to gain an unhealthy amount of power. This idea can be seen throughout history in the Chinese Cultural Revolution, Fascist Italy and countless other disasters. Of course, the example we’re dealing with is nowhere near as extreme, but allowing people to speak their minds publicly is an American principle outlined in the First Amendment under the freedom of speech and freedom of assembly. In this day and age of online communication, social media companies should try to uphold these values and not block posts, even if they believe the facts are misleading. Of course, it’s a completely different story if posts directly encourage violence towards any individual or group.
Penelope:
I think there is a point at which free speech becomes dangerous, which is when people promote misinformation and spread blatant lies. I agree that, throughout history, censorship has led to countries denying their citizens full rights. But when our forefathers wrote the American Constitution and the Bill of Rights, they were not writing them knowing that Facebook would have a billion users who could spread posts and share whatever they wanted whenever they wanted. We need to focus on our country in its modern era. So when we see the Facebook whistleblower case, where evidence shows post after post negatively impacting teenage girls, or the reports following the 2016 presidential election of misinformation on Facebook outperforming actual news stories, we need to do something about it. When social media companies have that kind of power, there comes a point where we need to regulate them.
Alex rebuttal:
I completely agree that misinformation is not a good thing, and in no way do I think people should spread false information. But whether any of us like it or not, it’s a fundamental right to be able to share whatever opinions you may have, even if they come across as absurd. Many ideas that we agree with today — men and women are of equal intelligence, gay couples deserve equal rights — were historically cast off as insane, and had they been prevented from reaching the public, change would never have happened organically. Today, allowing one faction of society to control almost all information is dangerous to our growth as a nation, even if none of the ideas being censored seem like positive changes to most people.
Penelope rebuttal:
While it’s true that misinformation is protected under the First Amendment, unless it’s defamation or libel, that’s not the question. The question is whether it’s gotten violent, and it has. We’ve seen that with the January 6 Capitol riots. Part of the reason for the riots was major groups such as QAnon, or far-right groups that spread lies and baseless accusations. So the concern that a select group of intellectuals controlling the narrative will someday lead to something being wrongly censored is outweighed by the fact that there are currently groups and people causing violence, death and harm because of misinformation.
Should Trump have been banned from Twitter following his posts on Jan. 6?
Penelope:
Yes, clear and simple as that. I think that right now, the issue we have with social media censorship is we see it as black and white when it’s clearly not. If some random person on Twitter tweeted what Trump tweeted on January 6, I don’t think that person deserves to be banned from the platform. But when you’re Trump, when you have the following you have, you have the authority you have as President of the United States, when it’s January 6, when you’re speaking in front of a crowd of thousands of people in front of the Capitol, and you tweet things such as “THE REPUBLICAN PARTY AND, MORE IMPORTANTLY, OUR COUNTRY, NEEDS THE PRESIDENCY MORE THAN EVER BEFORE – THE POWER OF THE VETO. STAY STRONG!” you’re clearly inciting a riot, you’re clearly inciting violence, and you deserve to be banned from the platform.
Alex:
I agree that what Trump said on Jan. 6 was wrong and incited violence. And yes, Twitter did the right thing by banning him in the moment, but that doesn’t mean they should have banned him indefinitely. Being such a high-profile person, he should have been reinstated under strict warnings that another violent tweet would lead to a permanent suspension. Plus, the double standard is apparent once again — during the 2020 Minnesota protests, Democrat Maxine Waters called on protesters to break curfew and become “more confrontational” with police, and she didn’t receive any rebuke from social media companies. While I believe that the Black Lives Matter protests had significantly more merit than the “Stop the Steal” protests, there still shouldn’t be a double standard for inciting violence.
Penelope rebuttal:
I actually agree with you in the sense that I don’t think he deserves a full, lifetime ban, for two reasons. First of all, if an opportunity presents itself where he can redeem himself and show he deserves a platform again, perhaps through a trial run of getting his account back, I agree that he deserves that chance. Secondly, he shouldn’t have a lifetime ban because knowing what he is saying to his followers, and the messages he sends them, is important on a mainstream media platform. For example, when he was on Twitter, every single person, every single news outlet was covering what he said there. What he says on smaller platforms, like Parler, or on his own personal website, is not covered or cared about as broadly. We don’t know what he’s saying to his fans. We don’t know what he’s saying to his followers. It seems almost more dangerous to have him secluded from the rest of the internet world.
If there is a need to regulate free speech, should it be government regulated or company regulated?
Penelope:
At this point, Facebook and other social media platforms have abused their privileges for too long, and now government intervention is necessary. Having government subcommittees focused on preventing misinformation, the spread of violence and hate speech on social media is necessary. Government legislation could also impose requirements on social media companies, such as mandating that a certain percentage of their funds be allocated toward teams that address misinformation and toward improving their platforms with more misinformation warnings.
Alex:
To me, it’s a very situation-based predicament. Companies have the right to regulate anything that goes on their platform, although I’ve already outlined my belief on when they should and shouldn’t censor posts. But at the same time, I agree that the government should put more effort into combating violent misinformation, as toxic online movements are often a root cause of violent outbursts in society.