The YouTube Algorithm, Discrimination, and Internet Free Speech

YouTube has a content problem and, once again, the LGBTQ+ community is on the receiving end.

Carlos Maza, an openly gay Vox writer and YouTube host of the Strikethrough series, recently spoke up about the many attacks made against him by conservative commentator Steven Crowder.

Maza states that Crowder has been harassing him for years and tweeted a compilation of these attacks. In his tweet, Maza says: “Since I started working at Vox, Steven Crowder has been making video after video “debunking” Strikethrough. Every single video has included repeated, overt attacks on my sexual orientation and ethnicity.”

In these clips, Crowder makes a mockery not only of Maza, but of LGBTQ+ individuals as a whole. The attacks mostly consist of slurs, with the most notable being: “lispy sprite”, “angry little queer”, and “gay Mexican”. That last line is especially ignorant, considering that Maza is Cuban-American.

On other occasions, Crowder employs a caricatured ‘gay voice’ while talking about the LGBTQ+ community and pantomimes oral sex with his microphone. If that wasn’t explicit enough, he has also donned a “Socialism is for F*gs” shirt on many occasions, a design that is now part of his official merchandise.

All of this should rightfully warrant, at the very least, action on YouTube’s part. However, YouTube’s initial response was to deny that Crowder had violated any of its policies.

YouTube’s official Twitter account posted: “As an open platform, it’s crucial for us to allow everyone–from creators to journalists to late-night TV hosts–to express their opinions w/in the scope of our policies. Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site.” 

When hate speech is treated as a matter of mere opinion, it becomes clear that more must be done in the name of queer liberation.

To make matters worse, YouTube showed a failure to understand the issue by essentially excusing Crowder’s behavior entirely. Following the initial response, YouTube noted that Crowder never explicitly incited his audience to attack Maza. But giving him the benefit of the doubt is impossible in this situation. Another of the many homophobic shirts he sells as part of his merchandise boasts the remark “Carlos Maza is a F*g”. Once again, this raises the question: with such blatant discrimination, how has it taken this long for the issue to be picked up?

It all comes down to two things: deception and the YouTube algorithm. Like Crowder, most on the far right dial their hatred down through diplomatic euphemisms. Many white supremacists, for instance, have taken to calling themselves ‘white nationalists’ or even ‘ethnocentrists’. All this ever does is rebrand the same regressive mindset that fueled most historical fascist agendas.

In Crowder’s case, because his content mainly focuses on ‘debating’ and ‘debunking’ Maza’s opinions, YouTube had merely categorized his videos as analytical, rather than as instances of oppressive violence.

This reaction is quite odd. After all, while YouTube advertises itself as a neutral platform, its policies explicitly forbid hate speech, which they define as “content promoting violence or hatred” against a person or group’s sexuality, race, and gender. Along with this, YouTube has enforced a zero-tolerance policy for “behavior intended to maliciously harass, threaten, or bully others”, which should include content posted to humiliate or “make hurtful and negative personal comments/videos about another person.” 

Crowder’s long streak of harassment was a clear violation of these policies. However, only after continued controversy and a final call-out from Maza himself did YouTube CEO Susan Wojcicki apologize to the LGBTQ+ community for the lack of action taken. Crowder’s channel has since been demonetized. 

This act should not be taken as a sign of progress, however. Instead, it should be a reminder that intolerance is still an issue of the present, and that the algorithm still needs to be improved.

YouTube’s system does little to protect LGBTQ+ creators. The video-sharing website only recently drew criticism for demonetizing and age-restricting SFW videos made by gay and transgender creators. The site has also done little to filter out companies sponsoring homophobic ads (which, ironically enough, often run before videos with pro-LGBTQ+ content). The site has assured viewers that the algorithms have been adjusted to prevent such incidents, but it continues to be an issue to this day.

Still, YouTube has shown an active effort to improve website content. Following mixed reactions to the Crowder case, it has taken to clearing white supremacist content from its platform, joining the fight of sites such as Facebook and Twitter.

These controversies highlight the ongoing Internet debate surrounding content regulation on social media platforms. Some proclaim that the increase in control infringes on their free speech and is biased against the right, whereas others applaud the action that has been taken to make online spaces more inclusive. Additionally, increased restriction on conservative content will most likely provoke bans on more liberal content in tandem. Policing free speech on social media never ceases to be a slippery slope.

The issue is also more complicated than many make it out to be because algorithms and user-reporting interfaces can never be perfect. While these mechanisms are relatively reliable and help sift through the vast amount of content uploaded to the site, tightening restrictions is always a challenge. If the algorithm gets too strict, it may inadvertently flag innocent satirical and comedic videos, or even videos simply discussing hate speech.

YouTube’s overreliance on technology shows that, as with any issue, nuance is critical. Content regulation is harder than most make it out to be. Its impact stretches not only to users and influencers, but to the platform as a whole. YouTube’s business model emphasizes increased engagement because that is where the company makes most of its money. As a result, it often sees provocative content as a necessary evil. On the other hand, private companies have more leeway in picking and restricting the content they advertise. This model is where another point of conflict arises: whether profit margins and inclusivity are mutually exclusive or simultaneously achievable.

Profit aside, social media platforms, at their very core, are obligated to enable voices from the most diverse array of views possible. For this reason, they must try their best to be politically neutral. However, they are also obligated to establish a line between sharing an opinion and repressing the voice of another.

As Wojcicki herself has said, “context really, really matters” in the creation of policies and algorithms. The YouTube algorithm determined that Crowder’s purpose was to debate Maza and Vox on political issues and, as a result, allowed his videos to remain visible. But assigning blame in this situation matters little, both because of the complexity of the issue and because the LGBTQ+ community and other minorities are silenced nonetheless.

To clarify, these complaints are not meant to let YouTube bear the brunt of all the fault. Many factors come into play, and a majority of the content comes without the knowledge or approval of the company itself. Just as most political forums feature several sources and perspectives, so should YouTube. If liberal creators are allowed to have a voice and an audience, conservatives must be as well. And while not all LGBTQ+ people are liberals and not all conservatives are close-minded, neutrality does not and should not beget hatred.

Social media sites must continue to take action against the explicit and prolonged harassment of individuals. YouTube has the duty of reducing online intolerance as far as its status as a neutral platform enables it to.