YouTube's Vague 'Misinformation' Guidelines Have Become a Political Problem | Opinion

YouTube's misinformation guidelines are coming in for some long-overdue criticism in light of a few recent episodes that exposed just how ridiculous they are. A few weeks ago, a video put out by the bipartisan January 6 Committee was pulled from the platform because it included former President Donald Trump telling falsehoods about the election. These violated the platform's "election integrity policy," according to a YouTube spokesperson, even though the former president's comments were being shown in order to rebut them. "We enforce our policies equally for everyone," the spokesperson told the New York Times.

A similarly Kafkaesque situation occurred in March, this time involving The Hill's YouTube show "Rising." The channel was suspended for seven days for violating the platform's rules around election misinformation after the hosts played a clip of Donald Trump saying the election was rigged, specifically in order to refute it.

I suppose it's commendable that YouTube is adhering to its pledge to stop the spread of misinformation even when the offending content comes from liberal sources seeking to debunk it. But it's also absurd to have a standard for barring misinformation that precludes showing why it's wrong.

Unfortunately, these aren't one-offs. They reflect a systemic problem with YouTube's misinformation guidelines. The election integrity policy sits under the platform's larger umbrella of misinformation guidelines, all of which are very loose in defining what content deserves to be taken down. Among other things, YouTube bars content that "contradicts local health authorities' or WHO guidance on certain safe medical practices," content that "claims that fake ballots were dumped to give one candidate enough votes to win a state in a past U.S. presidential election," and "claims that the U.S. 2020 presidential election was rigged or stolen." This clearly applies to almost every speech by Donald Trump, the likely GOP nominee in 2024, meaning YouTube could easily use its guidelines as a basis for taking down every video featuring one side's political candidate.


One reason YouTube hasn't received the scrutiny over censorship that other platforms have is that a large share of the content on YouTube is not political, leading it to be perceived as a less serious medium for commentary. Most elites and powerful institutions are not active on YouTube the way they are on Twitter and Facebook.

But this lack of elite gatekeepers also makes the platform an effective vehicle for political commentators to subvert traditional forms of mass media. Independent creators can reach hundreds of thousands, if not millions, of people entirely on their own in the vast YouTube realm. While the platform's algorithm does favor institutional media programs, a regular online creator still has the capacity to build a large following doing anything from serious historical analysis to unboxings of Pokémon cards.

YouTube's galaxy of original, independent content has turned the platform into a global cultural powerhouse of passionate online communities, upending traditional forms of entertainment and fostering a landscape of vibrant creators who thrive in an open environment.

And it's the future. YouTube has approximately 2.6 billion users, and its largest age cohort is 18 to 25 years old. A 2019 survey showed that about one-third of American children aspire to be vloggers, presumably on YouTube. Particularly among younger generations, the platform's cultural power is undeniable.

Because the future of news is likely migrating to YouTube, the platform's guidelines for what it censors are emerging as a crucial battleground in the fight for access to unbiased information.

Victims of the platform's guidelines range across the political spectrum. Independent leftist journalist Jordan Chariton told the socialist publication Jacobin that multiple videos of his cameramen debating Trump supporters about false election claims were taken down from his channel. So was Chariton's on-the-ground footage from the Jan. 6 Capitol riot, despite its widespread use in mainstream media coverage. YouTube also permanently suspended Fox News host Dan Bongino after he questioned the effectiveness of masks in combating COVID-19.

It's true that when YouTube uses its guidelines to limit monetization or take down videos, creators can appeal the decision to a human reviewer. In my experience operating a prominent independent YouTube channel, company staff take objections seriously and the appeals process is straightforward. But the need for appeals only shows how wide the scope of what counts as misinformation on the platform is, and how easily it can lead to mistakes, as in the case of the January 6 Committee.

YouTubers and their audiences deserve better guidelines.

The gigantic, worldwide audience on YouTube deserves to be trusted with a wide range of political opinions, no matter who is right or wrong.

James Lynch is a producer at Breaking Points.

The views expressed in this article are the writer's own.