A Click Away From Conspiracy

Photo Credit: Artur Debat

We’re spending more and more time with our eyes glued to a screen.

We’ve all experienced the ways YouTube can take you places you never expected. Its autoplay function means you don’t even have to click to be pulled into the next video in an endless stream. One minute you might be watching a video on biology; ten minutes later you might find yourself watching a chinchilla giving itself a dust bath. It’s brilliantly and terrifyingly engaging.

However, YouTube does more than merely waste valuable time. It selects content for us based on its calculation of our preferences, choosing videos that reflect our own views. These recommendations make up 70 percent of YouTube’s total viewing time. That means YouTube is choosing what we watch more than we are.

Strange videos often keep people watching, preying on our natural curiosity. Student Lili Serio noticed the prevalence of peculiar yet intriguing videos that YouTube recommends. “Even the people I’m subscribed to rarely have videos that show up on my recommended feed,” Serio says. “Instead, it’s a lot of clickbait-type stuff, usually similar to videos that I’ve watched only once or twice in the past.”

This isn’t to say that there’s a lack of quality content on YouTube. According to Serio, there are two sides to YouTube: “the content creation side” and the one that features videos like “Girl Kidnapped for 2 months and Found Alive” or “A Year in a Hut: Daily Life.” There’s a vast array of options on the site, including high-quality educational videos as well as sources of information on current events and politics, but that doesn’t mean we gravitate toward the most informative or beneficial ones.

When it comes to politics, these suggestions can create an echo chamber in which our beliefs intensify. When we watch videos that signal our partisan preference, the algorithm tends to suggest videos that take increasingly extreme stances. We are showered with similar perspectives without gaining exposure to those of others. And because there is so little control over inflammatory or outright false content, YouTube can mold our opinions with misinformation.
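The feedback loop is easier to see in a toy simulation. The Python sketch below is in no way YouTube’s actual algorithm (those details are proprietary); every name and number in it is a made-up assumption. It models a recommender that scores videos on a single partisan “stance” axis and slightly favors more provocative candidates because, by assumption, they hold attention longer. Even that small bias is enough to walk a mildly partisan viewer toward the extremes:

```python
import random

# Toy model of the echo-chamber feedback loop described above. Purely
# illustrative: YouTube's real recommender is proprietary and far more
# complex. Each video is reduced to a single "stance" score running
# from -1.0 (one political extreme) to +1.0 (the other).

def recommend(last_stance, num_candidates=5, drift=0.1):
    """Suggest a next video near the viewer's last watch, nudged outward.

    The small outward `drift` stands in for an engagement objective:
    this sketch assumes slightly more provocative content holds
    attention better, which is the premise of the paragraph above.
    """
    direction = 1 if last_stance >= 0 else -1
    candidates = [
        max(-1.0, min(1.0, last_stance + direction * drift + random.gauss(0, 0.05)))
        for _ in range(num_candidates)
    ]
    # Rank candidates by assumed engagement: more extreme, more watch time.
    return max(candidates, key=abs)

stance = 0.1  # the viewer starts with only a mild partisan lean
for step in range(20):
    stance = recommend(stance)
    print(f"autoplay {step + 1:2d}: stance = {stance:+.2f}")
# Within roughly a dozen autoplays the stance saturates near +1.0:
# a mild starting lean has drifted to the extreme.
```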

An investigation by the Wall Street Journal found that the site often directs viewers toward “channels that feature conspiracy theories, partisan viewpoints and misleading videos.” Zeynep Tufekci, writing in the New York Times, found that repeatedly watching Donald Trump rallies led to the suggestion of videos featuring “white supremacist rants” and “Holocaust denials,” while repeatedly watching Bernie Sanders and Hillary Clinton videos ultimately led to recommendations with “a leftish conspiratorial cast,” featuring conspiracy theories about “secret government agencies.” In an age of intensifying political polarization, reinforcing our biases can exacerbate the divides that already exist.

A recent post on YouTube’s official blog attempts to address this problem. Facing rising heat for its spread of inaccurate information, the YouTube team declared that it will “begin reducing recommendations of borderline content and content that could misinform users in harmful ways – such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” Time will tell whether these changes will be substantial, but it’s a step in the right direction.

Part of the issue is that YouTube isn’t structured, or even truly intended, to be a reliable source of news. While Google’s algorithms are designed to promote reputable sources for news coverage, YouTube operates entirely differently. As a source of entertainment, it is designed to keep eyes glued to the screen, not to inform.

Increasingly extreme content captures our attention and spurs us to continue watching, whether or not that’s even a conscious choice. Although YouTube may adjust its algorithms to discourage the proliferation of blatantly false information, its core purpose of keeping people watching likely won’t change.

Although YouTube can be a resource for both enjoyment and learning, it’s crucial to be aware of how it influences the content we take in. Giggling at cat videos is one thing, but viewing questionable political coverage can distort our ability to logically process the events and policies of our world.