We all know how easy it is to spend hours watching videos on YouTube. Why do we go down that rabbit hole? Mostly because of a combination of computer programming and marketing know-how called ALGORITHMS.

TEACHERS: Learn more about this topic and how you might teach it with your students via one of our free summer PD courses: https://teach.kqed.org/misinformation-course-collection/

What are algorithms?

Algorithms are sets of instructions (created by people) that decide what content you see, and the order it’s listed, when you search online.
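To make "a set of instructions that decides what you see and in what order" concrete, here is a minimal, hypothetical sketch in Python. It is not how any real platform works; it just shows an algorithm ordering search results by a simple rule (how many query words appear in each title):

```python
# A toy "search ranking algorithm": a fixed set of instructions that
# decides which videos appear and in what order. Real platforms use
# many more signals, but the principle is the same.

def rank_results(query, videos):
    """Return videos sorted by how many query words their title contains."""
    words = query.lower().split()

    def score(video):
        title = video.lower()
        return sum(word in title for word in words)

    # Keep only videos that match at least one word, best matches first.
    matches = [v for v in videos if score(v) > 0]
    return sorted(matches, key=score, reverse=True)

videos = [
    "Cute cat compilation",
    "How algorithms shape your feed",
    "Cat algorithms explained",
]
print(rank_results("cat algorithms", videos))
# → ['Cat algorithms explained', 'Cute cat compilation', 'How algorithms shape your feed']
```

The key point: a person chose the scoring rule, so the ordering reflects human decisions, not some neutral truth.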

How do recommendation algorithms work on YouTube?

YouTube’s algorithm captures data about the videos you watch, including how long you watch them. Based on that viewing history, it recommends other videos. YouTube also uses this data to optimize advertising, selling it to companies so they can better target you with their products.
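Here is a hypothetical toy version of that idea (not YouTube's actual system, whose details are proprietary): candidate videos score higher when they share topics with the videos you watched longest. The video names, topic labels, and watch times below are all made up for illustration.

```python
# Toy sketch of watch-history-based recommendation: videos that share
# topics with what you watched longest get recommended first.

watch_history = {          # hypothetical data: video -> seconds watched
    "flat earth debunked": 600,
    "baking sourdough": 30,
}
topics = {                 # hypothetical topic labels per video
    "flat earth debunked": {"conspiracy", "science"},
    "baking sourdough": {"cooking"},
    "moon landing hoax?": {"conspiracy"},
    "easy pizza dough": {"cooking"},
}

def recommend(candidates):
    """Score each candidate by watch time of topically related videos."""
    def score(video):
        return sum(
            seconds
            for watched, seconds in watch_history.items()
            if topics[watched] & topics[video]   # any shared topic
        )
    return sorted(candidates, key=score, reverse=True)

print(recommend(["easy pizza dough", "moon landing hoax?"]))
# → ['moon landing hoax?', 'easy pizza dough']
```

Notice the rabbit-hole effect in miniature: because you lingered on one conspiracy-adjacent video, the next conspiracy video outranks everything else.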

How do these algorithms play a role in spreading misinformation?

Digital platforms like YouTube are the gatekeepers of information, whether they intended to be or not. And media-savvy con artists and extremist groups—like conspiracy theorists and white supremacists—can take advantage of YouTube’s algorithms to push their agendas.

What are data voids?

Propagandists often take advantage of a lack of content on a certain topic—a “data void.” This happens a lot with breaking news. It takes time for legitimate media outlets to fact-check and verify events and create content. Until then, YouTube will often only have false or misleading information.

What can I do to avoid falling for misinformation?

The more precise you are with search keywords, the more likely you are to get relevant information. If your results look suspicious or click-baity, you might be in a data void. YouTube can be a good place to start your research, but you should use a wider range of sources.

SOURCES:

“Data Voids: Where Missing Data Can Easily Be Exploited,” by Michael Golebiewski, Microsoft Bing, and danah boyd, Data & Society

“Caleb Cain was a college dropout looking for direction. He turned to YouTube.” (The New York Times)

“YouTube to Remove Thousands of Videos Pushing Extreme Views” (The New York Times)

“After New Zealand Massacre, YouTube’s Algorithm Still Promotes Islamophobic Videos” (Huffington Post)

“We Followed YouTube’s Recommendation Algorithm Down The Rabbit Hole” (BuzzFeed)

“News Use Across Social Media Platforms 2018” (Pew Research Center)

“Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018” (Pew Research Center)

“DeepMind is asking how AI helped turn the internet into an echo chamber” (MIT Technology Review)

“‘Fiction is outperforming reality’: how YouTube’s algorithm distorts truth” (The Guardian)

“YouTube, the Great Radicalizer” (The New York Times op-ed)

YouTube Algorithms: How To Avoid the Rabbit Hole. 21 June 2019, by Annelise Wunderlich

Author

Annelise Wunderlich

Annelise is a documentary filmmaker, educator, and Youth Participation Manager at KQED. Her films have aired on national and regional public television outlets, and she teaches film studies at Diablo Valley College. She loves very spicy food, and traveling to places where people make it.
