The news: YouTube has managed to drastically reduce the number of conspiracy theory videos it recommends, but the total is creeping back up again, according to a new study.
The study: Researchers trained an algorithm to judge the likelihood that a video on the site contained conspiracy theories by looking at its description, transcript, and comments. They examined eight million recommendations over 15 months. They found that shortly after YouTube announced in January 2019 that it would recommend less conspiracy content, such recommendations did indeed drop steadily, falling by about 70% at the lowest point in May 2019. Since then, however, the number of conspiracy videos YouTube's algorithm recommends has risen again, and they are now only 40% less common than when YouTube started its crackdown.
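The paper's model isn't reproduced here, but the description maps onto a standard text-classification setup. Below is a minimal sketch, assuming a TF-IDF bag-of-words with logistic regression as a stand-in for the researchers' actual classifier; the `video_text` helper and the training examples are invented for illustration.

```python
# A minimal sketch of the kind of classifier the study describes; this is
# NOT the researchers' actual model. It scores a video's conspiracy
# likelihood from the three text sources the study mentions (description,
# transcript, comments), using TF-IDF features and logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def video_text(description, transcript, comments):
    """Concatenate the text sources the classifier looks at."""
    return " ".join([description, transcript, *comments])

# Toy labeled examples (1 = conspiratorial, 0 = not); a real model would be
# trained on thousands of hand-labeled videos.
texts = [
    video_text("the truth they hide", "the government staged it", ["wake up"]),
    video_text("how vaccines work", "clinical trial data shows", ["great explainer"]),
]
labels = [1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# predict_proba gives the estimated likelihood that a new video is
# conspiratorial, which can then be aggregated across recommendations.
new_video = video_text("flat earth proof", "NASA lies to you", ["share this"])
print(model.predict_proba([new_video])[0][1])
```

Scoring each recommended video this way and averaging over millions of recommendations is what lets a study like this track the prevalence of conspiracy content over time.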
A reminder: YouTube’s recommendations are responsible for almost three-quarters of the more than one billion hours people spend watching videos on YouTube every day.
Uneven progress: The results varied by the type of video, the researchers found. YouTube has managed to almost completely scrub some conspiracy theories from its recommendations, including those claiming the US government helped organize 9/11 or that the earth is flat. However, others continue to flourish, including videos espousing climate change denial. The researchers told the New York Times that these findings suggest YouTube has decided which misinformation it will and won't permit, though it has yet to disclose any such policy publicly. That's legal: a US appeals court recently ruled that YouTube is a private forum, so free speech laws don't apply.
The algorithm: We don't fully understand how YouTube's recommendation algorithms work, and the company regularly tweaks them. YouTube says the recommendation engine's goal is to "help viewers find the videos they want to watch, and to maximize long-term viewer engagement and satisfaction." Although previous studies have found that YouTube has played a role in radicalizing people, it has proved hard to pin down exactly how the site works, partly because it's virtually impossible to replicate an individual user's experience there. This study ran into the same limitation: it could observe the site only from the perspective of a logged-out user, which is not how most people use YouTube.
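For a sense of what that logged-out vantage point looks like in practice, here is a rough sketch, assuming recommendations can be recovered from the video IDs YouTube embeds in a watch page fetched without cookies. The regex and the example video ID are illustrative, not the study's actual collection pipeline, and YouTube's page structure changes often.

```python
# A rough sketch of logged-out measurement: fetch a video's watch page with
# no session or cookies and extract the video IDs embedded in the page,
# which include the recommendations shown alongside the video. Illustration
# only; this is not the study's pipeline, and the page format may change.
import re
import requests

def logged_out_recommendations(video_id):
    url = f"https://www.youtube.com/watch?v={video_id}"
    html = requests.get(url, timeout=10).text  # no login, no cookies
    # Recommended videos appear in the page's embedded JSON as "videoId"
    # fields; YouTube IDs are 11 characters of letters, digits, - and _.
    ids = re.findall(r'"videoId":"([\w-]{11})"', html)
    # De-duplicate while preserving order, and drop the video itself.
    seen, recs = set(), []
    for vid in ids:
        if vid != video_id and vid not in seen:
            seen.add(vid)
            recs.append(vid)
    return recs

print(logged_out_recommendations("dQw4w9WgXcQ")[:10])
```

Because this sees only what an anonymous visitor sees, it misses the personalization that shapes recommendations for logged-in users, which is exactly the limitation the study acknowledges.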