August 16, 2021

The YouTube recommendation system

YouTube’s recommendation system carries the risk of trapping users in filter bubbles of alternative facts, conspiracy theories and false reports.

What are the first companies that come to mind when you think of search engines? 

You probably think of Google, Yahoo!, Bing and, of course, Startpage, among others. In fact, YouTube is the second-largest search engine and one of the most popular sites for search queries. With more than 3 billion searches per month, the platform is bigger than Bing, Yahoo!, Ask and AOL combined.

The video platform is so popular because it presents information and content as video, which makes it easily accessible and entertaining. On top of that, YouTube keeps you on the platform by suggesting relevant content through its learning algorithm. The more videos you watch, the more hints you give the algorithm about which content it should suggest to you next. The algorithm reacts to your viewing behavior: your search terms, whether you watch a video to the end, and the topics and thumbnails that get you to click on a video.
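
To make this feedback loop a little more concrete, here is a deliberately simplified sketch of how a recommender could turn such signals into a ranking. It is not YouTube’s actual algorithm; the topic labels, the “watched fraction” weighting and the scoring formula are assumptions invented purely for illustration.

```python
# Illustrative sketch only -- NOT YouTube's real recommender.
# Candidate videos are scored by how well their topics match the topics a
# user has already engaged with, weighted by how much of each video was
# actually watched. All field names and weights are invented for this example.

from collections import defaultdict

def build_interest_profile(watch_history):
    """Aggregate topic weights from past views (topic -> interest score)."""
    profile = defaultdict(float)
    for view in watch_history:
        # Watching a video to the end counts for more than clicking away early.
        for topic in view["topics"]:
            profile[topic] += view["watched_fraction"]
    return profile

def score_candidates(profile, candidates):
    """Rank candidate videos by overlap with the user's interest profile."""
    scored = []
    for video in candidates:
        score = sum(profile.get(topic, 0.0) for topic in video["topics"])
        scored.append((score, video["title"]))
    return sorted(scored, reverse=True)

if __name__ == "__main__":
    history = [
        {"topics": ["space", "documentary"], "watched_fraction": 0.9},
        {"topics": ["space", "conspiracy"], "watched_fraction": 0.3},
    ]
    candidates = [
        {"title": "Apollo 11 restored footage", "topics": ["space", "documentary"]},
        {"title": "The moon landing was faked?!", "topics": ["space", "conspiracy"]},
        {"title": "How to cook pasta", "topics": ["food"]},
    ]
    for score, title in score_candidates(build_interest_profile(history), candidates):
        print(f"{score:.1f}  {title}")
```

Even in this toy version the mechanism is visible: a single, only partially watched conspiracy video is enough to push similar content well above an unrelated suggestion.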

And before you know it, you can find yourself in a filter bubble of alternative facts, conspiracy theories, false reports and hate speech. This problem is well known and is not limited to the video platform; other social networks such as Facebook and Twitter face it too. Nevertheless, YouTube is one of the most problematic cases. But where exactly does the problem lie?

What tasks should an algorithm fulfill?

First of all, it is important to keep in mind what tasks an algorithm should actually perform. 

Up to 500 hours of video are uploaded to YouTube every minute. This much content first has to be sorted and organized for viewers; without an algorithm, it would hardly be possible to find relevant content in such a flood of information. So the main task of the algorithm is to organize information.

It is in the platform’s interest that users stay on the site as long as possible and are satisfied with the videos on offer. That is why the system tries to suggest content that is as relevant as possible and gets people to watch more videos that interest them. To do this, the algorithm bases its selection on characteristics of the videos a person has already watched.

Of course, there are also guidelines that videos must comply with in order to remain on the platform. Inappropriate or even harmful content is not welcome. Another task of the algorithm is therefore to recognize such videos, filter them out, and block or delete them.
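
This filtering task can be sketched in a similarly simplified way. In the snippet below, a video is flagged when its title or tags contain a phrase from a small blocklist; the blocklist entries and field names are invented for illustration and say nothing about how YouTube’s real moderation systems work.

```python
# Illustrative sketch only -- not YouTube's actual moderation pipeline.
# A video is flagged for review if its title or tags contain a phrase
# from a (completely invented) blocklist.

BLOCKLIST = {"miracle cure", "crisis actor", "plandemic"}  # made-up examples

def should_flag(video):
    """Return True if the video's metadata matches any blocklisted phrase."""
    text = " ".join([video["title"], *video["tags"]]).lower()
    return any(phrase in text for phrase in BLOCKLIST)

print(should_flag({"title": "This Miracle Cure Heals Everything", "tags": ["health"]}))   # True
print(should_flag({"title": "Apollo 11 restored footage", "tags": ["space", "history"]})) # False
```

A rule list this crude would of course be far too blunt in practice; the point is only that recognizing and filtering unwanted videos is itself an algorithmic task, and the next paragraph shows where exactly it falls short.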

And here we come to the first critical point in the case of YouTube: the platform has set up such rules and guidelines for videos, but as studies show, the algorithm does not enforce them reliably and regularly lets through content that has no business being on the platform.

The problem with YouTube’s recommendation system

In 2020, the Vodafone Foundation commissioned a survey in which respondents were asked to assess which social media platforms are largely responsible for the spread of false information in Germany. The result: 88% of respondents named YouTube as one of the largest sources of disinformation, right behind WhatsApp (92%) and Facebook (89%).

The non-profit organization Mozilla has made similar allegations and accuses YouTube of having allowed false information about health and politics, as well as hate speech, to spread unchecked around the world for years. In a large-scale study, Mozilla backs up this claim and shows why the algorithm causes so much damage.

YouTube video: Why Does YouTube Recommend Conspiracy Theories? Mozilla Explains: Filter Bubbles

What the recommendation algorithm can do

With its awareness campaign “YouTube Regrets”, the non-profit organization has collected countless stories from users describing how the algorithm has caused them lasting harm. Most of these stories start with a harmless video and end with the algorithm recommending alarming and mentally distressing content. Here are a few examples:

  • Initial video search: “funny” fail videos.
    Algorithm recommendations: clips of fatal accidents and explosions.
  • Initial video search: serious coverage of the Apollo 11 mission.
    Algorithm recommendations: conspiracy theories about Hitler’s escape, the September 11 attacks, and more.
  • Initial video search: tap dance videos.
    Algorithm recommendations: videos promoting distorted body image that can lead to eating disorders.
  • Initial video search: the vlog of a drag queen.
    Algorithm recommendations: anti-LGBTQ and inflammatory videos.
  • Initial video search: Vikings and Norse paganism.
    Algorithm recommendations: racism and white supremacy.

Some larger stories have even garnered media attention.

  • The New York Times reported on numerous cases in the United States in which young adults were radicalized through the algorithm’s recommendations.
  • In Brazil, YouTube’s recommendation system contributed to the spread of division and radicalization among the population.
  • In Hong Kong, the algorithm helped a Chinese propaganda video containing false information about the demonstrations go viral.

YouTube itself is aware of the problem, as it is repeatedly confronted with it from outside, yet it does little about it. A recent case shows this:

In May 2021, Politico reported how a French conspiracy video about the coronavirus pandemic remained available on YouTube and Facebook for months. On YouTube it even spread in multiple copies and reached over 1.1 million views. Only after repeated reports did YouTube delete the video for violating its own guidelines, which have been tightened since the pandemic to counter the spread of false information.

In Germany, too, YouTube’s filter bubble is causing confusion on political issues in the run-up to this year’s federal election. The initiative AlgorithmWatch is currently running a project to investigate which political videos are shown to whom before the election, in order to better understand what influence the recommendation algorithm has on political opinion formation. Volunteers can take part until August 25 and donate data on their YouTube usage for research purposes via the data donation platform “DataSkop”. The results are to be published in September, before the election. You can take part here (the website and client are only available in German).

The algorithm is the problem

To show the extent of YouTube’s failure to act, Mozilla collected data from over 37,000 users for its report. The results reveal the following main problems:

  • 71% of the videos that participants reported regretting had been recommended to them by the algorithm.
  • In 40% of cases, the algorithm recommended videos that were later deleted from YouTube, sometimes without a stated reason.
  • English-speaking countries are treated preferentially: the guidelines there have been tightened more strictly. In non-English-speaking countries, videos with borderline content are deleted less often and can spread further. Strikingly, false information about COVID-19 in particular appeared most frequently in non-English-speaking countries.
  • An opaque and problematic algorithm: for 43% of the videos submitted, study participants stated that the recommended video had nothing to do with the videos they had watched before.
  • Recommendations of borderline content can leave people with lasting harm. This emerges from the personal accounts in which those affected describe long-term stress or even damage.

How the problem can be addressed

First of all, it should be noted that algorithms are not bad per se. On the contrary, they are very useful and have become an integral part of our digital lives. Nevertheless, algorithms should always be used in the interest of the general public and should improve experiences, not worsen them. That is why the way they work must be made transparent so that they can be trusted.

Researcher Sandra Wachter told Deutschlandfunk:

“Algorithms cannot solve social problems. Algorithms are based on human decisions, which can be a problem in criminal justice or when awarding jobs, for example. Therefore, algorithms have to be checked regularly.”

Based on the investigation, Mozilla recommends the following measures to solve the problem:

  • Platforms should be required to publish regular transparency reports and disclose precise information about recommendation algorithms therein.
  • Users should be able to switch off personalized recommendations on platforms.
  • Platforms should develop risk management systems dedicated specifically to their recommendation algorithms.
  • Laws are needed that generally stipulate transparency in AI systems and thus protect independent researchers.

How can users currently protect themselves?

The recommendation algorithm cannot be switched off completely. Still, there are a few specific things you can do to protect yourself.

  • Turn off autoplay. This prevents the algorithm from deciding which video to play automatically next.
  • Clear your search history. This ensures that previous search queries are no longer taken into account by the recommendation system for further suggestions. You can also delete individual search queries on the data management page so that they no longer influence the algorithm.
  • Turn off notifications from YouTube. You can turn off all notifications in the settings. This means that no further videos will be recommended to you via the notification function.
  • Watch videos in your browser’s private mode. This way, the video and the associated search query are not taken into account by the recommendation algorithm for further suggestions. But be careful: private mode does not make you anonymous. It only means that the pages you visit are not saved in your history; your data is still transmitted.
  • Report inappropriate videos when you discover them. Have you come across borderline videos that violate the platform’s guidelines and spread misinformation? Don’t hesitate to report them. The more people use this feature, the more likely YouTube is to remove the video.
  • Use Startpage to search for videos. You will also see YouTube videos in the video results, but your search query remains hidden from YouTube and cannot be used for further video recommendations.

 
