Does YouTube fight misinformation videos? Analysing video removal over 3 years
In 2021 (more than three years ago), we analysed how the YouTube recommendation algorithm encloses users in misinformation bubbles and how such bubbles can be burst. The findings were summarised in a paper published at the RecSys 2021 conference, where it received the Best Paper Award.
As part of this study, we spent considerable effort annotating videos based on whether they promote misinformation, debunk it, or are unrelated to it. This included both the videos used to set up the study (seed videos) and the ones encountered during the study. Besides enabling the study itself, having a dataset of annotated videos has allowed us to (at least partially) validate the YouTube platform's claims that it actively fights misinformation videos.
Since preparing the dataset, we have checked every single week whether the videos in it are still available on YouTube. Here are the results.
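As an illustration, such an availability check can be done with the YouTube Data API v3: videos that have been removed or made private are simply absent from the API response. The following is a minimal sketch of this kind of check; the API key placeholder and helper name are hypothetical and not part of the original study.

```python
import requests

API_URL = "https://www.googleapis.com/youtube/v3/videos"
API_KEY = "YOUR_API_KEY"  # hypothetical placeholder

def check_availability(video_ids):
    """Return a dict mapping video ID -> True if the API still lists the video."""
    available = {vid: False for vid in video_ids}
    # The videos.list endpoint accepts up to 50 IDs per request.
    for i in range(0, len(video_ids), 50):
        batch = video_ids[i:i + 50]
        resp = requests.get(API_URL, params={
            "part": "id",
            "id": ",".join(batch),
            "key": API_KEY,
        })
        resp.raise_for_status()
        # Removed, private, or deleted videos do not appear in `items` at all.
        for item in resp.json().get("items", []):
            available[item["id"]] = True
    return available
```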
Promoting videos disappeared faster and in larger quantities than debunking or neutral videos. While videos of all categories are slowly being removed from the platform, there is an obvious difference between videos promoting misinformation and those debunking it (or neutral ones). In three years, around 6.4% of debunking videos (8% of the seed videos; 5% of the ones encountered during the study) and 6.3% of neutral videos were removed from the platform. The minimal difference between these two categories suggests that this is the baseline removal rate for a typical YouTube video. For the promoting videos, however, we observed a significantly higher rate – around 21.1% of all promoting videos (32.2% of seed and 10.3% of encountered videos) were removed over the three-year span.
Based on this, we can conclude that YouTube does fight against videos that promote misinformation. However, the rate of removal can still be viewed as quite low: removing only around a fifth of such videos over three years still leaves a large number of misinformation videos on the platform.
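For reference, the per-category removal rates quoted above boil down to the ratio of removed videos to all annotated videos in each category. A small sketch of this computation, assuming hypothetical record and field names:

```python
from collections import Counter

def removal_rates(records):
    """records: iterable of dicts like {"label": "promoting", "removed": True}."""
    totals, removed = Counter(), Counter()
    for record in records:
        totals[record["label"]] += 1
        if record["removed"]:
            removed[record["label"]] += 1
    # Fraction of removed videos per annotation label.
    return {label: removed[label] / totals[label] for label in totals}
```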
This fight against misinformation videos can be further illustrated by analysing the reasons why the videos are no longer available. For the neutral videos, the most common causes are switching the video to private (46%) or no reason indicated (30%). The reasons that can be viewed as direct action by the platform are account removal (14%), copyright infringement (4.5%), terms and conditions infringement (2%), and violation of community guidelines (0.5%, in all cases due to sexual content). The remaining videos were removed by the uploaders themselves (2%).
For the debunking videos, the reasons are very similar, with the neutral reasons being even more prevalent – switching the video to private (65%) and no reason indicated (23.5%) – while direct action by the platform is rarer: copyright infringement (4%), account removal (2%), and terms and conditions infringement (2%). On the other hand, a larger share of videos was removed by the uploaders themselves (4%).
For the promoting videos, however, the distribution of reasons is markedly different. The neutral reasons are less common – switching to private (18%) and no reason indicated (20%) – while the reasons that can be viewed as direct action by the platform are significantly more frequent: account removal (35%), violation of community guidelines (21.5%, with 2% being due to hate speech), and terms and conditions infringement (4%).
This further reinforces the observation that the YouTube platform actively fights videos promoting misinformation, either by removing the account altogether or by removing specific videos that break the guidelines. Nevertheless, a significant share of the removed promoting videos – up to 38% – shows no explicit action by the platform, and a large number of videos promoting misinformation still remain.
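The reason categories above can be derived from the message YouTube displays for an unavailable video. As a rough illustration only, the sketch below buckets such messages with simple substring matching; the exact message strings and category names are assumptions, not the study's actual annotation rules.

```python
# Illustrative substring patterns for common unavailability messages (assumed).
REASON_PATTERNS = {
    "private": "video is private",
    "account_removed": "account associated with this video has been terminated",
    "guidelines": "violating youtube's community guidelines",
    "terms_and_conditions": "violating youtube's terms of service",
    "copyright": "copyright claim",
    "removed_by_uploader": "removed by the uploader",
}

def classify_reason(page_text):
    """Map the unavailability message of a removed video to a coarse category."""
    text = page_text.lower()
    for reason, pattern in REASON_PATTERNS.items():
        if pattern in text:
            return reason
    return "no_reason_indicated"
```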
To conclude, YouTube does indeed fight videos promoting misinformation, but at a slower rate than one might expect, given that misinformation videos may have a significant impact on society as a whole.

