Experimental results find little evidence for the "rabbit hole" phenomenon of being drawn into conspiracy theories and fake news on YouTube



Many people have had the experience of endlessly watching videos served up by YouTube's recommendation algorithm. The phenomenon in which following these recommendations narrows the genres of videos a viewer watches is called the "rabbit hole," and it is said to make viewers more likely to get hooked on videos promoting extreme content or conspiracy theories. However, a research team led by Professor Brendan Nyhan of Dartmouth College has reported experimental results showing that almost no one falls into extreme content or conspiracy theories through the recommendation algorithm's "rabbit hole."

Subscriptions and external links help drive resentful users to alternative and extremist YouTube channels | Science Advances

https://doi.org/10.1126/sciadv.add8080



The World Will Never Know the Truth About YouTube's Rabbit Holes - The Atlantic
https://www.theatlantic.com/technology/archive/2023/08/youtube-rabbit-holes-american-politics/675186/

The number of alt-right channels and conspiracy theorists based on YouTube is said to have increased dramatically in the wake of the 2016 US presidential election. When you watch videos from such channels, YouTube's recommendation algorithm suggests similar videos. The trouble is that even viewers who were not looking for extreme content will, after accidentally clicking on one extreme video, keep receiving recommendations for more extreme videos. It has been pointed out that this "rabbit hole" phenomenon on YouTube is increasing the number of people who absorb conspiracy theories and extremist ideas primarily on the Internet.



YouTube denies the "rabbit hole" phenomenon, and announced in 2019 that it had adjusted its recommendation algorithm to reduce the prominence of harmful fake news and extreme content in recommendations. At the same time, YouTube has taken actions such as cutting off ad revenue sharing for creators who violate its hate speech policies.

Professor Nyhan's research team surveyed 1,181 people about their existing political stances and used a browser extension to monitor all of their YouTube activity over several months starting at the end of 2020. They found that only 6% of participants had watched extremist videos, that most of those participants had deliberately subscribed to at least one extremist channel, and that these people often reached extreme videos via external links rather than from within YouTube.

Professor Nyhan said, "The viewing patterns did not show a rabbit hole phenomenon. Rather than naive users suddenly and unknowingly drifting toward hateful content, we see people with strong sexist and racist views seeking out that kind of content." The American monthly magazine The Atlantic notes that the research remains important in that it proposes a concrete, technical definition of the "rabbit hole" phenomenon, a term that has been used in many different ways not only in casual conversation but also in academic research. Nyhan and his fellow researchers defined the "rabbit hole" phenomenon as "following a recommendation to a more extreme type of video than the one previously watched."



However, Professor Nyhan points out that YouTube has not stopped directing people to extremist videos through its recommendation feature and continues to allow users to publish extremist videos, and he cautions that the research does not give YouTube a complete free pass.

On the other hand, The Atlantic points out that the research "does not explain what happened on YouTube before 2020," when the data was collected. In their paper, Nyhan and colleagues argue that "susceptible people may have already become radicalized before 2019," but The Atlantic counters that whether the "rabbit hole" phenomenon occurred before YouTube adjusted its recommendation algorithm could not be verified, so it cannot be concluded that the phenomenon almost never happens. "At the end of the day, extremist content still exists on YouTube, and some people still watch it," The Atlantic writes, raising the question of which came first.

in Web Service, Posted by log1i_yk