Does YouTube's 'Recommended Video' Algorithm Bias People's Thoughts?
YouTube uses an algorithm to decide which 'recommended videos' to display when a user watches a video. One concern is that continually watching these recommended videos biases a person's ideology and drives them into radicalism, seeking radical reforms of the social structure. New research offers one perspective on this concern.
Examining the consumption of radical content on YouTube | PNAS
If YouTube's algorithms radicalize people, it's hard to tell from the data | Ars Technica
Homa Hosseinmardi and colleagues at the University of Pennsylvania, together with researchers at the market research firm Nielsen and elsewhere, obtained anonymized data covering about 300,000 users who watched roughly 21 million videos across about 9 million channels between 2016 and 2019. Since most of these channels were not political, Hosseinmardi and colleagues drew on four studies conducted between 2020 and 2021 that had labeled various YouTube channels as 'alt-right,' 'socialism,' and so on. They matched these labeled channels against the Nielsen data and relabeled them with six labels of their own: 'far left,' 'left,' 'center,' 'right,' 'far right,' and 'anti-woke.'
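The relabeling step described above can be sketched as a simple lookup from prior-study labels to the study's six categories. This is an illustrative sketch, not the authors' actual method; only 'alt-right' and 'socialism' are named in the article, so the mapping targets chosen for them here are assumptions.

```python
# Map labels from the four prior studies onto the six categories used
# in the PNAS study. Which prior label maps to which category is a
# hypothetical assumption for illustration.
LABEL_MAP = {
    "alt-right": "far right",
    "socialism": "far left",
}

def relabel_channels(prior_labels: dict) -> dict:
    """Relabel channels found in the viewing data, dropping channels
    whose prior label has no mapping to the six categories."""
    return {
        channel: LABEL_MAP[label]
        for channel, label in prior_labels.items()
        if label in LABEL_MAP
    }

# Hypothetical channels carrying labels from the prior studies.
channels = {"ChannelA": "alt-right", "ChannelB": "socialism", "ChannelC": "cooking"}
print(relabel_channels(channels))
# {'ChannelA': 'far right', 'ChannelB': 'far left'}
```

Channels with no political label (like the hypothetical "ChannelC") simply fall out of the labeled set, consistent with the article's note that most channels were not political.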
The word 'woke' in 'anti-woke' originally meant 'alert to racial prejudice and discrimination,' and it is nowadays used among people addressing issues such as Black Lives Matter; the word's meaning has since expanded to cover issues such as women's rights and social inequality. 'Anti-woke,' in turn, refers to people who oppose the rise of 'woke' as going too far. The study notes that the 'anti-woke' label was focused on 'opposition to progressive social justice movements,' but the videos on these channels were not necessarily limited to clearly political content.
In the end, Hosseinmardi et al. labeled a total of 997 channels. Videos on these channels accounted for 3.3% of total viewing time.
According to Hosseinmardi et al., people who watched videos posted by these channels tended to stick to the same type of video. For example, users who watched 'left' and 'far left' content in 2016 tended to still be watching 'left' and 'far left' content at the end of the survey period in 2019.
Hosseinmardi and colleagues then investigated, for each label, how many viewers there were, what kinds of videos they watched, and how much time they spent watching. Of the six labels, 'left' had the most viewers, accounting for almost half on its own. The next most common was 'center.' Over the 2016–2019 survey period covered by the original data, the number of viewers and the viewing time for 'left,' 'center,' 'right,' and 'anti-woke' all increased. This suggests that more and more users are turning to YouTube instead of broadcast media, Hosseinmardi and colleagues say.
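The per-label statistics described above amount to counting distinct viewers and summing watch time per label. A minimal sketch of that aggregation, using invented viewing records purely for illustration (the user IDs and minutes are not from the study):

```python
from collections import defaultdict

# Each viewing record: (user_id, channel_label, minutes_watched).
# The data here is invented purely to illustrate the aggregation.
views = [
    ("u1", "left", 30), ("u2", "left", 10),
    ("u1", "center", 5), ("u3", "anti-woke", 20),
]

def per_label_stats(records):
    """Return {label: (distinct_viewer_count, total_minutes)}."""
    viewers = defaultdict(set)   # label -> set of distinct user IDs
    minutes = defaultdict(int)   # label -> total watch time in minutes
    for user, label, mins in records:
        viewers[label].add(user)
        minutes[label] += mins
    return {label: (len(viewers[label]), minutes[label]) for label in viewers}

print(per_label_stats(views))
# {'left': (2, 40), 'center': (1, 5), 'anti-woke': (1, 20)}
```

Tracking viewers and minutes separately is what lets the next paragraph's distinction arise: a label's viewer count can stay flat while per-viewer watch time grows, or vice versa.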
In terms of viewing time, the number of 'far right' viewers changed little over the four-year survey, but each viewer's viewing time increased. Conversely, while the number of 'right' viewers increased, their viewing time changed little. 'Anti-woke' showed the highest growth rate of any label, and by the end of the survey its viewing time was below that of 'left' but above that of 'center.'
Hosseinmardi and colleagues said, 'If YouTube's algorithm were leading people to extreme ideas, the number of viewers should have increased later in the research period, but it did not actually increase. This suggests that the algorithm may have no tendency to lead people toward extreme ideas.'
'Far-right content appears to be somewhat stickier, so each viewer spends more time on it,' Hosseinmardi and colleagues said, 'while anti-woke content appears to be better at attracting new users, which is believed to be why its viewer numbers increased.'
Hosseinmardi and colleagues found no evidence that YouTube is radicalizing its users, but the foreign outlet Ars Technica noted of the study: 'The study covers viewing on desktop browsers only and does not account for viewing on mobile devices, and because it could not determine what videos YouTube's algorithm actually recommended, its conclusions about recommended videos involve speculation.'
in Web Service, Posted by log1p_kr