A study has found that TikTok's algorithm recommends videos about self-harm, eating disorders, and suicide to teens, pushing harmful content into their feeds as often as once every 39 seconds.

New research reports that the algorithm of the short-video social network TikTok can repeatedly show teens content about self-harm and eating disorders, raising concerns about the impact on young users' mental health.

Deadly by Design (PDF file)
https://counterhate.com/wp-content/uploads/2022/12/CCDH-Deadly-by-Design_120922.pdf

Report: TikTok boosts posts about eating disorders, suicide | AP News
https://apnews.com/article/technology-health-eating-disorders-center-government-and-politics-0c8ae73f44926fa3daf66bd7caf3ad43

TikTok self-harm study results 'every parent's nightmare' | The Guardian
https://www.theguardian.com/technology/2022/dec/15/tiktok-self-harm-study-results-every-parents-nightmare

The survey of TikTok, which is especially popular with younger users, was conducted by the Center for Countering Digital Hate (CCDH), a non-profit organization that researches online hate speech. The CCDH created TikTok accounts posing as teenage boys and girls in four countries (the United States, the United Kingdom, Canada, and Australia), had them "like" videos about self-harm and eating disorders, and investigated how TikTok's algorithm responded.

The study found that these accounts were shown videos about body image, such as photos of models and idealized body types, every 39 seconds, and videos about self-harm, such as razor blades or discussions of suicide, every 2.6 minutes. The researchers also reported that the accounts were shown videos about eating disorders.

Furthermore, for accounts created with the term 'loseweight' in the username, the situation was even worse: videos related to eating disorders were displayed three times as often as for the other accounts, and videos related to self-harm and suicide were shown 12 times as often.

"Young people's feeds are often filled with harmful and distressing content that can have a significant impact on their understanding of their surroundings and their mental health," CCDH CEO Imran Ahmed warned.

A TikTok spokesperson, on the other hand, argued that the CCDH study does not reflect the viewing habits or experiences of real users, which distorts its results. The spokesperson also noted that TikTok's community guidelines prohibit content that promotes suicidal or self-harming behavior or unhealthy eating habits.

However, CCDH researchers found that content about eating disorders has been viewed billions of times on TikTok, and that young TikTok users use coded "jargon" to evade moderation.

The CCDH said that TikTok's algorithm is "deadly by design" and that "we need to stop spreading harmful content to young people's feeds."
