TikTok moderator sues the company over being made to watch videos of slaughter, gunfire, rape, and more for over 10 hours a day with no mental health care



A former moderator who reviewed content posted to TikTok has sued ByteDance, the company that operates TikTok, alleging that its safeguards for moderators are inadequate given the harshness of the work.

TikTok's Content Moderator Sues Platform for Graphic Video Trauma - Bloomberg
https://www.bloomberg.com/news/articles/2021-12-24/tiktok-sued-by-content-moderator-traumatized-by-graphic-videos

TikTok sued by former content moderator for allegedly failing to protect her mental health - The Verge
https://www.theverge.com/2021/12/24/22852817/tiktok-content-moderation-lawsuit-candie-frazier

Moderator suing TikTok over PTSD after watching traumatic videos | Metro News
https://metro.co.uk/2021/12/27/moderator-suing-tiktok-over-ptsd-after-watching-traumatic-videos-15828437/

On average, fewer than 100 million videos are posted to TikTok each day, covering not only the food, dance, and cosmetics content that many people enjoy, but also 'inappropriate' videos. Moderators review this content and either delete it or hide it from users. According to TikTok's transparency report for the second quarter of 2021 (April to June), 81,518,334 videos were removed during that period; 93% of them were removed within 24 hours of posting, and 94.1% were removed before any user reported them.

TikTok 2nd Quarter 2021 Community Guidelines Implementation Report | TikTok
https://www.tiktok.com/safety/resources/tiktok-transparency-report-2021-q-2?lang=ja



TikTok has also introduced a system that automatically removes violating content, and 16,957,950 of those total removals are said to have been handled by this system. Put the other way around, that means human moderators reviewed and removed nearly 65 million videos (81,518,334 minus 16,957,950, or 64,560,384).

According to Candie Frazier, the former moderator who sued ByteDance, moderators work 12-hour shifts, breaks included. They review hundreds of videos a day, with only about 25 seconds allotted per video, so they reportedly watched 3 to 10 videos at the same time.

The videos to be reviewed include gunfire, child sexual abuse, animal mutilation, slaughter, and more. Frazier says she suffers from trauma as a result, and is demanding that ByteDance provide more frequent breaks and psychological support, as well as measures such as lowering the resolution of videos and blurring them during review.

The Joseph Saveri Law Firm in California, which is involved in this lawsuit, handled a similar case concerning Facebook moderators in 2018; in that case, Facebook paid the moderators a $52 million settlement.

Facebook agrees to pay a settlement of over 5.5 billion yen to subcontractors who developed PTSD from monitoring harmful content - GIGAZINE



in Web Service, Posted by logc_nt