TikTok Moderator Testifies He Was Trained On Child Sexual Abuse Videos



A former employee of Teleperformance, a third-party company that moderates TikTok content, testified that he was exposed to child sexual abuse material as part of his job, and that hundreds of people had access to materials containing such content. Experts are troubled by this, as the law requires exposure to such material to be kept to a minimum.

TikTok moderators say they were trained with child sexual abuse content - The Verge

https://www.theverge.com/2022/8/5/23294017/tiktok-teleperformance-employees-shown-csam-moderation-report

A former Teleperformance employee claims to have been asked to review a spreadsheet called 'Daily Required Reading' (DRR), which describes TikTok's moderation criteria. The spreadsheet contained hundreds of images that violated TikTok's guidelines, including images of naked and abused children. In addition, hundreds of TikTok and Teleperformance employees were able to access this content both inside and outside the office, raising concerns that the material could leak.

Such child sexual abuse material (CSAM) is illegal, and strict rules govern how it must be handled when found. Anyone who finds CSAM must remove it immediately and report it to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization that protects children.

Although the law allows companies that report to NCMEC to retain the material for 90 days in order to provide information to law enforcement, federal law requires companies to keep the number of employees with access to the content to a minimum, store the material in a secure location, and permanently destroy it upon request by law enforcement.

However, the alleged practices go far beyond those limits, and one employee reportedly asked the FBI whether they constituted 'a criminal act that spreads CSAM.'



According to the IT news site The Verge, TikTok spokesperson Jamie Favazza denied in an interview with Forbes that the company had shown sexually exploitative content to employees, but also said that TikTok could not confirm whether the same was true for all of its third-party partners.

"Content of this nature is abhorrent and should not exist on or off our platform," Favazza said, adding that TikTok's training materials have strict access restrictions, do not contain visual examples of CSAM, and that a dedicated child safety team investigates such content and reports it to NCMEC.


