Stanford Internet Observatory finds that Mastodon is rife with child sexual abuse material



A study by the Stanford Internet Observatory has revealed that child sexual abuse material (CSAM) is rampant on the most popular instances of the decentralized SNS Mastodon.

Addressing Child Exploitation on Federated Social Media | FSI

https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media

Child Safety on Federated Social Media | Stanford Digital Repository
https://doi.org/10.25740/vb515nd6874

Twitter rival Mastodon rife with child-abuse material, study finds - The Washington Post
https://www.washingtonpost.com/politics/2023/07/24/twitter-rival-mastodon-rife-with-child-abuse-material-study-finds/



Stanford researchers find Mastodon has a massive child abuse material problem - The Verge

https://www.theverge.com/2023/7/24/23806093/mastodon-csam-study-decentralized-network

The researchers conducted a two-day survey of the 25 most popular Mastodon instances. The first piece of CSAM turned up just five minutes after the investigation began, and in total more than 600 pieces of CSAM were detected, 112 of them matches to previously known material. Both the known and the newly detected items were flagged by Google's SafeSearch as explicit with a high degree of confidence.
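
For reference, the SafeSearch signal the researchers used is exposed through Google's Cloud Vision API. Below is a minimal sketch of classifying a single downloaded image, assuming the google-cloud-vision Python client is installed and credentials are configured; the file path is a placeholder.

```python
# Minimal sketch: running Google Cloud Vision's SafeSearch detection
# on one image, the same kind of signal the study used to flag media.
# Assumes `pip install google-cloud-vision` and configured credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("downloaded_media.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

annotation = client.safe_search_detection(image=image).safe_search_annotation

# Each field is a Likelihood enum, from UNKNOWN up to VERY_LIKELY.
print("adult:", annotation.adult.name)
print("racy:", annotation.racy.name)
```

Note that SafeSearch returns likelihood scores for general categories such as "adult" and "racy"; identifying known CSAM specifically relies on hash matching against curated databases, as with PhotoDNA below.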

Researcher David Thiel said that the two-day Mastodon survey generated more PhotoDNA hits than any social media analysis he had done before, and that it was not even close.

"This is largely the result of a lack of the tooling that centralized social media platforms use to address child safety concerns," Thiel said.
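
PhotoDNA itself is proprietary and access-restricted, so as a rough illustration of the hash-and-match workflow it implements, the sketch below substitutes the open imagehash perceptual hash; the stored hash value and the distance threshold are invented for illustration, not real values.

```python
# Conceptual sketch of a PhotoDNA-style workflow: compute a perceptual
# hash of an image and compare it against a database of hashes of known
# abusive material. Uses the open `imagehash` library as a stand-in.
from PIL import Image
import imagehash

# Hypothetical database of hashes of known material (illustrative value).
KNOWN_HASHES = {imagehash.hex_to_hash("d1c4f0f0e0c0a080")}
MAX_DISTANCE = 8  # Hamming-distance threshold; tuning is an assumption.

def matches_known_material(path: str) -> bool:
    """Return True if the image is perceptually close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if matches_known_material("downloaded_media.jpg"):  # placeholder path
    print("hit: queue for removal and reporting")
```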

In early July 2023, one of these instances, mastodon.xyz, was temporarily taken offline because of posted CSAM.

Suspension de mastodon.xyz le 5 juillet 2023 - Le blog de Kinrar
https://thekinrar.fr/posts/xyz-suspension/



mastodon.xyz is an instance launched on April 1, 2017. Most of its roughly 24,000 registered users signed up before 2019, and because its administrator, kinrar, struggled to keep up with bots and spam accounts, new registrations were suspended for a long time.

When new registrations reopened in November 2022, moderation work grew from a few cases a week to more than ten. Since kinrar handled moderation in his spare time, it could take a while to act on a report after receiving it.

In June 2023, kinrar received an email from hosting provider Hetzner asking him to respond to an abuse report. The procedure itself was routine; the content in question was an AI-generated 'image of a child in a sexual context'. Kinrar had received other reports about the same account but had not yet processed them, so he deleted the account. However, because the domain registry acted faster than he did, the instance was offline for about 24 hours until kinrar contacted the registry and resolved the issue.

The researchers suggest that decentralized social networks should adopt more robust tools for moderators, along with PhotoDNA integration and CyberTipline reporting.
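
As a rough sketch of what such a moderation hook could look like, the following combines the two checks above into a gate on incoming uploads; every name here (ScanResult, moderate_upload, the report queue) is hypothetical and not part of any real Mastodon API.

```python
# Sketch of the kind of moderation pipeline the researchers recommend:
# scan incoming media, match against known-hash lists, and escalate
# hits into a report queue instead of waiting for user flags.
from dataclasses import dataclass

@dataclass
class ScanResult:
    known_hash_hit: bool      # PhotoDNA-style match (see sketch above)
    classifier_flagged: bool  # SafeSearch-style classifier result

def moderate_upload(media_path: str, scan, report_queue) -> bool:
    """Return True if the upload may be published, False if held."""
    result: ScanResult = scan(media_path)
    if result.known_hash_hit:
        # Known material: block immediately and file a report
        # (in the US, NCMEC's CyberTipline is the reporting channel).
        report_queue.append(("cybertipline_report", media_path))
        return False
    if result.classifier_flagged:
        # Suspected new material: hold for human moderator review.
        report_queue.append(("moderator_review", media_path))
        return False
    return True
```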

in Web Service, Posted by logc_nt