Report: Instagram recommended large amounts of sexual content to accounts created to follow ``young gymnasts'' and ``preteen influencers''



Instagram prioritizes topics that users are interested in, such as sports, fashion, beauty, and music. As a result, if you search for and watch only videos of a specific genre, you tend to see only videos related to that genre. In an independent test conducted by the Wall Street Journal and the Canadian Centre for Child Protection, it was reported that creating ``accounts that appeared interested in child sexual content'' resulted in a mix of sexual posts and advertisements for major brands being displayed.

Instagram's Algorithm Delivers Toxic Video Mix to Adults Who Follow Children - WSJ
https://www.wsj.com/tech/meta-instagram-video-algorithm-children-adult-sexual-content-72874155



Instagram Reels served 'risqué footage of children' next to ads for major companies: report
https://nypost.com/2023/11/27/business/instagram-reels-served-risque-footage-of-children-next-to-ads-for-major-companies-report/

The Wall Street Journal reported in June 2023 that both Facebook and Instagram, which are operated by Meta, ``have algorithms that create large communities of users interested in pedophilic content.'' In particular, it was pointed out that on Instagram, the algorithm that recommends accounts related to a user's genre of interest has the effect of guiding users interested in pedophilia toward accounts that sell and share illegal sexual content such as child pornography.

It turns out that Instagram's algorithm recommends ``networks that sell children's sexual content'' to users - GIGAZINE



In response to the report, a Meta spokesperson said, ``A newly established task force has expanded our systems to automatically detect users behaving suspiciously, and tens of thousands of such accounts are removed every month. Inappropriate content is not prevalent on Instagram, and Meta is making significant investments to reduce it.''

In a new test, the Wall Street Journal examined how Instagram's algorithm responds to ``users interested in child sexual content.'' First, the Wall Street Journal created multiple accounts that followed only ``young gymnasts and cheerleaders'' and ``preteen influencers'' in order to simulate such users. As a result, it was confirmed that on all of the created accounts, Instagram automatically served content such as ``risqué videos of minors'' and ``overtly sexual posts,'' with advertisements displayed alongside them.



On the test accounts, advertisements for Disney, Pizza Hut, Walmart, and other companies were displayed alongside the inappropriate content. When the Wall Street Journal reported the test results to Disney, Charlie Cain, Disney's brand manager, said, ``Disney has strict limits on the social media content that is allowed next to its advertising, and we have been calling on Meta and other platforms to improve their brand safety features. Since the Wall Street Journal's findings, we have asked Meta to address this issue at its highest level.'' The Wall Street Journal also contacted Pizza Hut, but Pizza Hut declined to comment.

The Wall Street Journal had notified Meta of the test results as of August 2023, but Meta responded that the test accounts created by the Wall Street Journal ``do not reflect what real users are actually seeing,'' calling the results ``a manufactured experience'' rather than a reflection of reality. The Wall Street Journal points out that ``Meta has not provided a timeline for resolving the issue or explained how its ability to recommend inappropriate content about children will be limited in the future.''

in Software, Posted by log1e_dh