Image bookmarking service 'Pinterest' employs moderators who screen for child pornography day and night
'A Permanent Nightmare': Pinterest Moderators Fight to Keep Horrifying Content Off the Platform | by Sarah Emerson | Jul, 2020 | OneZero
According to the daily newspaper The New York Times, Pinterest, launched in 2010, first gained traction not among teenagers but among women and home designers in the American Midwest. Pinterest CEO Ben Silbermann set the goal of making Pinterest a safe, happy place for inspiration, self-improvement, and recipes for salted caramel cookies.
Aiming to be a people-friendly social media service, Pinterest has aggressively cracked down on misinformation to protect the health of the platform.
In addition, Pinterest, which also hosts wedding-planning content, drew attention when it changed policy to stop running advertisements that present former slave plantations in the United States as wonderful wedding venues.
A major problem for Pinterest is that large numbers of racist images and images amounting to child pornography are pinned. In some cases a problematic pin is merely 'excluded from search results instead of deleted,' in which case it can still surface in Google search results. Pinterest has therefore tried to keep the site clean by introducing automated detection technology for child pornographic images.
However, guidelines and algorithms alone are not the solution. Human moderators also work alongside the algorithms at Pinterest. Moderators decide how to handle offending pins, investigate suspicious users and their networks, and review content that is difficult to judge.
According to a former moderator, moderation work at Pinterest let moderators choose what categories to moderate, such as pornography, hate speech, or violent imagery. Moderators review images flagged by the detection algorithm or reported by Pinterest users, choose whether to hide or delete each pin, and decide on penalties for the user who pinned it. A moderator told OneZero that, as of 2018, each moderator might view up to 8,000 images per day.
For example, a board named 'Summer Season' contained hundreds of images of girls in bikinis. Moderators had to determine whether such images were 'family photos' or 'photos taken by a pedophile.'
In another case, a board was reported for being pinned with many bloody images. It turned out to be a board where a medical student had pinned photos of surgery to prepare for an exam. Judged in context, the usage had nothing to do with crime and was allowed, but the moderator set the board to be hidden from other users.
Moderators are often traumatized because the work requires viewing large amounts of child pornography and violent imagery. This is not limited to Pinterest: the mental health toll on moderators who review pornographic and violent videos on YouTube has also become a problem.
Health hazards faced by moderation staff who check violent videos on YouTube are a problem - GIGAZINE
Pinterest employees, including moderators, receive free food and alcohol, comfortable quiet rooms, free arcade games, and other perks. Even so, many moderators reportedly fell into the gap between the grueling work and the cheerful benefits. 'Being a moderator isn't a long-term job,' a former Pinterest moderator told OneZero, adding that Pinterest wasn't doing enough to support her mental health.
Pinterest also appears to be working to address moderators' mental health problems: a spokesperson said the company is improving its machine learning models for certain categories. In addition, moderators can receive a 30-minute counseling session with a therapist once every six weeks, and the company covers a monthly fee for on-site massage.
A Pinterest spokesperson said, 'At Pinterest, moderators undergo a variety of specialized training, because we can't expect anyone to immediately recognize hurtful words or hate symbols. If moderators find themselves struggling with a particular subject, they can turn off that workflow, and if they find it hard to keep working at all, we work with them on things like assigning non-content-focused projects and giving them time to find new positions on or off Pinterest.'
The former moderators say they bear no ill will toward Pinterest, but they claim that 'Pinterest undervalued them.' One former moderator said, 'For me, the problem is the sheer size of Pinterest and Pinterest's DNA itself. As things get bigger, the goal as a product becomes "get everyone participating and monetize." If that's the case, the discovery platform Pinterest will be a nightmare forever.'
in Web Service, Posted by log1i_yk