Instagram is testing a feature to filter out unwanted nude photos



Instagram is reportedly testing a feature that checks whether images attached to messages could be nude photos. This may offer protection against unpleasant harassment, such as being sent unsolicited nude photos.

Instagram's finally working on protecting users from unsolicited nude photos - The Verge
https://www.theverge.com/2022/9/21/23365079/instagram-meta-cyberflashing

The feature under test was discovered by Alessandro Paluzzi, a developer known for leaks and reverse engineering. On Twitter, he published a screenshot of what appears to be an Instagram app screen explaining the 'Nudity protection' feature.



According to the image posted by Paluzzi, Nudity protection is a feature that uses on-device technology to check whether an image included in a chat is a nude photo; if the image might be one, it is obscured with a blur. The 'technology' in question refers to the iOS feature that uses AI to detect nudity.

Ability to scan messages for nudity added on iPhone, protecting children without infringing privacy - GIGAZINE



It is emphasized that Instagram does not have access to the photos, as the checking is done by iOS itself. The feature can also be switched on and off at any time from the settings, and even when it is on, the user can still choose whether or not to view a flagged photo.
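The behavior described above can be summarized as a small piece of decision logic: detection happens on the device, the feature is an opt-in/opt-out setting, and a flagged photo stays blurred until the user chooses to view it. The sketch below is purely illustrative; the function names, data structure, and threshold are assumptions, not Instagram's actual implementation.

```python
# Hypothetical sketch of the client-side blur decision described in the
# article. The classifier score is assumed to come from an on-device
# model (e.g. the iOS nudity-detection feature); Instagram's servers
# never see the photo. All names and the 0.8 cutoff are assumptions.
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8  # assumed cutoff for "possibly a nude photo"

@dataclass
class IncomingPhoto:
    nudity_score: float     # probability from the on-device classifier
    revealed: bool = False  # user tapped "view photo anyway"

def should_blur(photo: IncomingPhoto, protection_enabled: bool) -> bool:
    """Blur only when the feature is switched on, the on-device score
    is above the threshold, and the user has not opted to view it."""
    if not protection_enabled or photo.revealed:
        return False
    return photo.nudity_score >= NUDITY_THRESHOLD
```

In this sketch, flipping `protection_enabled` off in settings disables blurring entirely, and setting `revealed` on a single photo models the per-photo "view anyway" choice the article mentions.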

Meta, which runs Instagram, confirmed to IT news site The Verge that the leaked image is of a feature under development. Meta says the feature, which is still in its early stages of development, will allow users to automatically filter direct messages containing objectionable content, such as nude photos.

Regarding the new feature under development, Meta spokesperson Liz Fernandez said the company is 'working closely with experts to ensure these new features preserve people's privacy, while giving them control over the messages they receive.'

Instagram's existing anti-harassment measures reportedly fail to address 90% of abusive images sent to female celebrities, and do not even fully filter out slurs like 'b*tch.' Another study found that 33% of women under the age of 35 have been sexually harassed online. There is therefore demand for new features to prevent harassment via offensive images.

in Web Service, Posted by log1l_ks