Apple adds the ability to scan messages for nudity on the iPhone, protecting children without invading privacy



In August 2021, Apple announced a set of features that scan iPhone photos and messages to prevent child sexual exploitation. Although some of the features were protested over privacy issues, the 'communication safety in Messages' feature was introduced to iOS in December, in the United States only. According to a report by The Guardian on April 20, 2022, this child-safety feature will soon come to iPhones in the UK and is expected to expand further in the future.

Apple to roll out child safety feature that scans messages for nudity to UK iPhones | Apple | The Guardian
https://www.theguardian.com/technology/2022/apr/20/apple-says-new-child-safety-feature-to-be-rolled-out-for-uk-iphones

About communication safety in Messages - Apple Support
https://support.apple.com/en-us/HT212850

Apple announced 'Expanded Protections for Children' on August 5, 2021 local time, listing three items: showing a warning when child sexual abuse material (CSAM) is exchanged in the Messages app, inspecting data stored in iCloud Photos to detect CSAM, and enabling reporting of CSAM from Siri and Search. However, the plan was criticized on the grounds that checking the contents of users' iCloud Photos violates their privacy; voices of concern were reportedly raised within the company, and 90 human rights groups released a letter of protest. In September of the same year, Apple announced that it would postpone the release of the new features because of the large backlash.

Apple announces it will scan photos and messages on the iPhone to prevent sexual exploitation of children; the Electronic Frontier Foundation and others protest that it 'compromises user security and privacy' - GIGAZINE



With the iOS 15.2 update on December 13, 2021, the 'communication safety in Messages' feature was introduced for iPhones in the United States only. In response to the many criticisms, however, Apple redesigned it: the feature is off by default and must be enabled from settings; it can be enabled only for children's iPhones within a Family Sharing group; and the analysis of Messages attachments to detect nudity is performed only on the iPhone, so that Apple cannot access the photos. With these changes, Apple claims to have solved the privacy problem.

When the 'communication safety in Messages' feature is turned on and a nude image is detected in a message, the image is blurred so it cannot be seen, and the child is shown options for getting help, such as contacting someone they trust or blocking the contact.
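Apple has not published the code behind this screen, but the described behavior maps naturally onto a small SwiftUI view. The sketch below is purely illustrative: the view, its `isFlaggedAsNudity` input, and the option labels are assumptions, not Apple's implementation.

```swift
import SwiftUI

// Hypothetical sketch of the behavior described above: a flagged photo
// stays blurred until the child explicitly chooses to view it, with
// help options shown alongside. Not Apple's actual implementation.
struct SensitiveImageView: View {
    let photo: Image
    let isFlaggedAsNudity: Bool   // assumed result of the on-device check
    @State private var revealed = false

    var body: some View {
        VStack(spacing: 12) {
            photo
                .resizable()
                .scaledToFit()
                // Blur the photo while it is flagged and not yet revealed.
                .blur(radius: isFlaggedAsNudity && !revealed ? 40 : 0)

            if isFlaggedAsNudity && !revealed {
                Text("This photo may be sensitive.")
                // The options the article describes: ask for help or block.
                Button("Message someone you trust") { /* open a help flow */ }
                Button("Block this contact") { /* block the sender */ }
                Button("View photo") { revealed = true }
            }
        }
    }
}
```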



Even if the child chooses to send or receive the nudity after seeing the warning, the feature asks again whether they really want to do so, and displays guidance toward a call or other help for cases where the child is being pressured into it. The detection of these nude images and the display of the warnings happen only on the iPhone, using machine learning, and the contents of the images and messages are never transmitted to Apple or to the parent's iPhone.
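The classifier inside Messages is not documented, but Apple later exposed comparable on-device analysis to third-party apps through the SensitiveContentAnalysis framework (iOS 17 and later, gated behind the com.apple.developer.sensitivecontentanalysis.client entitlement). As a minimal sketch of what such an on-device check looks like, assuming that public framework rather than the Messages-internal code:

```swift
import SensitiveContentAnalysis

// Minimal sketch of an on-device nudity check using the public
// SensitiveContentAnalysis framework (iOS 17+), which postdates the
// feature described in this article but follows the same design:
// the image never leaves the device.
func imageContainsNudity(at url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer is inactive unless the user has enabled Sensitive
    // Content Warnings (or Communication Safety) in Settings.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Analysis runs entirely on the device; nothing is uploaded.
    let result = try await analyzer.analyzeImage(at: url)
    return result.isSensitive
}
```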



The Guardian reported on April 20, 2022 that this feature will be introduced not only in the United States but also on iPhones in the United Kingdom. According to a statement Apple sent to The Guardian, 'the feature is designed so that detections of nudity never leave the device. Apple has no access to the messages, and no notifications are sent to parents or anyone else.' The Guardian also confirmed that the originally planned function of notifying a child's parents when the child chooses to send or receive nudes is not included in the latest update. Apple says the feature will roll out as part of a software update in the coming weeks and will be available in the Messages app on iPhones, iPads and Macs.

in Mobile, Security, Posted by log1e_dh