Apple releases new detailed explanatory material on its heavily criticized 'CSAM detection system for iPhone'



In response to a flood of criticism of its newly announced effort to prevent the spread of CSAM (Child Sexual Abuse Material), Apple has released new detailed explanatory material. According to media reports, Apple has determined that the effort has been 'misunderstood.'

Apple to only seek abuse images flagged in multiple nations - iTnews
https://www.itnews.com.au/news/apple-to-only-seek-abuse-images-flagged-in-multiple-nations-568625

Apple Outlines Security and Privacy of CSAM Detection System in New Document - MacRumors
https://www.macrumors.com/2021/08/13/apple-child-safety-features-new-details/

On August 5, 2021, Apple announced that it planned to introduce three types of measures to prevent the spread of CSAM: displaying a warning when sexually explicit images are sent or received in the Messages app, displaying a warning when searching for CSAM-related content in Siri and Search, and scanning images stored in iCloud to check whether they match known CSAM images.

Apple announces it will scan iPhone photos and messages to prevent sexual exploitation of children; the Electronic Frontier Foundation and others protest that 'it will impair user security and privacy' - GIGAZINE



However, of these three measures, the one that scans images stored in iCloud has been criticized as damaging to users' security and privacy. In addition to opposition from both outside and inside the company, criticism that 'by replacing the CSAM image database used for matching with a different one, it could become a system that censors virtually any kind of image' was also widely reported.

It is pointed out that Apple's 'measures to scan iPhone photos and messages' could lead to strengthened surveillance and censorship around the world - GIGAZINE



In response to these voices, Apple has released new 'detailed explanatory material' in addition to its existing official announcement and official FAQ.

Protections Against Attacks and Misuse of Apple's Child Safety Features - Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
(PDF file) https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf



The content of this detailed explanatory material is largely the same as the official announcement already made: the measure in question does not actually scan the photos in the iCloud photo library, but instead matches fingerprints of those photos against the fingerprints of known child pornographic images. Newly disclosed points include that the threshold for triggering visual confirmation by humans is planned to be 30 fingerprint matches, though this threshold may be lowered in the future, and that for each version of the iOS/macOS system supporting this function, a support document including the root hash of the CSAM hash database will be published on Apple's website.
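As a rough illustration of the threshold mechanism described in the material, here is a minimal sketch in Swift. The fingerprint values, data structures, and function names are purely hypothetical assumptions for illustration and do not reflect Apple's actual NeuralHash, blinded database, or private set intersection implementation; only the idea that human review is triggered after a fixed number of matches is taken from the document.

```swift
import Foundation

// Hypothetical known-CSAM fingerprint database, modeled here as a plain set of
// opaque hash strings (Apple's real system uses blinded perceptual hashes that
// the device cannot read directly).
let knownFingerprints: Set<String> = ["a1b2", "c3d4", "e5f6"]

// Human review is only triggered once the number of matches reaches a
// threshold; Apple's published initial figure is 30 matching images.
let reviewThreshold = 30

/// Counts how many of a user's photo fingerprints match the known database.
func countMatches(userFingerprints: [String]) -> Int {
    userFingerprints.filter { knownFingerprints.contains($0) }.count
}

/// Returns true only when the match count reaches the threshold,
/// i.e. when visual confirmation by humans would be performed.
func requiresHumanReview(userFingerprints: [String]) -> Bool {
    countMatches(userFingerprints: userFingerprints) >= reviewThreshold
}

// Example: a library with only two matching fingerprints stays below the
// threshold, so no human review would be triggered.
let sampleLibrary = ["a1b2", "c3d4", "0000", "1111"]
print(countMatches(userFingerprints: sampleLibrary))        // 2
print(requiresHumanReview(userFingerprints: sampleLibrary)) // false
```

The point of the threshold design, as Apple describes it, is that individual matches alone never surface an account for review; only the aggregate count crossing the published threshold does.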

In an interview with The Wall Street Journal conducted on August 13, 2021, Craig Federighi, Apple's senior vice president of software engineering, said that the biggest cause of people's concern was the poor way the company announced the plan. This new material is believed to be aimed at dispelling concerns about issues such as privacy by explaining the details of the plan once again.

in Mobile, Web Service, Posted by darkhorse_log