Apple's child pornography countermeasures may actually increase risk, researchers point out



To prevent the spread of child sexual abuse material (CSAM), Apple announced in August 2021 a set of features including the ability to detect CSAM in data stored in iCloud. Privacy experts now point out that the widely criticized feature 'may instead encourage the spread of CSAM.'

Influencing Photo Sharing Decisions on Social Media: A Case of Paradoxical Findings | IEEE Conference Publication | IEEE Xplore

https://ieeexplore.ieee.org/document/9152698

Could Apple's child safety feature backfire? New research shows warnings can increase risky sharing
https://theconversation.com/could-apples-child-safety-feature-backfire-new-research-shows-warnings-can-increase-risky-sharing-167035
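For context, the iCloud detection Apple described is a hash-matching scheme: each photo is fingerprinted and the fingerprint is compared against a database of fingerprints of known CSAM. The Swift sketch below is a deliberately simplified illustration using an exact SHA-256 hash; the function names and the plain-set database are assumptions made for this example, and Apple's actual system reportedly relies on a perceptual 'NeuralHash' with blinded matching rather than anything this direct.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the known-CSAM fingerprint database. Apple's
// reported design ships the database in a blinded, encrypted form so the
// device cannot read the raw hash list; here it is simply an empty set.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

// Fingerprint an image and check it against the database. A cryptographic
// hash such as SHA-256 only catches byte-identical copies; Apple's system
// reportedly used a perceptual "NeuralHash" so that resized or re-encoded
// copies of an image still match.
func matchesKnownDatabase(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(fingerprint)
}

let knownHashes = loadKnownHashDatabase()
let photo = Data("example image bytes".utf8)
print(matchesKnownDatabase(photo, against: knownHashes)) // prints "false" in this sketch
```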

On August 5, 2021, Apple announced child safety measures that would scan data in the Messages app, photos stored in iCloud, and Siri and Search, and warn users attempting to share CSAM. While child protection groups and others voiced support, various parties including the Electronic Frontier Foundation criticized the plan, citing concerns about privacy protection.

Apple announces it will scan iPhone photos and messages to prevent child sexual exploitation, drawing protests from the Electronic Frontier Foundation and others that it 'compromises user security and privacy' --GIGAZINE
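The Messages-side intervention is essentially a pre-share interstitial: if an outgoing photo is flagged, the user sees a warning and must confirm before it is sent. The Swift sketch below illustrates that flow under stated assumptions; flaggedAsSensitive and the prompt text are hypothetical stand-ins, not Apple's API.

```swift
import Foundation

// Hypothetical on-device classifier; Apple's real feature reportedly used
// an on-device model for child accounts. Stubbed to `false` in this sketch.
func flaggedAsSensitive(_ image: Data) -> Bool {
    return false
}

// A pre-share interstitial: warn first, then send only if the user confirms.
// The research discussed in this article suggests exactly this kind of
// prompt can paradoxically make some users more willing to share.
func attemptShare(_ image: Data,
                  confirm: (String) -> Bool,
                  send: (Data) -> Void) {
    if flaggedAsSensitive(image) {
        guard confirm("This photo may be sensitive. Share anyway?") else {
            return // user backed out of sending
        }
    }
    send(image)
}
```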



In response to the strong opposition, Apple announced on September 3 that it would postpone the introduction of the features.

Apple announces postponement of 'function to detect child pornographic images on iPhone' --GIGAZINE



Kurt Hugenberg, a psychologist at Indiana University, and his colleagues argue that the kind of warning Apple planned to display can often backfire. Hugenberg's team had in fact assumed that warnings were an appropriate way to keep inappropriate content from being shared, but an experiment they conducted produced an unexpected result.

Hugenberg and his colleagues showed more than 400 participants a variety of photos and asked questions such as whether they liked each photo and whether they would want to share it with someone. Some groups of participants were additionally asked, 'Considering the privacy of the person in the photo, would you still want to share it with someone?'

The result was that the group that received the 'consider privacy' prompt was more likely to want to share the photos than the group that did not. Combined with questionnaire responses, the researchers concluded that 'people who are warned to consider privacy come to perceive their relationship with the person in the photo as weaker, and apparently become less concerned about endangering the privacy of others.'



From this experiment, Hugenberg and his colleagues say that 'some people may become more likely to share content with someone when warned by Apple.' They point out that this applies especially to adolescents, who often share sexually explicit photos because they want to be seen as 'cool' by their peers or want to rebel against their parents.

Hugenberg and his colleagues warn, 'It is possible that Apple's warnings will be received as a badge of honor. Like forbidden fruit, there is a danger that the more you warn, the more attractive risky sharing becomes.'



in Mobile, Security, Posted by log1p_kr