Experts condemn EU plan to 'scan encrypted private messages' to prevent child sexual abuse
Child sexual abuse online has become a major problem in recent years, but how to detect such material is a contentious question. Privacy experts have harshly criticized a new regulation proposed in May 2022 by the European Commission, the EU's executive body, under which service providers would be obliged to scan users' private messages when ordered to by public authorities, denouncing it as a 'shameful general surveillance law,' 'the most terrifying thing I've ever seen,' and a 'declaration of war upon end-to-end encryption.'
New EU rules would require chat apps to scan private messages for child abuse - The Verge
https://www.theverge.com/2022/5/11/23066683/eu-child-abuse-grooming-scanning-messaging-apps-break-encryption-fears
“War upon end-to-end encryption”: EU wants Big Tech to scan private messages | Ars Technica
https://arstechnica.com/tech-policy/2022/05/war-upon-end-to-end-encryption-eu-wants-big-tech-to-scan-private-messages/
The EU Commission is planning automatic CSAM scanning of your private communication – or total surveillance in the name of child protection.
https://tutanota.com/blog/posts/eu-surveillance-csam/
On May 11, 2022, the European Commission proposed new EU legislation to prevent child sexual abuse online. The regulation would require service providers such as social media platforms and messaging apps to detect, report, and remove child sexual abuse material (CSAM) on their services when ordered to by public authorities, meaning providers would need to build the technology to detect such content.
Fighting child sexual abuse
https://ec.europa.eu/commission/presscorner/detail/en/ip_22_2976
In its announcement, the European Commission pointed out that 85 million photos and videos depicting child sexual abuse were reported in 2021 alone, a figure that rose further during the COVID-19 pandemic. It argues that the voluntary self-regulation companies have practiced so far is not enough, and that clear rules with strict conditions and safeguards are needed.
Preventing child sexual abuse is of course an important goal, but privacy experts have strongly criticized the new rules the EU aims to introduce. A Q&A about the bill published by the European Commission makes clear that because the exchange of CSAM and grooming (approaching and manipulating minors in order to sexually abuse them) also take place over encrypted communications and services, providers would be obliged to detect such activity even in communications protected by encryption technology. Ars Technica pointed out that the bill essentially orders companies to break end-to-end encryption by whatever technical means necessary, in other words that the European Commission wants to render it ineffective.
Alec Muffett, who led the rollout of end-to-end encryption for Facebook Messenger, described the EU bill as a 'declaration of war upon end-to-end encryption' that 'demands access to every person's private messages on any platform in the name of protecting children.'
Good Morning! In case you missed it, today is the day that the European Union declares war upon end-to-end #encryption, and demands access to every persons private messages on any platform in the name of protecting children: https://t.co/6nbypgA9ct
— Alec Muffett (@AlecMuffett) May 11, 2022
The Verge explained that under the regulation, communication services such as WhatsApp, Signal, and Facebook Messenger that receive a 'detection order' from an EU member state would have to scan the messages and photos of some or all of their users, looking for evidence of known or newly produced child sexual abuse material as well as grooming, and that an AI-based scanning system would be needed to accomplish this. In 2021, Apple drew accusations of undermining privacy when it announced a system for detecting child sexual abuse imagery on the iPhone, but Apple's system focuses solely on detecting known material by fingerprint matching (sketched below). The European Commission's bill, which also covers unknown content and grooming, would therefore have a far broader reach.
90 human rights groups release an open letter protesting the scanning of iPhone photos and messages, fearing that children's rights will be violated - GIGAZINE
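For context on the distinction above: known material is typically detected by comparing a perceptual hash (fingerprint) of each image against a list of hashes of previously identified material, the principle behind Microsoft's PhotoDNA and Apple's NeuralHash. The following is a minimal, purely illustrative Python sketch of that matching step using a toy average-hash; the function names and threshold are this article's assumptions, not any real system's API.

```python
def average_hash(pixels):
    """Toy 64-bit perceptual hash of an 8x8 grayscale grid (values 0-255):
    each bit records whether that pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes; a small distance means
    visually similar images, so re-encoding or resizing still matches."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash, blocklist, threshold=5):
    """Flag an image whose hash is within `threshold` bits of any known
    fingerprint. Note what this cannot do: detect new, never-before-seen
    content or grooming text, which is why the EU bill implies classifiers."""
    return any(hamming_distance(image_hash, h) <= threshold for h in blocklist)
```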
Matthew Green, a cryptographer and associate professor of computer science at Johns Hopkins University in the United States, said, 'This document is the most terrifying thing I've ever seen. It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM (child sexual abuse material), but to detect "grooming".' In a subsequent tweet, he went so far as to call it 'the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR,' adding, 'Not an exaggeration.'
This document is the most terrifying thing I've ever seen. It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM, but to detect “grooming”. Read for yourself. pic.twitter.com/iYkRccq9ZP
— Matthew Green (@matthew_d_green) May 10, 2022
Jan Penfrat of European Digital Rights, a Belgium-based international digital rights advocacy group, called the leaked draft the European Commission's 'we-will-break-into-everyone's-private-chats' law, adding, 'This looks like a shameful general surveillance law entirely unfitting for any free democracy.'
Here is the first leak I've seen from the Commission's upcoming 'we-will-break-into-everyone's-private-chats' law. #CSAM
— Jan Penfrat (@ilumium) May 10, 2022
This looks like a shameful general #surveillance law entirely unfitting for any free democracy. https://t.co/QF07WzSG0v
Behind the fierce criticism from experts lies the problem that a system able to detect child sexual abuse material in every private message can just as easily be applied to content that has nothing to do with child sexual abuse.
'If the law obliges communication providers to implement client-side scanning, then in theory that tool can detect anything,' said Matthias Pfau, co-founder of the end-to-end encrypted email service Tutanota. In other words, tools introduced to detect CSAM and grooming could easily be repurposed to hunt for 'terrorists,' 'human traffickers,' 'drug dealers,' and so on. Pfau also warns that in some authoritarian states the targets could expand further, to 'opposition politicians' and 'dissident journalists.'
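Pfau's point can be made concrete: a client-side scanner's matching logic is indifferent to what it is told to look for, so the stated purpose lives entirely in a replaceable configuration list. Below is a minimal hypothetical sketch; all terms and names are invented for illustration, not taken from any real scanner.

```python
def scan_outgoing_message(text: str, targeted_terms: set[str]) -> bool:
    """Return True (i.e. report the message) if it contains any targeted
    term. The logic is identical whether the term list holds grooming
    phrases or political speech; only the configuration differs."""
    words = set(text.lower().split())
    return not words.isdisjoint(targeted_terms)

# The stated purpose is just a configuration choice:
grooming_terms = {"example_grooming_phrase"}         # the law's stated target
repurposed_terms = {"protest_meetup", "leaked_doc"}  # a regime's target list

message = "see you at the protest_meetup tonight"
print(scan_outgoing_message(message, grooming_terms))    # False
print(scan_outgoing_message(message, repurposed_terms))  # True
```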
Pfau further points to what he calls an important issue completely ignored by the European Commission: even a backdoor built for noble purposes can be abused by malicious attackers. He notes that someone who hacked a provider's scanning system could plant CSAM or grooming language on the device of a person they wanted to discredit, or steal the scanned data and exploit it in cyberattacks. 'In the end, it must be clear to all of us that a "backdoor only for the good guys" is impossible,' he said.
Users on the social news site Hacker News also raised the concern that the definition of child sexual abuse material is so broad and ambiguous that a scanning system cannot take proper context into account. A parent's photo of their own small child may show nudity, and the parent obviously does not keep it as abuse material, but it is difficult for an AI system to make that judgment, so such photos could be uniformly flagged and reported as CSAM, as the toy sketch after the link below illustrates.
It's scary, because CSAM (child sexual abuse material) is very, very broad, and ... | Hacker News
https://news.ycombinator.com/item?id=31352691
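The concern can be restated as a thresholding problem: a classifier emits only a score, with no notion of who took a photo or why, and the reporting decision reduces to a cutoff. A deliberately trivial sketch; the score, threshold, and names are all hypothetical.

```python
REPORT_THRESHOLD = 0.8  # hypothetical cutoff chosen by the scanning system

def decide(photo_description: str, nudity_score: float) -> str:
    # A parent's bath-time photo and genuinely abusive material can both
    # clear the threshold; the score alone cannot distinguish them.
    return "REPORT" if nudity_score >= REPORT_THRESHOLD else "ignore"

print(decide("parent's photo of toddler in the bath", 0.91))  # REPORT
print(decide("family photo at the beach", 0.42))              # ignore
```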
in Software, Web Service, Security, Posted by log1h_ik