Why is the 'Chat Control' law, which would scan all private chats and files at the government's direction, dangerous and ineffective?



The EU is pushing ahead with a proposed child sexual abuse regulation (the so-called 'Chat Control' law), which would require all communications and files to be scanned in the name of protecting children. Privacy Guides, a privacy- and security-focused publication, explains why the Chat Control law is dangerous and ineffective at protecting children.

Chat Control Must Be Stopped, Act Now! - Privacy Guides

https://www.privacyguides.org/articles/2025/09/08/chat-control-must-be-stopped/



◆What is the Chat Control law?
The Chat Control law is a legislative package that would require all service providers, including text messaging, email, social media, cloud storage, and hosting services, to scan all communications and files, including those that are end-to-end encrypted, for what the government deems 'abusive content.'

At the time of writing, the Chat Control law is intended to detect child sexual abuse material (CSAM). However, if the law passes and all communications and files are scanned, there is a risk that its scope will later expand beyond CSAM to other crimes and even political speech. Furthermore, the fact that all data is collected and scanned creates the risk that personal data could fall into the hands of criminals or be leaked online through hacking or other means.

Chat control legislation has been proposed in the EU several times before and has repeatedly come close to passing. With a final vote in the Council of the EU scheduled for October 14, 2025, and national governments set to decide their positions on September 12, the threat of chat control legislation passing is once again looming.



◆What are the dangers if the Chat Control law is passed?
Privacy Guides outlines the risks that would arise if chat control laws were passed.

1. Breaking end-to-end encryption
Scanning all communications means that secure, private communication is no longer possible, leaving vulnerable people, crime victims, whistleblowers, journalists, activists, and everyone else who exchanges confidential files and messages unprotected.

2. Expanding the scope of censorship
Once a large-scale censorship system is in place, authorities could expand its scope to detect not only CSAM but also drug use, participation in protests, anti-government political activity, negative comments about leaders, etc. Europol, the EU's law enforcement agency, has already called for an expansion of chat control programs.

3. Criminal attacks
For chat control laws to work, providers would need a backdoor that allows all communications to be scanned. And where a backdoor exists, it is almost certain that sophisticated criminals will eventually be able to access it and steal information. Criminals could target not only each service individually but also the entire database held by the authorities, which would of course include the very CSAM that scanning had filtered out, including files exchanged between teenagers and their romantic partners. As a result, introducing chat control laws could actually help criminals efficiently collect sexual images of children.

4. Risk of false positives
Operating such a large-scale, opaque scanning system inevitably produces a certain number of false positives. Even if AI-based detection technology were highly accurate, the Chat Control law would cover the entire EU population of approximately 450 million people, so even a small false-positive rate could result in a huge number of people being wrongly labeled as pedophiles. For example, parents who save photos of their children bathing to the cloud, mothers who ask relatives for pictures of breastfeeding techniques, and teenagers who consensually exchange photos with their partners could all be investigated as sexual predators.



5. A severe lack of resources
The amount of content flagged across such a system would be staggering, inevitably stretching the resources of the agencies tasked with investigating it. Because that content would also contain a significant number of false positives, valuable investigator time would be consumed, potentially preventing investigators from pursuing actual sexual abuse cases.

6. Victims are less likely to seek help
When a mass scanning system is in operation, victims of offline sexual abuse may hesitate to come forward, feeling that 'someone will scan my messages.' Furthermore, if a victim or witness sends evidence of sexual abuse to someone, they may be flagged and risk being treated as the perpetrator rather than the victim. 'Unfortunately, many people will decide it's safer not to report,' Privacy Guides said.

7. Self-censorship becomes more common
If it becomes known that all messages are scanned, not only those involved in child sexual abuse but everyone else may feel that 'my messages might also be seen' and be discouraged from seeking help or disclosing their experiences. This could particularly harm people in marginalized groups, such as LGBTQ+ people, as well as victims of crime.

8. Weakening democracy
Large-scale censorship systems allow governments to spy on the opposition. State surveillance of political opponents is already practiced in some countries and has many historical precedents, so it is entirely possible that EU governments would step up surveillance of dissidents, political activists, and journalists in the future. Once chat control is in place, authorities could extend censorship simply by expanding the scope of content being detected.

9. Violating the EU General Data Protection Regulation (GDPR) and other laws
While the GDPR and other privacy laws have strengthened data protection for everyone living in the EU, chat control laws could render them useless.



◆Do chat control laws help protect children?
In response to the question, 'Will chat control laws protect children?' Privacy Guides responded, 'No. We cannot emphasize this enough. Far from protecting children, these regulations will harm everyone around the world. Anyone who claims otherwise is naive or misinformed.'

The first major problem is that large-scale AI-based scanning systems inevitably produce false positives. Even if a magical system could detect CSAM with 99% accuracy, given the EU's total population of approximately 450 million, a 1% false-positive rate would negatively affect millions of people across the EU. Moreover, according to a report by the Swiss Federal Police, approximately 80% of reports from automated reporting systems were false positives, meaning the real false-positive rate could be in the tens of percent, not just 1%.
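The base-rate arithmetic above can be sketched as a rough back-of-the-envelope calculation. The 450 million population, 99% accuracy, and 80% false-report figures come from the paragraph above; the annual report volume is a purely illustrative assumption:

```python
# Back-of-the-envelope estimate of false positives under mass scanning.
# Figures are taken from the article or assumed for illustration.

eu_population = 450_000_000  # approximate EU population

# Optimistic scenario: a hypothetical detector with 99% accuracy,
# i.e. a 1% false-positive rate, applied once per person.
fpr_optimistic = 0.01
wrongly_flagged = int(eu_population * fpr_optimistic)
print(wrongly_flagged)  # 4.5 million people wrongly flagged

# Swiss Federal Police figure: ~80% of automated reports were false positives.
# Assuming, purely for illustration, 1 million automated reports per year:
reports_per_year = 1_000_000
false_reports = int(reports_per_year * 0.80)
print(false_reports)  # 800,000 reports consuming investigator time
```

Even under the optimistic 1% assumption, the sheer size of the scanned population turns a "small" error rate into millions of wrongly flagged people, which is the base-rate problem the article describes.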

Additionally, in Germany, of the roughly 20% of reports that are valid, over 40% are directed at children themselves, including teenage couples who consensually send selfies to each other. If these exchanges between children are flagged and investigated, photos that would otherwise have stayed private could end up being viewed by multiple unrelated investigators. 'The number of children harmed by chat controls, likely resulting in lifelong trauma, would be catastrophic,' Privacy Guides warned.

The Chat Control law would also create systems that collect private data, potentially making those systems and their databases targets for criminals. At the time of writing, large-scale data breaches at governments and private companies around the world occur almost every year, highlighting how difficult it is to fully protect private data online while sharing only the data necessary for investigations with law enforcement.

An even bigger problem is that the majority of child sexual abuse is committed not by unknown third parties but by adults close to the child who have offline contact with them. It is also known that two-thirds of CSAM is created in the child's own home, and chat control laws would do nothing to prevent this. In fact, there is a risk that victimized children themselves would be placed under monitoring, preventing them from seeking outside help.



◆Chat control laws affect people outside the EU too
If the Chat Control law passes in the final vote on October 14, the outcome will affect people all over the world, including those outside the EU. Just as the GDPR has affected companies around the world, including in Japan, and improved privacy protection efforts, the Chat Control law could have the opposite effect.

Because end-to-end encryption only works if both ends are protected, if someone outside the EU communicates with someone inside the EU, the messages and files they send will also go unprotected. This could force companies outside the EU to remove privacy features, weaken encryption, or even terminate services related to Europe. Furthermore, given reports that the Five Eyes countries (the United States, the United Kingdom, Canada, Australia, and New Zealand) also favor chat control laws, this could open the door to mass scanning systems spreading beyond the EU.

Fight Chat Control, an organization working to stop chat control legislation, has created a tool that allows people living in the EU to contact their government officials and voice their opposition.

Fight Chat Control - Protect Digital Privacy in the EU
https://fightchatcontrol.eu/#contact-tool

Privacy Guides also emphasized that even those outside the EU can actively discuss chat control legislation on social media to raise awareness of its dangers and campaign against it. 'We need your help in this fight. To protect democracy, privacy, and all human rights, we cannot afford to lose this battle,' Privacy Guides said.

・Related article:
A site developed by a single engineer that sends mass emails to lawmakers and other stakeholders is pushing back against the EU chat control law - GIGAZINE



in Note, Posted by log1h_ik