Surveillance tools that monitor children's behavior to prevent suicide can help, but they also have unintended consequences



Some governments install monitoring software on mobile devices distributed to children for educational purposes. Proponents say the software can help identify and support children at risk, but the New York Times has compiled several examples of problems that users do not anticipate.

The Brave New World of AI-Powered Self-Harm Alerts - The New York Times

https://www.nytimes.com/2024/12/09/health/suicide-monitoring-software-schools.html



Does spying on laptops really prevent high school suicides?

https://reason.com/2024/12/09/does-spying-on-laptops-really-prevent-high-school-suicides/

The New York Times reported on software installed in a Missouri school district that scrutinizes what children type into their devices and flags problematic words. A girl who wrote a poem was mistakenly flagged as being at risk of self-harm and reported to the police, who believed she might actually hurt herself.

The misunderstanding was quickly cleared up, but the child was extremely upset and 'traumatized,' the report said. There are plenty of 'false positives' like this one: a hunting report, a historical study of the Ku Klux Klan, and even a quote from an Oscar Wilde play have all been mistakenly flagged.
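
The vendors' exact detection logic is not public, but the pattern of errors is what you would expect from keyword matching: the scanner has no sense of context. A minimal sketch in Python, with a purely hypothetical watchlist, shows how benign text like a poem or a history essay can trip the same alert as a genuine cry for help.

import re

# Hypothetical watchlist for illustration only; real products use far
# larger lists and, increasingly, AI-based classifiers.
WATCHLIST = {"die", "kill", "suicide", "overdose"}

def flag_text(text: str) -> set[str]:
    """Return any watchlist words found in the text, ignoring case and punctuation."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & WATCHLIST

# Benign inputs that a context-blind keyword scan still flags:
samples = [
    "My old love will die with the autumn leaves",   # a poem
    "Klan members threatened to kill Black voters",  # a history essay
]
for sample in samples:
    hits = flag_text(sample)
    if hits:
        print(f"ALERT {sorted(hits)}: {sample}")  # false positive

Reducing such false positives requires context the keyword scan lacks, which is exactly the follow-up burden the article describes: a human has to review each alert and decide whether it signals real danger.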



In the United States, federal law (the Children's Internet Protection Act) requires schools to use content filters on devices given to children. Going beyond filters that block unwanted content to more advanced input-monitoring tools is optional, but according to the New York Times, almost half of elementary school students in the United States are being monitored. In the UK, government guidance on school technology has also been laid out, and filtering and monitoring are currently required.

Of course, there are also cases where monitoring tools have saved lives. In one case, a 17-year-old girl's email to a friend saying she was thinking of committing suicide was detected, and she was connected with a counselor. The girl said, 'The counselor is like a mother to me now.'

In another case, a girl's search for 'How much medicine do I need to die?' was detected and a counselor stepped in to help; the girl later went into rescue work herself. The school district in Neosho, a small town that at one time had a relatively high number of child suicides, is said to have introduced monitoring tools and built a system to protect children, and local police said, 'There are a lot of false reports, but if it can save one child, it's worth it.'



The New York Times pointed out, 'The challenge with any monitoring tool is accuracy. Then there's the issue of follow-up: how well can these tools detect true danger, and what kind of care can schools provide to kids who are at risk?'

in Software, Posted by log1p_kr