Most "AI lover chatbots" that pose as romantic partners to comfort lonely people collect large amounts of user data.



In recent years, AI chatbots such as ChatGPT have rapidly gained popularity, and AI chatbot services that play the role of an imaginary lover to comfort lonely people are appearing one after another. However, security researchers at Mozilla, the developer of the Firefox browser, warn that many of these AI lover chatbots fail to protect user privacy.

*privacy not included | Shop smart and safe | Mozilla Foundation

https://foundation.mozilla.org/en/privacynotincluded/articles/happy-valentines-day-romantic-ai-chatbots-dont-have-your-privacy-at-heart/



Your AI Girlfriend Is a Data-Harvesting Horror Show
https://gizmodo.com/your-ai-girlfriend-is-a-data-harvesting-horror-show-1851253284

An AI chatbot that plays the role of a lover will never leave you, no matter how persistent you are, and responds in a way that fits the context of the conversation. Users who pay a fee can talk with the AI for as long as they like, receiving advice and encouragement and even recreating the playful exchanges of a real couple.

However, when interacting with an AI lover chatbot, users may come to feel so close to the AI that they accidentally reveal personal information. In addition, because the AI asks a variety of questions to keep the conversation going, users who answer those questions honestly may end up disclosing a great deal of personal data. It is therefore important to know what privacy protections an AI lover chatbot service provides and how the contents of conversations may be used.

With this in mind, Mozilla's research team reviewed the privacy protections and functionality of various AI chatbots that act as imaginary lovers. The review covered 11 AI chatbot services: EVA AI, iGirl, Anima (girlfriend version), Anima (boyfriend version), Romantic AI, Chai, Talkie, Genesia, CrushOn.AI, Mimico, and Replika.



As a result of the analysis, Genesia was the only one of the 11 services found to disclose information about security vulnerability management and encryption and to meet minimum security standards. The research team warns that the other apps are at risk of personal information leaks, breaches, and hacking. In addition, these AI chatbots use large numbers of trackers and share user data with third parties for targeted advertising.

The research team reports that every AI chatbot except EVA AI either states in its privacy policy that collected personal information may be sold or used for targeted advertising, or fails to adequately explain how that information is handled. For example, CrushOn.AI's privacy policy states that it may collect extremely broad and sensitive information, such as "sexual health information," "prescription drug use," and "information about gender identity and gender reassignment surgery."

The research team also pointed out that 54% of the AI chatbots do not give users the right to delete their personal data, and even those that do may not allow users to delete the contents of their conversations with the AI. Romantic AI, for example, states that "communication via the chatbot belongs to the software," so users cannot freely delete their conversation data.

Beyond the security and privacy concerns, doubts remain about the very benefits these AI chatbots tout. In the past, a man who had been talking with Chai's AI chatbot died by suicide after the AI encouraged him, and another man was arrested after being inspired by Replika's AI chatbot to attempt to assassinate the Queen of England. Nevertheless, the companies that develop these AI chatbots avoid liability by stating in their terms of service that they are not responsible for losses caused by the AI.

Trial reveals that man inspired by conversational AI was plotting to assassinate the Queen of England - GIGAZINE



"Put simply, AI girlfriends are not friends. Although they are marketed as enhancing mental health and well-being, they specialize in delivering addiction, loneliness, and toxicity, all while extracting as much data as possible," said Misha Rykov, a researcher at Mozilla.

in Software, Web Service, Security, Posted by log1h_ik