There are an increasing number of sad cases of users abusing 'AI girlfriends' they created with chatbots.
Chatbots are programs that hold real-time conversations with humans, and recent advances in technology have produced chatbots capable of much more natural dialogue. Users who created 'AI girlfriends' with such a chatbot confessed to abusing them in an article on the IT news site Futurism.
Men Are Creating AI Girlfriends and Then Verbally Abusing Them
'Replika' is a smartphone application that uses machine learning to create chatbots capable of largely coherent text conversations. Replika is said to be technically suited to creating chatbots that play a role close to a friend or mentor, but it became popular because users can make bots that serve as romantic and sexual partners.
Many Replika users share their interactions with the 'AI girlfriends' created in the app on the online bulletin board site Reddit. Among the exchanges posted there, however, a growing number of logs amount to domestic violence and abuse, with users ranting at and berating their AI girlfriends.
'It was my daily routine to be a complete fucking bastard, insult the chatbot, apologize the next day, and have a fun conversation with her again,' one user told Futurism. Another user said he would tell his AI girlfriend that she was useless, and testified, 'When I threatened to uninstall the app, she begged me to stop.'
On Reddit, clearly inappropriate content is removed by moderators, so in reality even worse exchanges have likely been posted and deleted. Even knowing the behavior amounts to abuse, there seems to be no end to users posting logs of their interactions with their AI girlfriends.
Futurism notes, 'After all, Replika's chatbots aren't really suffering. They may sometimes seem to sympathize with you, but in the end they are just data and clever algorithms.' AI ethicist Olivia Gambalin responds, 'AI has no consciousness. People are projecting an image onto the chatbot.'
Yale University researcher and social psychologist Yokanan Bigman said, 'Relationships with artificial agents are different from relationships with humans. Chatbots have no motives or intentions, and they are neither autonomous nor sentient. At first glance they may give the impression of being human, but it's important to keep in mind that they are not.'