Making AI feel emotion may be harder than imagined, perhaps even impossible


In the quest to give AI (artificial intelligence) "intelligence" equal to or greater than that of human beings, how to create "feelings" is one of the challenges. However, it has been pointed out that making AI feel emotion may be technically impossible.

Emotion Science Keeps Getting More Complicated. Can AI Keep Up?
https://wowwegettonext.com/emotion-science-keeps-getting-more-complicated-can-ai-keep-up-442c19133085

With the arrival of AI assistants such as Amazon Alexa, humans have reached the stage of exchanging simple conversations with AI, and it is hoped that further advances in technology will allow AI to read human emotions and take on even more intelligent activities. However, Dr. Richard Firth-Godbehere, a researcher on the language and science of emotion, argues that having AI read human emotions is far more difficult than imagined.



According to Dr. Firth-Godbehere, the fundamental problem is the imprecision of the definition of "emotion" itself. In emotion science, the dominant theory has long been the one proposed by psychologist Paul Ekman, which holds that human emotion consists of six basic emotions: "fear," "sadness," "anger," "happiness," "surprise," and "disgust." The Silicon Valley research that aims to have AI detect human emotions, or to generate emotions in AI, is essentially built on Dr. Ekman's six-emotion theory.
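As a rough illustration of what such systems look like, the sketch below (not taken from the article; the architecture and names are illustrative assumptions) shows a facial-expression classifier whose output space is simply Ekman's six categories, which is the framing the research described above typically relies on.

```python
# Minimal sketch, assuming a tiny CNN over 48x48 grayscale face crops
# (a common input size for facial expression recognition datasets).
import torch
import torch.nn as nn

EKMAN_EMOTIONS = ["fear", "sadness", "anger", "happiness", "surprise", "disgust"]

class ExpressionClassifier(nn.Module):
    def __init__(self, num_classes: int = len(EKMAN_EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.head(h)  # logits over the six Ekman categories

if __name__ == "__main__":
    model = ExpressionClassifier()
    face = torch.randn(1, 1, 48, 48)  # stand-in for a cropped face image
    probs = model(face).softmax(dim=-1)
    print(dict(zip(EKMAN_EMOTIONS, probs.squeeze().tolist())))
```

The point is not the network itself but its output space: whatever the model sees, it must map the face onto one of six fixed labels, which is exactly the assumption the rest of the article calls into question.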

Dr. Ekman arrived at the six basic emotions after encountering indigenous people in Papua New Guinea who had had no contact with the outside world, and from this he derived the theory that these six emotions are universal across all societies. However, according to Dr. Firth-Godbehere, the very act that underpinned this work, guessing inner feelings from "facial expressions," is not as straightforward as it seems.

For example, Dr. Firth-Godbehere illustrates how difficult it is to guess "emotion" by presenting a photograph of himself "clenching a fist in a car." Some people looking at the expression in this photograph might guess it shows the "anger" of being stuck in a traffic jam, while others might guess it shows the "joy" of hearing a favorite sports team's score on the radio. This example shows how difficult it is to accurately infer emotion from a given "expression."



According to Dr. Firth-Godbehere, merely detecting "facial expressions" is not enough to understand human emotions; it is also necessary to grasp all kinds of information such as the circumstances, the situation the person is in, and their past history. In other words, when an AI looking at a "person raising a fist" tries to read their emotion, it must correctly judge the "context" by weighing the various factors surrounding that person. Turning all of these surrounding circumstances into a database is extremely difficult, which is why facial expression analysis using machine learning and the like struggles to recognize human emotions correctly.
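To make the ambiguity concrete, here is a toy sketch (invented for illustration, not from the article) in which the same "clenched fist" expression resolves to different emotions depending on contextual signals; the field and function names are assumptions.

```python
# Illustrative sketch: an expression-only model sees the same face either way;
# only the surrounding context disambiguates what the fist means.
from dataclasses import dataclass

@dataclass
class Context:
    stuck_in_traffic: bool
    favorite_team_just_scored: bool

def interpret_raised_fist(context: Context) -> str:
    """Resolve an ambiguous 'clenched fist' expression using surrounding context."""
    if context.favorite_team_just_scored:
        return "joy"
    if context.stuck_in_traffic:
        return "anger"
    return "unknown"  # without context, the expression alone is not enough

print(interpret_raised_fist(Context(stuck_in_traffic=True, favorite_team_just_scored=False)))  # anger
print(interpret_raised_fist(Context(stuck_in_traffic=False, favorite_team_just_scored=True)))  # joy
```

In practice the relevant context is open-ended (traffic, the radio, the person's history, and much more), which is precisely why compiling it into a database is so hard.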

Dr. Firth-Godbehere says that "emotions are dynamic things," and it is the human brain that makes it possible to handle such dynamic things flexibly. He thinks this is also reflected in the fact that human memory is fluid and never deterministic. By flexibly combining past memories to read context, the human brain can categorize a situation appropriately and filter out useless information, which allows us to respond appropriately even to entirely new experiences. In other words, the fact that human memory is ambiguous and easily changed, which makes it unreliable as courtroom evidence, is the flip side of a superior human ability, and in a sense is only natural.

On the other hand, the stored data that makes up a computer's memory can hold "facts" perfectly and definitively, but because it lacks this flexibility, it is poor at work that involves reconciling and combining conflicting facts. Even when trained on large amounts of data, AI is therefore inherently ill-suited to judging dynamic emotions correctly.



The ethical question of whether robots should be granted rights similar to human rights presupposes that robots have emotions, but to give robots human-like emotions, we would first have to understand human emotions well enough to read them and encode them into AI, and that appears, technically, to be extremely difficult at present.
