YouTube's AI suspected of misidentifying chess 'white / black' talk as racism and shutting down a channel
The world's largest chess-related YouTube channel, which boasts more than 1 million subscribers, was suddenly shut down in June 2020 for 'delivering harmful and dangerous content.' According to a study by computer scientists at Carnegie Mellon University, this mysterious channel closure may well have been caused by an AI misidentifying the chess terms 'white' and 'black' as racism.
AI May Mistake Chess Discussions as Racist Talk | Carnegie Mellon School of Computer Science
AI mistakes 'black and white' chess chat for racism | The Independent
https://www.independent.co.uk/life-style/gadgets-and-tech/ai-chess-racism-youtube-agadmator-b1804160.html
Antonio Radić, known online as agadmator, runs a YouTube channel offering chess game commentary and has more than 1 million subscribers. On June 28, 2020, Radić's channel was suddenly shut down by YouTube shortly after he posted a video analyzing a game won by grandmaster Hikaru Nakamura.
The account and channel were restored after 24 hours. Radić speculated, 'The video only talked about chess, but it may have been banned for mentioning the "white" and "black" pieces.' He added that even at the time of writing, YouTube had not explained why the channel was shut down.
In response, a research team led by Ashique KhudaBukhsh, a researcher at Carnegie Mellon University's Language Technologies Institute, extracted more than 680,000 comments from five popular chess-related YouTube channels and ran them through two AI classifiers trained to automatically detect hate speech.
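The study's own classifiers are not reproduced here, but the general approach of running each comment through a trained hate-speech classifier can be sketched as follows. This is a minimal sketch assuming the Hugging Face transformers library; the model name, threshold, and label handling are illustrative placeholders, not the classifiers actually used in the study.

```python
# Rough sketch of screening comments with an off-the-shelf toxicity
# classifier via the Hugging Face `transformers` pipeline. The model,
# the 0.5 threshold, and the label names are illustrative assumptions.
from transformers import pipeline

classifier = pipeline("text-classification", model="unitary/toxic-bert")

comments = [
    "White's attack on the black king is winning.",
    "Black has no defense against the threat on f7.",
]

for comment in comments:
    top = classifier(comment)[0]  # highest-scoring toxicity label
    if top["score"] > 0.5:        # illustrative decision threshold
        print(f"FLAGGED {top['label']} ({top['score']:.2f}): {comment}")
    else:
        print(f"ok ({top['score']:.2f}): {comment}")
```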
When the research team randomly sampled 1,000 of the comments judged to be hate speech and manually checked their contents, 82% of them contained no hateful or offensive expressions. Moreover, the misclassified comments included terms commonly used in the chess world, such as 'white,' 'black,' 'attack,' and 'threat.'
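That 82% figure amounts to a manual estimate of the classifier's false-positive rate on flagged comments. A minimal sketch of this estimation step, using entirely synthetic stand-in data in place of the study's real comments and review labels:

```python
# Estimating a hate-speech classifier's false-positive rate by manually
# reviewing a random sample of flagged comments. All data here is
# synthetic; the study sampled 1,000 flagged comments and found 82% benign.
import random

random.seed(0)

# Hypothetical stand-in for the comments a classifier flagged as hateful.
flagged = [f"comment {i}" for i in range(10_000)]

# Draw a random sample for manual review (the study used 1,000).
sample = random.sample(flagged, 1000)

# Manual review results: True means the comment really was hateful.
# We simulate the study's finding that only ~18% were true positives.
manual_labels = [random.random() < 0.18 for _ in sample]

true_positives = sum(manual_labels)
false_positive_rate = 1 - true_positives / len(sample)
print(f"Estimated false-positive rate on flagged comments: {false_positive_rate:.0%}")
```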
According to KhudaBukhsh, AI makes similar misjudgments elsewhere. For example, in an image-recognition task distinguishing 'lazy dogs' from 'active dogs,' many photos of active dogs had large grassy fields in the background, so images containing lots of green grass were frequently judged to be pictures of active dogs.
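This 'green grass means active dog' failure is a textbook spurious correlation: the model latches onto a background feature that merely co-occurs with the label in the training data. A toy sketch of how that happens, using synthetic data and assuming scikit-learn and NumPy (not any model from the study):

```python
# Toy demonstration of a spurious correlation. We train on two features:
# one genuinely informative ("motion") and one background artifact
# ("grass greenness"). Because grass co-occurs with active dogs in the
# training set, the model leans heavily on the grass feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
active = rng.integers(0, 2, n)             # label: 1 = active dog
# In the training data, active dogs are almost always photographed on grass.
grass = np.where(active == 1,
                 rng.normal(0.9, 0.1, n),  # active -> very green background
                 rng.normal(0.2, 0.1, n))  # lazy   -> little grass
motion = active + rng.normal(0, 0.8, n)    # noisy but genuine signal

X = np.column_stack([grass, motion])
model = LogisticRegression().fit(X, active)
print("learned weights [grass, motion]:", model.coef_[0])

# A lazy dog lying on a bright green lawn is now misclassified as active.
lazy_dog_on_lawn = np.array([[0.9, 0.0]])
print("P(active):", model.predict_proba(lazy_dog_on_lawn)[0, 1])
```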
'This kind of misclassification can occur because the datasets used to train YouTube's AI contain few examples of chess discussion,' said KhudaBukhsh. 'If you rely on AI to detect racist language, these kinds of accidents can happen.'