Far-right groups and neo-Nazis are using 'AI Hitler' to attract young people online



As generative AI develops and becomes more widespread, there are growing concerns that AI-generated content will be used to spread misinformation and extremist political ideology. The Washington Post reported that far-right and neo-Nazi internet users are already using an 'AI Hitler,' created with AI voice cloning technology, to spread extremist political ideology to young people online.

How neo-Nazis are using AI to translate Hitler for a new generation - The Washington Post
https://www.washingtonpost.com/technology/2024/09/27/neonazis-ai-hitler-videos/



According to a report released on September 22, 2024, by the British think tank Institute for Strategic Dialogue (ISD), online content praising Nazi leader Adolf Hitler has been increasing rapidly alongside the growth of historical revisionism denying the Holocaust.

In particular, content praising or defending Hitler, or presenting his speeches translated into English, was viewed approximately 25 million times on social media platforms such as X, TikTok, and Instagram in the month following August 13. Much of this content consisted of Hitler's speeches rendered into English with AI-generated voices, and it reportedly had a particularly wide reach on X.

Experts say generative AI tools, which can create lifelike photos, voices and videos in seconds, are giving extremist groups more opportunities to spread dangerous ideas, putting more young people at risk of being exposed to extremism and posing moderation challenges for platform operators.

In fact, a video of Hitler speaking in English, created with an AI voice clone, has received comments such as, 'I miss you, Uncle Adolf,' 'He was a hero,' and 'Maybe Hitler wasn't a bad guy after all.' 'This allows for a new kind of emotional engagement that is more appealing to new generations,' said Emerson Brooking, a senior fellow at the Atlantic Council, an international security think tank.



Far-right and neo-Nazi groups on Telegram and dark web forums have praised the AI-generated speech videos as an easy and effective way to spread Hitler's ideology to young people. 'This kind of content spreads the Red Pill to a massive audience at lightning speed. In terms of propaganda, nothing compares to it,' the self-proclaimed fascist website American Futurist wrote on a public Telegram channel, referring to the truth-revealing red pill in the film 'The Matrix.'

In addition, a neo-Nazi content creator who goes by the name OMGITSFLOOD explained how to create a Hitler video using ElevenLabs' AI voice cloning tool and video editing software during a stream on the video sharing site Odysee. OMGITSFLOOD said that in just about five minutes he had created a video of Hitler speaking in English about Jews profiting from the capitalist system. During the stream, OMGITSFLOOD also called Hitler 'one of the greatest leaders of all time' and said he wanted to inspire young people who could become neo-Nazis.

After The Washington Post reported the matter to ElevenLabs, the company banned OMGITSFLOOD from its platform. 'We prohibit the use of our tools to create violent, hateful or harassing content,' Artemis Seaford, ElevenLabs' vice president of AI safety, told The Washington Post.

Nowadays, AI content can be easily created without any specialized programming knowledge, and all that is needed to make a 'video of Hitler speaking in English' is a few seconds of video clipped from YouTube. 'It's so easy to put out this kind of content now,' said Abby Richards, a misinformation researcher at the left-leaning nonprofit watchdog Media Matters for America. 'The more you post, the more likely it is to be seen by a lot more people than ever before.'



It has also been pointed out that the AI-generated Hitler videos popular on TikTok, X, Instagram, and other platforms lack the hallmarks often seen in Nazi propaganda. For example, one video shows only a silhouette of Hitler instead of a photo, and another features an English-language speech accompanied by slow instrumental music. Because these videos contain no terrorist or extremist logos, they are extremely difficult for technology companies to crack down on.

Jared Holt, a senior researcher at ISD, pointed out that not everyone who watches or shares the AI Hitler videos agrees with the content of the speeches; some may simply find them 'funny' or 'extreme.' However, Holt warned, 'When people encounter this content repeatedly within the broader political context, it can lead to numbing or normalization.'

