Teenagers are suing Elon Musk's xAI, alleging that Grok created child sexual abuse material.



Three teenagers have filed a lawsuit against xAI, Elon Musk's AI company, alleging that its AI assistant Grok generated child sexual abuse material (CSAM).

Teens sue Musk's xAI for allowing Grok to undress them - UPI.com
https://www.upi.com/Top_News/US/2026/03/16/teens-sue-musk-xai-grok/9401773691425/

Teens sue Musk's xAI, saying Grok made sexual images of them as minors - The Washington Post
https://www.washingtonpost.com/technology/2026/03/16/teens-sue-musk-xai-grok/

Elon Musk's xAI faces child porn lawsuit from minors Grok apparently undressed | TechCrunch
https://techcrunch.com/2026/03/16/elon-musks-xai-faces-child-porn-lawsuit-from-minors-grok-allegedly-undressed/

xAI is being sued by teens who say Grok created CSAM using their photos
https://www.engadget.com/social-media/xai-is-being-sued-by-teens-who-say-grok-created-csam-using-their-photos-200102733.html

Since late 2025, Grok's image editing features have been easily accessible on X (formerly Twitter), leading to the spread of sexually explicit images created with Grok across the internet. In response, Indonesia and Malaysia temporarily blocked access to Grok. xAI's Elon Musk condemned the moves as government censorship, but the company ultimately banned the use of the image editing feature to alter photos of people into swimsuit, underwear, or nude images.

Authorities in India, France, and Malaysia have launched investigations into the AI generator Grok, whose image editing function, also available on X, can generate sexually explicit images of children and women - GIGAZINE



On March 16, 2026, three anonymous plaintiffs filed a lawsuit in California federal court, alleging that xAI should be held accountable for enabling Grok to generate identifiable CSAM of minors. The plaintiffs are seeking class-action status on behalf of everyone who has had CSAM of themselves generated with Grok.

The plaintiffs argue that xAI failed to take the 'basic precautions to prevent the generation of pornographic images of real people or minors' that other AI companies take.

Technology outlet TechCrunch points out that if an AI model can generate nude and erotic content from images of real people, it becomes 'virtually impossible' to prevent the creation of sexually explicit content featuring children. TechCrunch also notes that Musk publicly advertised Grok's ability to generate sexual images and to depict real people in revealing clothing, and that 'this will be a major point of contention in the lawsuit.'



One of the plaintiffs claims that their photos from class reunions and graduation albums, which had been used in advertisements, were digitally altered with Grok to make them appear nude. This plaintiff says they learned via Instagram that the nude images of themselves and other minors were being spread on Discord.

The second plaintiff claims that police informed him that sexually explicit images of him had been generated by a third-party mobile app built on Grok. The plaintiff's lawyers argue that xAI bears responsibility because such third-party use of Grok relies on xAI's code and servers.

The third plaintiff said she learned that CSAM of herself had been generated when investigators contacted her to say that pornographic images of her had been found on an arrested suspect's cell phone.

The plaintiffs claim they have suffered extreme distress as a result of the dissemination of the CSAM. They are seeking civil penalties against xAI under laws aimed at protecting sexually exploited children and deterring corporate negligence.

in AI, Posted by logu_ii