Elon Musk’s firm xAI must be held accountable for allowing its AI models to produce abusive sexual images of identifiable minors, three anonymous plaintiffs argued in a lawsuit filed Monday in California federal court.
The three plaintiffs, who are seeking to turn this into a class action lawsuit, aim to represent anyone who had real images of themselves as minors altered into sexual content by Grok. They allege that xAI failed to take basic precautions used by other frontier labs to prevent their image models from generating pornography depicting real people and minors. The case, Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor v. X.AI Corp and X.AI LLC, was filed in the U.S. District Court for the Northern District of California.
Other deep-learning image generators employ various techniques to prevent the creation of child pornography from ordinary photos. The lawsuit alleges that xAI did not adopt these standards.
Notably, if a model allows the generation of nude or erotic content from real images, it is nearly impossible to prevent it from producing sexual content featuring children. Musk’s public promotion of Grok’s ability to produce sexual imagery and to depict real people in skimpy outfits features heavily in the suit.
The company did not respond to a request for comment from TechCrunch.
One plaintiff, Jane Doe 1, had pictures from her high school homecoming and yearbook altered by Grok to depict her unclothed. An anonymous tipster who contacted her on Instagram told her that the images were circulating online, and sent her a link to a Discord server featuring sexualized images of her and other minors she recognized from school.
A second plaintiff, Jane Doe 2, was informed by criminal investigators about altered, sexualized images of her created by a third-party mobile app that relies on Grok models. A third, Jane Doe 3, was also notified by criminal investigators, who discovered an altered, pornographic image of her on the phone of a suspect they had apprehended. Attorneys for the plaintiffs say that because third-party usage still requires xAI code and servers, the company should be held accountable.
All three plaintiffs, two of whom are still minors, say they are experiencing extreme distress over the circulation of these images and what it could mean for their reputations and social lives. They are asking for civil penalties under an array of laws meant to protect exploited children and prevent corporate negligence.
