The mother of one of Elon Musk’s children is suing xAI over deepfake images
While X removed Grok’s ability to create illicit deepfake images on the social network itself, Grok’s standalone app is another story. It reportedly continues to produce “nude” deepfakes of real people. And now, Ashley St. Clair, a conservative political strategist and mother of one of Elon Musk’s 14 children, has sued xAI over non-consensual Grok-produced porn of her.
In court filings, St. Clair accused xAI’s Grok chatbot of creating and distributing illegal content depicting her “as a child in a string bikini, and as an adult in sexually explicit images, covered in semen, or wearing only bikini floss.” In some cases, the chatbot is said to have produced bikini-clad deepfakes of St. Clair based on a picture of her at the age of 14. “People were taking pictures of me as a child and stripping my clothes off. They were also stripping my clothes off and bending me over. In the background is my child’s backpack, which he still wears now,” she said.
“I also see pictures where they inflict bruises on women, beat them, tie them up, mutilate them,” St. Clair told The Guardian. “These sickos used to have to go deep into the internet, and now they’re on a mainstream social media app.”
St. Clair said that, after she reported the photos to X, the social network responded that the content did not violate any policies. In addition, she says X left the photos up for as long as seven days after she reported them. St. Clair said that xAI then retaliated by continuing to digitally create undressed deepfakes of her, thereby making her a “social media mockery.”
She also accused the company of revoking her X Premium subscription, verification badge, and ability to monetize content on the platform. “xAI has since banned [her] from repurchasing Premium,” St. Clair’s court filing said.
On Wednesday, X said it had changed its policies so Grok could no longer produce child sexual abuse material or nudity “in those places where it’s illegal.” However, Grok’s standalone app reportedly continues to produce nude and pornographic images of real people when asked to do so.
Neither Apple nor Google has removed the Grok app despite the apparent policy violation. (Anna Moneymaker via Getty Images)
Apple and Google have so far done nothing. Despite weeks of outrage over the deepfakes, neither company has removed the X or Grok apps from its app store. Both the App Store and the Play Store have policies that expressly prohibit apps that generate such content.
Neither Apple nor Google responded to multiple requests for comment from Engadget. That includes a follow-up email sent on Friday about the Grok app continuing to produce “nude” images of real women and other people.
While Apple and Google have failed to act, many governments have done the opposite. On Monday, Malaysia and Indonesia banned Grok. On the same day, the UK regulator Ofcom opened an official investigation into X; California opened its own investigation on Wednesday. The US Senate even passed the DEFIANCE Act, which targets non-consensual sexually explicit deepfakes, a second time after the incident.
“If you’re a woman, you can’t post a picture, and you can’t speak, or you risk this abuse,” St. Clair told The Guardian. “It’s dangerous, and I believe this is by design. You have to feed the AI personalities and thoughts, and if you do things that disproportionately affect women, and they don’t want to participate because they’re being targeted, then the AI will be inherently biased.”
Speaking about Musk and his team, she added that “these people believe they are above the law, because they are. They don’t think they will get into trouble; they think there are no consequences.”