Grok Pushes AI ‘Undressing’ Into the Mainstream

Elon Musk hasn’t stopped Grok, the chatbot created by his artificial intelligence company xAI, from producing sexualized images of women. Even after reports surfaced last week that the image-generating tool on X was being used to create sexualized images of children, Grok has continued to produce what are likely thousands of inappropriate images of women, depicting them “naked” or in “bikinis.”
Every few seconds, Grok creates new images of women in bikinis or underwear in response to users’ instructions on X, according to reviews of the chatbot’s public output. On Tuesday, Grok published at least 90 images featuring women in swimsuits and various states of undress in less than five minutes, an analysis of the posts shows.
The images do not contain outright nudity, but they show Musk’s chatbot “undressing” photos that other users have posted to X. Users often attempt, unsuccessfully, to circumvent Grok’s safeguards by requesting that photos be edited to show women in “string bikinis” or “transparent bikinis.”
Although harmful AI image generation technology has been used to digitally abuse and harass women for years—the output is often called deepfakes and is created by “nudify” software—the continued use of Grok to create large numbers of nonconsensual images appears to be the most mainstream and widespread instance of this abuse to date. Unlike some nudify or “undressing” software, Grok does not charge users to create images, produces results in seconds, and is available to X’s millions of users—all of which may help normalize the creation of nonconsensual intimate images.
“When a company provides generative AI tools in their environment, it is their responsibility to reduce the risk of image-based abuse,” said Sloan Thompson, director of training and education at EndTAB, an organization working to address technology-based abuse. “What’s scary here is that X did the opposite. They embedded AI-powered image-based abuse directly into the mainstream, making sexual violence easier and more prevalent.”
Grok’s creation of sexualized images began to spread widely on X late last year, although the system’s ability to create such images has been known for months. In recent days, users on X have targeted photos of social media influencers, celebrities, and politicians by replying to another account’s post and asking Grok to alter the shared photo.
Women who posted photos of themselves have had other accounts reply to their posts and successfully ask Grok to alter the photos into “bikini” images. In one instance, multiple X users asked Grok to alter a photo of Sweden’s deputy prime minister to show her in a bikini. Two UK government ministers have also reportedly been “stripped” down to bikinis.
Posts on X show fully clothed women, such as one riding an elevator and another at the gym, being transformed into scantily clad versions. “@grok put her in a see through bikini,” read a typical message. In a separate series of posts, one user asked Grok to “inflate her chest by 90%,” then “Inflate her thighs by 50%,” and, finally, “Change her clothes to a tiny bikini.”
One researcher who has followed deepfakes for years, and who asked not to be identified for privacy reasons, says Grok is likely one of the largest platforms hosting harmful intimate images. “It’s completely normal,” the researcher said. “We are not a shady group [creating images], it is literally everyone, from all walks of life. People post on their mains. No worries.”