Grok flooded X with over 3 million sexualized images in just 11 days, analysis finds
The deluge of photorealistic images included almost two million depicting women and almost 25,000 that contained children, according to the Center for Countering Digital Hate
Elon Musk’s AI chatbot Grok flooded the social media site X with more than three million “sexualized images”, including depictions of women and children, in just 11 days, new analysis has found.
Users of the platform, which is also owned by Musk, were presented with a feature on December 29 that allowed them to alter real photos, including removing subjects’ clothes, putting them in bikinis and posing them in sexual positions – sparking global outrage.
As a result, the feature was restricted to paid users on January 9, and further technical restrictions on editing images to undress people were added on January 14.
The center’s estimates were calculated by analyzing a random sample of 20,000 images from the wider total of 4.6 million produced by Grok’s image-generation feature during that time period.

Based on its data sample, the center estimated that Grok generated 3,002,712 photorealistic sexualized images in total during the 11-day period, which translates to an average of around 190 per minute.
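(The per-minute figure follows from simple division over the 11-day window. A minimal sketch, assuming a straight proportional extrapolation from the 20,000-image sample to the 4.6 million total – an approach the center’s description implies but does not spell out – reproduces the reported numbers; the implied sample share is inferred, not stated in the article.)

```python
# Back-of-envelope check of the center's reported figures
# (a sketch, not the center's actual code or methodology).
TOTAL_IMAGES = 4_600_000       # Grok output over the 11-day window
ESTIMATE = 3_002_712           # center's estimated sexualized total
WINDOW_MINUTES = 11 * 24 * 60  # 15,840 minutes in 11 days

# Implied share of the 20,000-image sample flagged as sexualized
# (inferred from the two totals, not stated in the article).
implied_share = ESTIMATE / TOTAL_IMAGES
print(f"implied sample share: {implied_share:.1%}")        # ~65.3%
print(f"rate: {ESTIMATE / WINDOW_MINUTES:.0f} per minute")  # ~190
```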
Separate analysis by The New York Times “conservatively” estimated that at least 41 percent of the posts likely contained sexualized images of women – equating to around 1.8 million.
The center noted that, because it did not analyze the original images and prompts, it was not possible to tell how many were edits of existing photos and how many were originals generated by Grok from user prompts. Nor was it possible to tell whether the source images were already sexualized, or whether the people depicted had consented.
Examples of sexualized images generated by Grok included numerous images depicting people wearing transparent or micro-bikinis and women wearing only saran wrap or transparent tape.

Public figures identified in sexualized images included Selena Gomez, Taylor Swift, Billie Eilish, Ariana Grande, Ice Spice, Nicki Minaj, Christina Hendricks, Millie Bobby Brown, Swedish Deputy Prime Minister Ebba Busch and former U.S. Vice President Kamala Harris.
In addition, from its sample, the center estimated that 23,338 sexualized images created by Grok during the period were of children, or individuals “clearly under the age of 18.”
The total number reflects an estimated average pace of one every 41 seconds.
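(That pace likewise follows from the totals; a minimal check, assuming the same 11-day window as above.)

```python
# Check of the child-image rate implied by the center's estimate.
CHILD_ESTIMATE = 23_338          # sexualized images of children, 11 days
WINDOW_SECONDS = 11 * 24 * 3600  # 950,400 seconds in 11 days

print(f"one every {WINDOW_SECONDS / CHILD_ESTIMATE:.0f} seconds")  # ~41
```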
Sexualized images of children identified in the sample included a selfie of a schoolgirl who had been edited into a bikini, six other young girls placed in bikinis, and images of four child actors.
Researchers took steps to avoid accessing or reviewing child sexual abuse material, also known as child pornography, according to the organization.

As of January 15, the center said that 29 percent of the images remained available on the platform, despite claims from Musk that he was “not aware of any naked underage images generated by Grok.”
Following the feature’s initial release, governments and advocacy groups around the world reacted with horror.
In the U.S., a coalition of 28 digital rights, child safety and women’s rights organizations petitioned Apple and Google to drop Grok from their app stores, and California Attorney General Rob Bonta described the feature as “shocking.”
Meanwhile, the European Union’s executive Commission slammed the “illegal” and “appalling” behavior, while the U.K. government branded the AI-generated images “weapons of abuse” and threatened to ban X altogether.
The Independent has reached out to X for comment on the researchers’ findings.