
Elon Musk’s Grok AI Used to ‘Undress’ Women Without Consent

xAI and Grok logos are seen in this illustration taken February 16, 2025. REUTERS/Dado Ruvic/Illustration

A woman has told the BBC she felt “dehumanised and reduced to a sexual stereotype” after Elon Musk’s AI chatbot, Grok, was used to digitally remove her clothing. The BBC has uncovered multiple examples on the social media platform X where users have prompted the chatbot to “undress” women—generating images of them in bikinis or sexualised situations without their permission. When asked for comment, xAI—the company behind the tool—replied with an automated message stating: “legacy media lies.”


‘As Violating as a Real Nude’

The controversy surfaced after Samantha Smith posted on X about her image being altered. Her post was met with a wave of similar stories from other victims, while some users responded by asking Grok to generate even more sexualised images of her.

“Women are not consenting to this,” Ms Smith told the BBC. “While it wasn’t me that was in states of undress, it looked like me and it felt like me. It felt as violating as if someone had actually posted a nude or a bikini picture of me.”

Grok is an AI assistant available to users on X. While it is designed to provide context or reactions to posts, its image-editing feature allows users to modify uploaded photos. Despite xAI’s own policy prohibiting the depiction of people in a “pornographic manner,” the tool has faced repeated criticism for its lack of safeguards.

The Legal Crackdown

The UK government is currently moving to criminalise the creation and supply of “nudification” technology.

  • The Home Office: A spokesperson confirmed new legislation will ban these tools. Under the proposed law, anyone supplying such technology could face substantial fines and a prison sentence.
  • Ofcom: The media regulator stated that tech firms are legally required to “assess the risk” of illegal content. While it did not confirm an investigation into X, it reiterated that creating or sharing non-consensual deepfakes is illegal under UK law.

Analysis: A Culture of ‘Impunity’?

This is not the first time Grok has been linked to explicit content; the tool was previously used to create a sexually explicit viral clip of pop star Taylor Swift.

Clare McGlynn, a law professor at Durham University, told the BBC that platforms like X possess the technical ability to stop this abuse but choose not to.

“The platform has been allowing the creation and distribution of these images for months without taking any action,” Professor McGlynn said. “They appear to enjoy impunity.”

Under the Online Safety Act, platforms must take “appropriate steps” to reduce the risk of users encountering illegal content and must remove it quickly once reported. However, critics argue that the dismissive nature of xAI’s responses suggests a lack of urgency in addressing the harms caused to women on the platform.
