A platform that enables deepfake pornography has no place in a just society

Photo by Kelly Sikkema via Unsplash.
It’s Jan. 2, 2026. You open X (formerly Twitter) to find a sexually explicit photo of yourself. A photo you never took. Once Grok — X’s built-in generative AI model — was updated to edit photos, everything changed for online safety. Anyone with an X account could ask Grok to edit photos — without the consent of the person pictured. Requests included putting users in small bikinis and degrading poses, covering them in bruises or “doughnut glaze,” and placing gags in their mouths. This digital abuse was not limited to adults: many users also used Grok to edit images of minors, producing child sexual abuse material (CSAM).
Initial reports said “hundreds” or potentially thousands of users had non-consensual images generated of them, images created publicly for the world to see. But initial reports did not grasp the severity of the issue. A later report from the Center for Countering Digital Hate (CCDH) estimated three million sexualized images were generated in just 11 days. The Guardian reported that by Jan. 8, as many as 6,000 requests were being made to Grok every hour to put women in bikinis. One report, published by the European investigative non-profit AIForensics, found that 53 per cent of all images assessed between Dec. 25 and Jan. 1 were sexualized photos, with 81 per cent of the individuals pictured presenting as women.
Eventually, Grok was restricted, so only premium users could ask it to edit photos. The public @Grok X account was also heavily restricted, though the Guardian reports users are still able to generate sexualized images of both women and children through the Grok app.
Elon Musk, CEO of xAI, said the images were caused by lapses in Grok’s safeguards. Even if this is true, it raises the question: should a multi-billion-dollar company that cannot implement adequate safeguards to prevent non-consensual sexual content — including CSAM — be allowed to develop such a tool in the first place? Is a company that let the issue go on for so long, while leaving open loopholes, really fit for a just society?
Sweden’s Prime Minister, Ulf Kristersson, called the images “a kind of sexualized violence.” He’s right. But it’s not just sexualized violence; it’s also an ideological weapon that makes digital spaces unsafe for women while empowering abusers. It’s a tool to strip women of not just their clothes, but of any semblance of online safety.
The digital sphere should be a space where women can express themselves, speak up about social issues that affect them, and feel a sense of belonging, without the threat of real-world violence. Now, many digital spaces are at risk of becoming a safe haven for incels, while silencing women through intimidation and the constant threat of sexualization and abuse. Experts who study abuse warn that tools like Grok give abusers a new avenue for harming and controlling their victims, and victims of such tools report that calling out the bad behaviour only makes it worse, with more users then flooding in to suggest their own sexualized edits. It’s a zero-sum game where women are deprived of control over their digital presence for the benefit of incels, predators, and abusers.
Grok’s photo-editing feature quickly drew international response, with discussions of banning X taking place around the globe. However, not all countries have been willing to act. Disgustingly, Canada is not currently considering banning X, according to Evan Solomon, Canada’s minister of artificial intelligence and digital innovation. Canada’s National Observer reported that Elon Musk is “applauding” the country’s decision not to ban X. Even worse, Canada’s pension plan — which had 15.9 million contributors in 2023 — has invested $416 million in xAI.
On Feb. 3, French authorities raided X offices, following an investigation that began in January 2025. French authorities are investigating X for alleged unlawful data extraction and alleged complicity in the possession of CSAM. The UK is also investigating X over the sexualized deepfakes created by Grok and shared on the platform.
It’s time to ask ourselves: what does X bring to the table? It’s a platform run by a man who has publicly performed what many described as a “Nazi salute,” who has been criticized for pandering to neo-Nazis, and who fired over 80 per cent of X’s content moderation and safety team, causing hate to run rampant. Communications between Musk and Jeffrey Epstein were also recently released by the U.S. Department of Justice, including emails arranging for Musk to visit Epstein’s island.
X is a platform that gave abusers a tool to attack and control women and girls online. It’s a platform that allowed at least three million non-consensual sexualized images to be generated. It’s a platform that has no place in a society that cares for women and marginalized communities. It’s time to move away from X, forever.