How Non-Consensual Images Exploit Women for Years Without Consequences

From Phica.net to AI deepfakes, women's stolen images fuel a thriving online abuse industry. Why does justice take so long?

Image: a collage comparing facial expressions labeled "deepfake" and "original", taken from a screenshot of a computer screen.

The non-consensual sharing of intimate images online has become a widespread problem, with platforms profiting from the violation of women's privacy. Recent cases show how slow legal action and technological gaps leave victims exposed for years. Researchers and legal experts warn that algorithms and poorly regulated tools are making the problem worse.

In 2017, Italian sociologist Silvia Semenzin first reported Phica.net, a site where men shared explicit images of women without their consent. Despite her efforts, the platform continued operating for five more years. It was only in 2022, after repeated media pressure, that authorities finally shut it down. By then, the site had amassed over 700,000 users and included a dedicated section titled 'Revenge Porn'.

Mati Teggi, a woman from Lugano, discovered her own images circulating on Phica.net without her permission. The experience caused her significant distress, a situation shared by countless other women. Swiss lawyer Roy Bay confirmed that using someone's image without consent is illegal, but victims often face long delays in getting content removed.

The issue extends beyond individual cases. AI Forensics found that 81% of deepfakes created by Grok's image tools target women. After public criticism, Grok restricted its image-manipulation features, but the damage had already spread. By late 2025, X (formerly Twitter), under Elon Musk's ownership, launched a new image-editing tool that led to another surge in deepfake nudes.

Law enforcement has taken action, closing dozens of non-consensual image-sharing sites since 2017, though exact numbers remain unclear due to incomplete records and ongoing investigations. Platforms like Mia Moglie and Phica.net demonstrate how this abuse has turned into a profitable business model. Researcher Philip Di Salvo argues that algorithms and digital platforms are not neutral: they actively enable gender-based violence by design.

The shutdown of sites like Phica.net came years after initial reports, leaving victims unprotected in the meantime. Legal and technological measures have struggled to keep pace with the rapid spread of deepfake and non-consensual content. Without stronger enforcement and platform accountability, the exploitation of women's images online is likely to persist.
