Deepfake Pornography Surges as Women Face Rising Digital Abuse in Germany
Deepfakes—AI-generated or altered images, audio, or videos—are becoming harder to detect as advances in technology make them more realistic and easier to create. These advances have fuelled a rise in deepfake pornography, with women bearing the brunt of the harm.
The issue has prompted calls for stronger legal protections, as current German laws offer limited support for victims.
Deepfakes use artificial intelligence to mimic real people, places, or events. While some are harmless, others—like non-consensual pornography—cause serious damage. Victims, once mostly celebrities, now include ordinary individuals facing severe personal and professional consequences.
In Germany, at least 15-20 documented cases have emerged since 2023, including incidents at universities, and reports have risen steadily through 2025. Women are the primary targets, while men are the overwhelming majority of perpetrators. Support organisations such as HateAid and the 'Violence Against Women' helpline assist victims, who can also file police reports or request that content be removed from Google search results.
Germany lacks specific deepfake laws, though existing privacy and image rights regulations sometimes apply. A proposed 'digital violence protection act' aims to close the gap. Meanwhile, the EU's AI Act mandates the labelling of synthetic content, and the Digital Services Act requires platforms to remove illegal material, including non-consensual deepfake pornography.
Researchers are fighting back by using AI to detect AI-generated content. Yet, as technology evolves, spotting deepfakes remains a growing challenge.
The spread of deepfakes, particularly pornographic content, has exposed legal and technical weaknesses. Victims can seek help, but broader protections are still under development. With the EU enforcing stricter rules and Germany planning new laws, efforts to combat the issue are gaining momentum.