CHRIS ROPER: How AI evil outruns the sheriff

AI tools for creating deepfake porn have advanced so quickly that few nations, if any, have yet thought to put laws in place to make it illegal

Imagine this happening to you. You’re a mother, and overnight your daughter becomes withdrawn, treats you with disgust and rebuffs any form of affection. You worry that something traumatic has happened, something bad that she refuses to talk about. Months later, you discover what it is. She has found and watched a hard-core porn movie that features you as the star.

This is the opening vignette in a fascinating piece of investigative journalism by Der Spiegel, and it happened to a well-known German actress.

From one day to the next, says the magazine, the daughter was transformed. “She came across the video simply because she entered my name into Google,” says Anita Hübig (a pseudonym). “The video was only a few minutes long, but it plunged our family into a deep crisis for many months.” 

The daughter thought the porn film had been shot when her mother was younger, and was convinced it was real. But in fact the video is a deepfake created using AI: the mother had never made a porn movie, and her face had been combined with the body of another actress.

According to Der Spiegel, AI is driving a new form of sexualised violence against women. The many apps available can easily turn an ordinary photo into a nude one. If you pay a little more, the software can pose the victim in different sexual positions. “All that is needed is a snapshot of the victim or a link to their Instagram profile. Anonymous users are provided with a high-resolution, often deceptively real image. There is no check whether the person depicted has given their consent.” 

Women are the victims in 99% of cases. “AI programs that create nude images often cannot even process men’s bodies. The advertising slogan of one app was: ‘Imagine wasting your time taking her out on dates when you can see her naked with our website.’” 

To give you an example of one of these services, the ClothOff app trumpets: “Undress any photo with Deepnude AI for free! Upload a photo, wait a few seconds and get result!” The ClothOff website, Der Spiegel tells us, got 27-million clicks in the first half of 2024, and the site claims that 200,000 photos are generated every day. You just click that you’re older than 18, agree that you’ll only generate nude pictures of yourself, and off you go. Your claims aren’t actually checked. 

As you can imagine, the people offering these services do their best to conceal their identities. Der Spiegel’s journalists uncovered several people behind the network, by analysing the source codes of websites and information from leaked databases.

“This revealed a comprehensive network of deepfake apps with Eastern European backers. The analyses reveal for the first time how big the problem of deepfake videos on the internet has become and that there is an urgent need for action. Those affected, whom Spiegel spoke to, are often left alone to deal with the digital abuse. The number of perpetrators who use such apps runs into the tens of thousands.” (I’m using Google to translate the original German, so any clumsiness is the fault of whatever the current AI is that Google Translate uses, and not Der Spiegel.) 

I urge you to read the original Der Spiegel story, which is rich in detail. For example, the largest website for AI-generated sex videos of celebrities is “MrDeepFakes”. In a perfectly judged semiotic metaphor for the deranged people who make stuff like this, the site’s logo shows a grinning cartoon face of Donald Trump and a mask that looks like the symbol of the Anonymous hacker movement. Der Spiegel says the site is viewed more than 6-million times a month, and currently hosts 55,000 fake intimate videos, with tens of thousands more having been uploaded and deleted over time. 

But the bit that really jumped out was this: “In Germany, protection by the state is thin. There are no specific legal regulations. Anyone who uses AI to create intimate images of an adult victim may not even be violating any provisions in the criminal code. Only prosecution under data protection law would be possible, a construction that has apparently not yet been tested in any case.”


So we don’t even have laws that can bring justice to victims of deepfake porn. This reminds me of another German example, the cannibalism case of 2001. 

In March that year, a former computer technician called Armin Meiwes advertised on the internet for “a well-built 18- to 25-year-old to be slaughtered and then consumed”. He found a volunteer in Bernd-Jürgen Brandes, described as “a 43-year-old bisexual engineer from Berlin”. The Guardian describes what happened next. 

“The two men went up to the bedroom in Meiwes’s rambling timbered farmhouse. Mr Brandes swallowed 20 sleeping tablets and half a bottle of schnapps before Meiwes cut off Brandes’s penis, with his agreement, and fried it for both of them to eat. Brandes — by this stage bleeding heavily — then took a bath, while Meiwes read a Star Trek novel. In the early hours of the morning, he finished off his victim by stabbing him in the neck with a large kitchen knife, kissing him first. The cannibal then chopped Mr Brandes into pieces and put several bits of him in his freezer, next to a takeaway pizza, and buried the skull in his garden. Over the next few weeks, he defrosted and cooked parts of Mr Brandes in olive oil and garlic, eventually consuming 20kg of human flesh before police finally turned up at his door.” 

As patriotic South Africans, we can only feel a flush of pride at the next detail. “Meiwes told detectives that he had consumed his victim with a bottle of South African red wine, had got out his best cutlery and decorated his dinner table with candles.” Brandes had tasted of pork, he added.

I’d love to know the name of the wine that he drank, so that I can check if the estate has updated its pairing notes. Meiwes apparently became a vegetarian in prison. 

Salacious details aside, the purpose of this retelling is to point out that there was no law against cannibalism in Germany. Meiwes was convicted of manslaughter in 2004, and then, in a retrial in May 2006, convicted of murder and sentenced to life imprisonment. But there were no charges of cannibalism, because that’s not illegal in Germany. It’s also not illegal in the UK and the US, by the way, though of course the processes necessary to procure a human body to eat are illegal. 

There are uneasy parallels with the fact that there is no specific law against creating deepfake porn by grafting the faces of non-consenting women onto AI-generated images. Der Spiegel does point out that investigators can take action against those who distribute abusive sex recordings, either for defamation or for violations of the Art Copyright Act, which protects the right to one’s own image. But, the magazine says, the laws are clearly not keeping pace with technological developments. The men behind the apps and the production of deepfake porn are fairly safe from being held accountable, not least because payments are anonymised through, for example, bitcoin. 

A recent report by a UN agency and Europol’s European Cybercrime Centre says “criminals and organised crime groups have been swiftly integrating new technologies into their modi operandi, thus creating not only constant shifts in the criminal landscape worldwide but also creating significant challenges for law enforcement and cybersecurity in general”. There’s even an official term for their business model: crime-as-a-service. “The business model, which allows non-technologically savvy criminals to procure technical tools and services in the digital underground that allow them to extend their attack capacity and sophistication, further increases the potential for new technologies such as AI to be abused by criminals and become a driver of crime.” 

AI is already being used to commit terrible crimes that society’s safeguards are inadequate to address. There’s also the matter of how long it takes to enact new laws in most societies, stacked up against the rapidly accelerating pace of AI innovation. It’s going to be a wild ride, and a lot of people are going to suffer. 
