Deepfakes: How an AI app inserts women’s faces into porn

Published by: MRT


The website is strikingly simple: a white background with a large blue button that invites the user to upload a picture of a face. Beneath the button, four AI-generated faces offer a chance to test the service. Above it, the headline: “Turn anyone into a pornstar by using deepfake technology to insert the person’s face into an adult video.” All it takes is the picture and a click on the button.

To avoid driving additional traffic to the site, MIT Technology Review will not use the name of the service, direct quotes, or screenshots. In this article we refer to it by the pseudonym “Y”. The deepfake researcher Henry Ajder, who has tracked the development and rise of synthetic media on the internet, discovered the site and brought it to our attention.

For now, Y exists in relative obscurity, but it has a small, quite active user base that gives the developer feedback in online forums. Researchers had long feared that such an app would one day be built: it crosses an ethical line that no other deepfake service has crossed before.

AI-generated synthetic media, or deepfakes, have from the beginning been used primarily to create pornographic depictions of women, often with severe psychological consequences for those affected. The Reddit user who originally popularized the technology swapped the faces of female celebrities into porn videos. The research firm Sensity AI estimates that between 90 and 95 percent of all deepfake videos on the internet to date are non-consensual porn, and about 90 percent of those feature women.

As the technology has advanced, numerous easy-to-use no-code tools have appeared that let users “undress” female bodies in images. Many of these services have since been taken offline, but their code survives in open-source repositories and keeps reappearing in new forms. The most recent such site, discovered by researcher Genevieve Oh, had recorded more than 6.7 million visits. It has not yet been taken offline.

There have already been other single-frame face-swapping apps, such as ZAO or ReFace, that place users into selected scenes from mainstream films or pop videos. And an app called DeepNude virtually “undressed” women in photos, replacing the clothes in the picture with a computer-generated naked body. The pornographic face-swapping app Y goes a step further. It is “tailor-made” to create pornographic images of people without their consent, says Adam Dodge, founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. That makes it easier for creators to refine the technology for this particular use case, and it attracts people who would otherwise never have thought of creating deepfake porn. “Every time you specialize in this way, it creates a new corner of the internet that draws in new users,” says Dodge.

Y is very easy to use. As soon as a user uploads a photo of a face, the site opens a library of porn videos. The vast majority feature women, plus a small handful of men, mostly in gay porn. The user can then choose any video and see a preview of the result within seconds. To download the full version, the user has to pay.

The results are far from perfect. Many of the swapped faces are obviously fake, shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the evolution of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of a deepfake hardly matters, because the psychological damage to victims can be the same either way. And many people still do not know the technology exists, so even poor-quality face swaps can fool them.

Y presents itself as a safe and responsible tool for exploring sexual fantasies. The language on the site ostensibly encourages users to upload their own face. But nothing prevents them from uploading other people’s pictures, and comments in online forums suggest that users have already done just that.

The consequences for women and girls targeted by such activity can be devastating. Psychologically, these videos can feel as violating as so-called revenge porn – real intimate videos filmed or posted without consent. “This type of abuse – where people misrepresent your identity, name, and reputation, and alter it in such violating ways – shatters you to the core,” says Noelle Martin, an Australian activist who was targeted by a deepfake porn campaign.

The effects can last a lifetime. The images and videos are difficult to remove from the internet, and new material can be created at any time. “It affects your relationships; it affects getting a job. Every single job interview you go to, this could be brought up. Potential romantic relationships,” says Martin. “To this day, I’ve never managed to completely remove the images. They’ll be out there forever. No matter what I do.”

Non-consensual deepfake porn can also have economic and professional repercussions. Rana Ayyub, an Indian journalist who was the victim of a deepfake porn campaign, was subsequently harassed online so severely that she had to scale back her web presence – and with it the public profile she needs for her work.

The UK government-funded Revenge Porn Helpline recently handled the case of a teacher who lost her job after fake pornographic pictures of her circulated on social media and were brought to her school’s attention, says Sophie Mortimer, who runs the service. “It’s just getting worse, not better,” says Dodge. “More and more women are being targeted this way.”


Y’s ability to create deepfake gay porn is limited, but it poses an additional threat to men in countries where homosexuality is criminalized, says Ajder. That is the case in 71 countries worldwide, eleven of which punish the offense with death.

Ajder has discovered numerous deepfake porn apps over the past few years. He has tried to contact Y’s hosting service and pressure it to take the site down. But he is pessimistic that this will stop similar tools from emerging: another website has already popped up that appears to attempt the same thing. In his view, a more sustainable solution would be to ban such content from social media platforms and perhaps even to outlaw its creation or consumption. “That means these sites would be treated the same way as material on the dark web,” he says. “Even if it’s driven underground, at least it disappears from the eyes of ‘normal’ people.”

Y did not respond to multiple requests for comment sent to the press email listed on its website. The registration information associated with the domain is hidden by the privacy service Withheld for Privacy. On August 17, after MIT Technology Review made a third attempt to contact the developer, the site posted a notice on its homepage saying it is no longer available to new users. As of September 12, that notice was still online.


(jle)
