A ‘deepfake ecosystem’ on Telegram, built on DeepNude technology, has generated over 100,000 non-consensual explicit nude images of women and underage girls
With the power of technology, the world’s greatest minds have the capability to achieve almost anything. But while some work to make life easier for those with disabilities, or to help end a global pandemic, others are focused on a less noble cause: abusing and objectifying women.
An anonymous programmer launched an app called DeepNude last year, which used deepfake technology to remove clothing from photos of women. It was described as offering “the superpower you always wanted”, enabling users to upload photos of women (the technology didn’t work on men), which the app would then strip and turn into believable fake nudes. It cost $50 to download.
The app was taken offline just hours later by its creator, with a bizarre backpedal. “We don’t want to make money this way,” they wrote in a statement. “The world is not yet ready for DeepNude.”
A year on, the software has grown exponentially in popularity and usage. It has been discovered on Telegram, where it has been employed as part of a ‘deepfake ecosystem’ in which over 104,000 fake nudes of women have been created. The ecosystem centres on an AI bot which generates nudes on request, sending galleries of new images to tens of thousands of subscribers each day. Chat members can make specific requests of the bot in its messaging groups. Some of the photos are believed to depict underage girls.
Horrifying deepfake app called DeepNude realistically removes the clothing of any woman with a single click. It costs $50 and only works on women. (The black bars were added after using the app.) pic.twitter.com/5KS36FPTqZ
— Mike Sington (@MikeSington) June 27, 2019
Sensity, a deepfake detection company, published its findings on this issue in October. Speaking to Dazed, Giorgio Patrini, the company’s CEO and chief scientist, says: “We haven’t observed DeepNude technology embedded in a chatroom bot before, with its dramatic potential for user accessibility and scale of production. In fact, this is singularly the largest incident of this kind for the number of victims.”
Researchers report that the number of fake nudes on Telegram grew by 198 per cent between April and June 2020. This statistic is in line with the news that revenge porn was on the rise during lockdown, with cases of intimate image abuse increasing by 22 per cent this year. In its report, Sensity addressed the “broader threat” that “individuals’ ‘stripped’ images can be shared in private or public channels beyond Telegram as part of public shaming or extortion-based attacks”.
Though Sensity estimates that more than 104,000 women have been targeted by the bot, the actual number is likely to be much higher. This is down to the simplicity of the technology: all users have to do to generate a fake nude is send their chosen image to the bot, which returns an unclothed version free of charge. These photos are then shared with affiliated Telegram channels. An anonymous poll conducted in one of these channels found that 70 per cent of users were from Russia, Ukraine, Belarus, Kazakhstan, and other ex-USSR countries; many said they discovered the bot via the Russian social network VK.
“No intervention or takedown has occurred, and, with that, we don’t know who still today may possess or have shared the images created by the bot” – Giorgio Patrini, Sensity
Despite going public with its findings almost two months ago, Sensity says it “never got a reply” from Telegram. “Those channels are still active on the platform,” continues Patrini, “although most are now avoiding openly advertising the bot or its creations. Importantly, though, the bot itself is still functioning as before. No intervention or takedown has occurred, and, with that, we don’t know who still today may possess or have shared the images created by the bot.”
A number of groups have changed their names to avoid being identified, while other channels have disappeared completely or had the nude images shared in them removed. As WIRED reported at the end of October, the bot became inaccessible on iPhones and iPads, displaying an error message saying it “violates” Apple’s guidelines.
Patrini explains that the main reason the bot has been able to thrive on Telegram “is technical”: the platform makes it easy to integrate “bots and other automations” into its chatrooms.
Although he believes the fake nudes can currently be distinguished from real images, Patrini warns that this “won’t last long”. “Technology is improving and getting commodified at an accelerating pace,” he continues. “You don’t need to be a machine learning researcher to operate it; the software is open-source and comes with tutorials and large communities of developers.”
The possible earnings associated with this kind of deepfake technology could also be spurring more people to abuse it. “People are getting ingenious on understanding how deepfake technology can be turned into a business opportunity, with a legit or illicit connotation,” explains Patrini. “For example, the bot in our investigation is monetising by providing paid users with a faster creation of images without any watermarks on them.”
However, as deepfake software improves, so does detection technology. Patrini says the detection technology will “play a more fundamental role in the future, in helping us understand what we see and in automatically discovering deepfakes online”. As women feature in 96 per cent of all deepfake videos, preventative and reactive technology to stop the creation of these photos and videos is vital in the ongoing fight for gender equality.
“This is a problem that will unfortunately affect each of us as private individuals,” concludes Patrini. “Our face traits can be stolen and repurposed into visual material that attacks our reputation and can be weaponised for public shaming, blackmail, or extortion.
“It’s important that each of us is aware that any image or video that we post online may be visible to the whole world, and that bad actors may be using it.”