At first, the panic was that AI (artificial intelligence) was going to take our jobs, but now there’s some concern it’s going to make us nude.
Online data analysis firm Graphika has found that AI-generated ‘undressing’ technology has moved from a niche corner of the internet to a major online business, despite the legal, moral and ethical concerns surrounding it.
But what are we talking about here? Well, ‘nudify’ apps and services use AI to create what is technically called non-consensual intimate imagery (NCII) by ‘undressing’ images. AI can manipulate photos and videos of real people to make them appear nude, almost always without their consent, and the subjects are almost always women.
Colloquially, it’s known as deepfake porn, and it’s becoming astonishingly popular.
Millions of visitors
Graphika found that 24 million people visited 34 NCII providers in September alone.
Additionally, the number of links advertising such services on social media increased by more than 2400 per cent.
Graphika says the primary driver of this growth is the increasing capability of AI.
“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that previous deepfakes were often blurry.
The Graphika report said AI allowed a larger number of providers to easily and cheaply create photorealistic NCII at scale.
“Without such providers, their customers would need to host, maintain, and run their own custom image diffusion models – a time-consuming and sometimes expensive process,” the report stated.
Graphika found that instead of operating as an underground movement, NCII providers are using traditional business tactics, including advertising, influencer marketing, customer referrals and online payments. Several providers accept payments through major companies such as Visa and Mastercard.
Unknowing victims
All that technical jargon is confronting, but at the base of all this is people using technology to create porn of unwilling and often unknowing victims.
A few jurisdictions have made the practice illegal, but the very nature of the internet and AI makes it hard to prosecute perpetrators. Often the victim doesn’t even know the images have been created, and if they do, they may not have the financial resources to pursue litigation. Because there are often no criminal laws covering the practice, victims usually have to rely on civil proceedings.
Targeting celebrities was initially the most common use, but the practice is trickling down into everyday life.
“We are seeing more and more of this being done by ordinary people with ordinary targets,” Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, told the Financial Review. “You see it among high school children and people who are in college.”
According to US outlet ABC News, a town in Spain was rocked when a group of schoolgirls received fabricated nude images of themselves created using the technology.
Smartphones are weapons
“Today a smartphone can be considered as a weapon,” Jose Ramon Paredes Parra, the father of a 14-year-old victim, told the outlet.
Police believe people known to the victims used social media pictures and uploaded them to a nudify app.
ABC News tracked down the team behind the app, who claimed the ‘service’ was only there to make “people laugh”.
“By them laughing on it we want to show people that they do not need to be ashamed of nudity, especially if it was made by neural networks,” the team explained via email.
Graphika concluded that the creation and dissemination of NCIIs will very likely lead to further instances of online harm, including targeted harassment campaigns, sextortion and the generation of child sexual abuse material.
Do you think Australia should ban such imagery? Do you think that’s possible? Why not share your opinion in the comments section below?
I have no problem with nudity and neither should anyone else. In any case, as technology moves on, this sort of thing is inevitable. Taking it one step further, actors are now complaining that they can be replaced with digital characters. We are at the stage where any image at all can be created and animated. You can no longer believe that any video you see and hear is a recording of something that actually happened.
It’s not so much that nudity is the problem; it’s the lengths some creeps will go to in order to unlawfully exploit others that is the problem.
“Do you think Australia should ban such imagery? (sic) Do you think that’s possible? (sic)”
Insofar as this can be and is being used for criminal purposes, particularly the sexual exploitation of both adults and, especially, minors, yes, very much so, and the sooner the better.