However, advances in deepfake technology have also given rise to a website, which we will not name here for obvious reasons, that has used artificial intelligence (AI) to create nude images of women since 2019.
In fact, within a few months, in 2020, the site expanded into a paid service for partners who create nude characters. It also offers a feature that digitally "removes" clothing from non-nude photos to produce non-consensual pornography.
Based on visitor analysis by SimilarWeb, the site received 50 million visits from January through the end of October 2021. Traffic peaked in August 2021 at 6.92 million visits, and the site's creator claims "hundreds of thousands" of images are uploaded each day.
According to Wired, its traffic halved in October 2021 after the site drew media attention: its host took the site offline and the cryptocurrency exchange Coinbase (COIN) suspended its payment account.
However, the site's mysterious creator claims the algorithm has been updated several times and that a third version will be released early this year. He claims the upcoming version will allow people to "manipulate target attributes such as bust size, and pubic hair."
The anonymous man, a co-founder of the site, told Vice that it uses an algorithm called a generative adversarial network (GAN) to generate the nude content.
The majority of the images produced by this technology are of white women aged 20 to 40. The artificial intelligence is trained on many photos of naked women so that it can produce new, unique versions of what it has learned human bodies look like.
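The adversarial training behind a GAN can be illustrated with a toy one-dimensional sketch. This is only an illustration of the general GAN idea, not of the site's actual model, data, or training, none of which are public: a generator learns to imitate a target distribution while a discriminator simultaneously learns to tell real samples from generated ones.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Generator: G(z) = a*z + b, fed Gaussian noise z.
# Discriminator: D(x) = sigmoid(w*x + c), a logistic "real vs. fake" score.
# "Real" data are samples from N(4, 1); the generator starts far away, near N(0, 1).
a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters
lr = 0.02

for _ in range(4000):
    z = random.gauss(0, 1)
    x_real = random.gauss(4, 1)
    x_fake = a * z + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator step: ascend log D(fake) (the "non-saturating" GAN loss),
    # i.e. nudge the generator's output toward what the discriminator calls real.
    d_fake = sigmoid(w * x_fake + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w

# After training, generated samples should cluster near the real mean of 4.
fake_mean = sum(a * random.gauss(0, 1) + b for _ in range(1000)) / 1000
```

Image-generating GANs replace these two linear models with deep neural networks, but the alternating update loop is the same adversarial game, which is why the output mimics whatever the training set contains.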
“That’s not because of any choice on our side, but simply because that’s how the well-classified data sets end up being composed,” he explained.
“We are very careful to only use public-domain sources or to buy data from reputable providers. While we will add men in the future, the reality is there is not much demand for nude images of men.” Based on observation, there are indeed some photos of men on the site, but very few compared to the photos of women.
The idea, the man explained, is that the site offers nude pictures to those who want them without violating the privacy of real women. On the other hand, what he has built becomes troubling for his own company when the woman shown is real but her naked body is AI-generated.
Ivan Bravo, creator of a custom pornography website, expressed a similar view to Wired. He said his website not only exploits women but could also become a new style of pornography that uses male characters.
He said he will continue doing it because “it makes a good income” and “[the income] is more than enough to support a family in a decent home here in Mexico.”
Meanwhile, deepfakes have been used to humiliate and harass women from the very beginning, as the majority of the images produced target women. In 2020, the business and technology research firm Sensity reported finding a Telegram bot that used deepfakes to harass more than 100,000 women, including minors.
Also, the women on the nude imagery site may not be real, but the blueprints could be based on nude images of women who do such work, because it is very common for pornography or exclusive nude art to end up online after being stolen. He admits his company cannot entirely avoid this problem.
“If we see that the results come from monetized websites, revenge-porn websites, online forums, or from behind a paywall, we exercise caution and discard such data, as it may not have been collected ethically.”
However, the risks they created were unavoidable and have taken a heavy toll on victims. One is Kristen Bell, a Hollywood actress who discovered nude deepfakes made with her face. “I’m being exploited,” she told Vox in June 2020.
Other targets of deepfake harassment images have expressed shock, saying they do not want their children to see the images and that they are still struggling to remove them from the internet.
In one case, the victim tried to contact the police, but nothing could be done to scrub the material from the internet completely, even though doing so is an important job, explained British writer and broadcaster Helen Mort in MIT Technology Review.
“It really makes you feel helpless, like you’re being put in your place,” she said. “[They are] punished for being a woman with a public voice of any kind.”
Several experts offer approaches for stopping such sexual abuse online, ranging from legal to technical to social measures. First, according to Seyi Akiwowo of the anti-violence-against-women organization Glitch!: “We need to educate young people, adults, everyone, about the real dangers of using this and then spreading it.”
Then there is Mikiba Morehead, a researcher on cybersexual abuse and a risk-management consultant, who argues that technology can also help stop its spread, for example by identifying users and algorithmically tagging material that can then be reported as a deepfake.
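One simple technical building block for this kind of automated flagging, shown here only as an illustrative sketch and not as a description of Morehead's proposal or of any platform's actual system, is matching uploads against hashes of images already confirmed to be abusive:

```python
import hashlib

# Hypothetical blocklist: hashes of files already confirmed as abusive
# deepfakes. In practice, platforms use robust perceptual hashes
# (PhotoDNA-style) rather than exact SHA-256, so that re-encoded or
# slightly edited copies still match.
KNOWN_ABUSE_HASHES = {
    hashlib.sha256(b"example-known-deepfake-bytes").hexdigest(),
}

def should_flag(upload: bytes) -> bool:
    """Return True if the uploaded file exactly matches a known abusive image."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_ABUSE_HASHES
```

Exact hashing only catches byte-identical copies, which is why deployed systems layer perceptual hashing and machine-learning classifiers on top of it before material is tagged and reported.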
From a legal perspective, there are significant challenges to addressing them, explains Honza Červenka, a lawyer specializing in technology and non-consensual imagery. Clear laws are needed on copyright, privacy claims, and artistic licenses that can be used to remove such images from the web.
“The longer this regulatory vacuum continues, the more initiatives like this will accelerate, become industrialized, and become more difficult to regulate at a later stage,” he told Wired.
Most complicated of all, if the deepfake pornography industry becomes increasingly common, it will be difficult or impossible to track down and prosecute those who abuse the technology and distribute its output.
There should also be international regulation of cyberspace, because creators based in, for example, Asian countries cannot be prosecuted in the United States or Great Britain without extradition.