This AI can undress any woman, and its success is terrifying.

AIs that can "undress" women are enjoying meteoric success, raising serious concerns about the risk of harassment. Unfortunately, neither social networks nor governments seem determined to fight effectively against the scourge of porn deepfakes…

Artificial intelligence can boost productivity tenfold and unleash creativity… but it can also awaken the baser human instincts.

According to researchers, websites and applications that use AI to undress women in photos are exploding in popularity.

In September 2023 alone, more than 24 million people visited such platforms, according to social network analysis company Graphika.

On the websites of these various deepfake services, the creators boast of attracting large numbers of users. Some claim more than a thousand users a day.

A phenomenon spreading on social networks

Many of these services rely on social networks for their marketing. In 2023, the number of links promoting such apps increased by 2,400% on platforms such as X and Reddit.

They use AI to alter an existing image so that the person appears naked. Most of the time, this processing only works on women…

These applications are part of a broader, deeply worrying trend of non-consensual pornography enabled by advances in artificial intelligence: porn deepfakes.

Their growing proliferation raises ethical and legal issues, as photos are often stolen from social networks and shared with other users without the victim's consent, control, or even knowledge.

How AI can remove women’s clothes

This rise in popularity is directly linked to the release of open-source AI models such as Stable Diffusion, which are based on diffusion models. These tools make it possible to create images of a quality incomparably superior to what was possible just a few years ago.

And because of their open-source nature, these models are available free of charge and without any control by their creators. As Graphika analyst Santiago Lakatos explains, "it's possible to create something that looks really realistic."

The great hypocrisy of GAFAM

On X, an advertisement for one such app features an image accompanied by text suggesting that users can create a nude image and send it to the person depicted. This is, quite literally, an incitement to harassment.

Similarly, an application advertised on YouTube appears first in the results when the word "Nudify" is searched. Yet a Google spokesperson confirms that the company does not allow sexually explicit content.

The same goes for Reddit, whose spokesperson states that the site prohibits the non-consensual sharing of falsified sexually explicit material. Several domains have been banned from its search results.

For its part, TikTok has blocked the keyword "undress", a popular search term associated with this type of service. People attempting this search on the app receive a warning.

Finally, Meta has also blocked this type of keyword on its networks. However, no platform has taken any real action to effectively combat this sordid phenomenon…

The democratization of porn deepfakes in schools

Celebrity porn montages are far from a novelty on the Internet, but experts fear that advances in AI will make deepfake software ever easier to use and more effective.

According to Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, "we're seeing more and more content produced by ordinary people with ordinary targets." This is particularly the case among high school and university students.

Many victims are not even aware that they are the subject of such images, and probably never will be. Even for those who do find out, it is very difficult to convince the authorities to investigate.

As in the United States, there is as yet no French law dealing specifically with AI-generated deepfakes. The distribution of pornographic montages is prohibited, with maximum penalties of one year's imprisonment and a €15,000 fine, but the legislative arsenal has not been updated to deal with this scourge born of technological progress…