Apps and websites that use artificial intelligence to generate fake nude images of women from real photos are growing increasingly popular online, according to a study by Graphika, Bloomberg reports.

Graphika, a firm that analyzes social media, found that in September alone 24 million people visited these “undressing” websites. Many of the services also advertise on popular social networks.

Since the beginning of this year, the number of links advertising such apps on social networks, including X and Reddit, has risen by more than 2,400%. The services use AI to generate a nude version of an ordinary photo, and many of them work exclusively with images of women.

These apps are part of a disturbing trend: the creation and distribution of non-consensual pornography, enabled by advances in AI. Experts say developments in artificial intelligence have made such services easier to use and their output more convincing.

“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika.

The Electronic Frontier Foundation (EFF), an international digital rights nonprofit, noted that many victims are unaware their photos have been used to create nude images. Those who do find out often struggle to get law enforcement to investigate or to afford the cost of a lawsuit.