The FBI has warned of a rise in the use of artificial intelligence to create fake explicit videos used to harass and extort victims. This was reported by Ars Technica.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” officials wrote. “The photos or videos are then publicly circulated on social media or pornographic websites for the purpose of harassing victims or sextortion schemes.”

The bureau explained that scammers often obtain victims’ photos from social media or other sources and use them to create sexually explicit images. Software and cloud services for producing fake videos are freely available on the Internet, and advances in AI have significantly improved their quality. A single image of a person’s face is now enough to create a realistic fake video.

“Many victims, which have included minors, are unaware their images were copied, manipulated, and circulated until it was brought to their attention by someone else. The photos are then sent directly to the victims by malicious actors for sextortion or harassment, or until it was self-discovered on the Internet,” the FBI added.

The bureau urged people to exercise caution when sharing images of themselves online, to reduce the risk of those images being used to create fake content.

“Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity,” the FBI explained.

It was previously reported that two developers used OpenAI’s DALL-E 2 image generation model to build forensics software that generates “hyper-realistic” police sketches of a suspect based on user input. The program is meant to cut the time it usually takes to draw a suspect, which is about two to three hours.
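For context on the underlying capability, generating an image with DALL-E 2 amounts to a single API call that turns a text description into a picture. The snippet below is a minimal, hypothetical sketch using OpenAI’s Python client from the DALL-E 2 era; the prompt and API key are placeholder assumptions, and it does not reproduce the developers’ actual forensics software.

```python
import openai  # 0.x-era OpenAI client, contemporary with DALL-E 2

openai.api_key = "sk-..."  # placeholder; set your own API key

# Turn a textual description of a suspect's features into an image.
# The prompt below is a made-up example, not the developers' real template.
response = openai.Image.create(
    prompt=(
        "photorealistic frontal portrait of a middle-aged man, short dark "
        "hair, square jaw, thin scar above the left eyebrow"
    ),
    n=1,             # number of images to generate
    size="512x512",  # DALL-E 2 supports 256x256, 512x512, and 1024x1024
)

print(response["data"][0]["url"])  # URL of the generated image
```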