YouTube creators now have to label AI-generated content if it looks realistic. The change was announced last year and is now coming into force, CNN reports.

Now, when uploading a video, authors must complete a disclosure checklist about the video's origin. For example, it asks whether the video depicts a real person saying or doing something that never happened.

Authors will also have to indicate whether they have altered footage of a real place or event, and whether the video shows a realistic scene that did not actually happen.


Disclosing this information has a clear purpose: it should help viewers avoid confusion about whether the content they are watching is real. All of this is happening against the backdrop of a proliferation of new AI tools that make it quick and easy to create text, images, video, and audio that are sometimes difficult to distinguish from the real thing.

Realistic AI content will be marked with special labels on YouTube. Authors who fail to disclose the artificial origin of their videos may face sanctions, potentially including removal of the content or suspension of monetization.