YouTube is cracking down on content that "realistically simulates" deceased minors or victims of deadly or violent events describing their own deaths, TechCrunch reports.

To that end, the platform is updating its harassment and cyberbullying policy. Starting January 16, such content will be removed from YouTube.

The change comes in response to content creators who use artificial intelligence to give underage victims of high-profile crimes a childlike voice to narrate what happened to them.

YouTube will do more than remove such content: violators will also receive strikes and be barred from uploading videos for a week. After three strikes, a user's channel will be permanently removed from the platform.

Last fall, YouTube announced that it would require creators to label realistic-looking content made with artificial intelligence.

At the time, the company said the new rules would apply to content that uses AI tools to realistically depict events that never happened, including videos of people saying or doing things they never actually did.