YouTube will send messages to users who post offensive comments. If that doesn’t help, they get a 24-hour timeout

The YouTube comment section has never been an especially polite place (although the same could be said of much of the internet). Insults, threats, hate speech – just an ordinary Thursday for commenters. Previously, the service tried showing a warning as a comment was being written, which was meant to restrain the commenter somewhat. Now it has taken a more serious step in the fight against abusive comments.

YouTube will send a message to a user whose comment has been removed for violating the platform’s rules. If the user keeps behaving this way, they will be blocked from posting new comments for the next 24 hours.

The company says it tested the approach before rolling it out, and that research showed the notifications and timeouts were effective.

For now, these commenter “behavior improvements” only apply to English-language comments, although the warning shown while typing a comment is also available in Spanish. The company plans to expand support to other languages in the future.

“Our goal is to both protect creators from users trying to negatively impact the community via comments, as well as offer more transparency to users who may have had comments removed [due] to policy violations and hopefully help them understand our Community Guidelines,” the company said in a statement.

Of course, if a user believes a comment was mistakenly deemed offensive, there is a feedback form. It is unclear, however, whether submitting it will get the deleted comment restored.

YouTube also notes that, thanks to AI, it removed 1.1 billion spam comments in the first half of 2022 alone; the same systems help track and remove bots in live chats. But AI still struggles with insults, since commenters use plenty of slang and deliberately misspell words, which makes handling other languages even harder.