Instagram is preparing to launch a new safety feature that will blur nude images sent to minors in direct messages, The Verge reports.

In addition, the feature will discourage users from sending such photos and require an extra confirmation step before they are sent. It will be enabled by default for teenage Instagram users, and notifications will encourage adults to turn it on as well.

The effort follows ongoing criticism that platforms like Facebook and Instagram are harming their youngest users.

The new feature will be tested in the coming weeks, with a global rollout expected within the next few months. Meta says the feature uses on-device machine learning to analyze whether an image sent through Instagram's direct messaging service contains nudity, and that the company will not have access to these images unless they are reported to it.
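Meta has not published implementation details; the key point of the design is that the nudity check runs on the user's device, so the image itself never has to reach Meta's servers. The sketch below is a purely hypothetical illustration of that flow in Python. The classifier stub, threshold, and function names are assumptions for the sake of the example, not Instagram's actual code.

```python
# Illustrative sketch of on-device screening for incoming DM images.
# NOT Meta's implementation: nudity_score() is a placeholder standing in
# for whatever local image classifier the app would actually ship.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ScreeningResult:
    blurred: bool
    warning: Optional[str]


def nudity_score(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML model returning a nudity score in [0, 1].

    In a real system this would run a local classifier; the raw bytes never
    leave the device, which is the privacy property described in the article.
    """
    return 0.0  # stub value for illustration


def screen_incoming_image(image_bytes: bytes, protection_enabled: bool) -> ScreeningResult:
    """Decide whether to blur an incoming image and show a warning."""
    if not protection_enabled:
        return ScreeningResult(blurred=False, warning=None)

    score = nudity_score(image_bytes)
    if score > 0.8:  # arbitrary illustrative threshold, not a known value
        return ScreeningResult(
            blurred=True,
            warning="This image may contain nudity. You can view it, "
                    "block the sender, or report the chat.",
        )
    return ScreeningResult(blurred=False, warning=None)
```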

When the protection is enabled, Instagram users who receive nude photos will see a warning screen along with options to block and report the sender.

“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” says Meta.

Users who try to send nude photos will also see a message warning them about the dangers of sharing sensitive photos.