A study by The Wall Street Journal and Laura Edelson, a computer science professor at Northeastern University, found that Instagram shows teens more sexually explicit, violent, and harassing content than it shows adults.

The tests ran from January to April 2024. The researchers created new accounts registered with an age of 13, followed no one and searched for nothing. Yet after just three minutes of watching Reels, the platform began recommending sexual content to these accounts. When the accounts skipped other clips but watched the racy videos to the end, Reels recommended even more explicit material.

After 20 minutes of watching Reels, almost all of the videos in the feed featured similar content, and some of the videos offered to send nude photos to users who left a comment or otherwise interacted with the posts.

Similar tests conducted on TikTok and Snapchat, by contrast, showed that those platforms did not recommend comparable sexual content to underage users.

The Wall Street Journal also obtained documents describing Meta’s internal tests of content recommendations on Instagram. According to a former employee, the company ran similar tests in 2021 and got similar results.

Another former Meta analyst disclosed that the company has long been aware that sexual and violent videos on Instagram are recommended to minors. Among other proposals, employees suggested building a separate recommendation algorithm for teenagers, but the idea was rejected.

The documents also show that underage Instagram users receive more recommendations for sexually explicit and violent videos than adults do: teenagers see three times as many sexual videos, 1.7 times as many violent videos, and 4.1 times as many bullying videos as users aged 30 and older.

The dangers of social media for minors remain an active topic of debate, including the impact of Facebook and Instagram. Meta has taken steps to regulate recommendations and remove unwanted content from teenagers’ feeds, even introducing a feature that blocks nude photos sent to teens, but these tests show that such measures are clearly not enough.