According to research by Mozilla, even when users tell YouTube they are not interested in certain types of videos, similar recommendations keep appearing, The Verge reports.
Using recommendation data from over 20,000 YouTube users, researchers found that the “not interested”, “dislike”, “stop recommending channel” and “remove from watch history” buttons were not very effective at preventing recommendations of similar content. Even in the best case, more than half of the recommendations users subsequently received still resembled the videos they had rejected.
To collect real-world data, Mozilla researchers recruited volunteers using RegretsReporter, a browser extension that overlays a generic “stop recommending” button on YouTube videos viewed by participants. Users were then randomly assigned to groups, so a different signal was sent to YouTube each time they clicked the Mozilla button — dislike, not interested, don’t recommend the channel, or remove from history — plus a control group for whom no feedback was sent to the platform.
Data was collected from more than 500 million recommended videos, and research assistants created more than 44,000 pairs of videos — each consisting of one “rejected” video and one video YouTube later recommended. The researchers then assessed the pairs manually, or used machine learning, to decide whether the recommendation was too similar to the video the user had rejected.
Compared to the baseline control group, sending “dislike” and “not interested” signals was only “slightly effective”, preventing 12% and 11% of bad recommendations, respectively. The “don’t recommend channel” and “remove from history” buttons did somewhat better, preventing 43% and 29% of bad recommendations. Even so, the researchers say the tools the platform offers are still insufficient to keep unwanted content away.
“YouTube should respect the feedback users share about their experience, treating them as meaningful signals about how people want to spend their time on the platform,” they write.
YouTube spokeswoman Elena Hernandez says this behavior is intentional, because the platform does not try to block all content related to a topic. Hernandez also criticized the report, saying the study did not take into account how YouTube’s controls are designed.
“Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” she said. “We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”
Hernandez says Mozilla’s definition of “similar” content doesn’t reflect how YouTube’s recommendation system works: the “not interested” option removes a particular video, while the “don’t recommend channel” button prevents that channel from being recommended in the future. The company says it does not intend to stop recommending all content related to a given topic, opinion or speaker.