Instagram is recommending sexual videos to underage users, according to tests conducted by the Wall Street Journal and Northeastern University.

The tests, which were run over a seven-month period, showed that the social network continues to offer adult content to minors, despite Meta’s promises to provide a more age-appropriate experience.

Test accounts created for 13-year-old users were exposed to suggestive content from the very first minutes of use, as demonstrated by research conducted by The Wall Street Journal and Laura Edelson, a professor of computer science at Northeastern University in Boston, US. Instagram’s recommended videos, or Reels, included women dancing suggestively or posing in ways that emphasised their breasts, the WSJ explains.

Worse still, the researchers found that the platform recommended even more suggestive Reels when users watched this type of video in its entirety and skipped past tamer content, regardless of the account's registered age.

As a result, adult content creators appeared in the test accounts' feeds in less than three minutes, the WSJ says. The researchers found that after less than 20 minutes spent watching Reels, the test accounts' feeds were dominated by promotions for these kinds of content creators, some offering to send nude photos to users who interacted with their posts.

Is Instagram more subversive than TikTok?

Contrary to what might be expected, Snapchat and TikTok did not promote the same sexualised content to underage users: “All three platforms also say that there are differences in what content will be recommended to teens,” Edelson explains in the WSJ. “But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.”

Meta dismissed the tests as unrepresentative of its moderation efforts: "This was an artificial experiment that doesn't match the reality of how teens use Instagram," company spokesman Andy Stone told the WSJ.

Yet The Wall Street Journal reports that internal tests conducted by Meta employees in 2021 surfaced similar issues. A separate internal analysis from 2022 shows that the company had long been aware that Instagram was showing more pornography, violence and hate speech to young users than to adults.

According to that document, teenagers saw three times more prohibited posts containing nudity, 1.7 times more violence and 4.1 times more bullying content than users over 30. It also concluded that the tools Meta had put in place were not effective enough to overcome these issues.

The Reels algorithms work, in part, by detecting users’ interests based on which videos they watch for longer than others, and recommending similar content. According to Meta’s guidelines, sexually suggestive content should not be recommended to users of any age, unless it comes from accounts they have chosen to follow.

Users under 16 shouldn't see sexually explicit content at all, Meta reiterated last January. In more than a dozen manual tests conducted by the Wall Street Journal and Northeastern University, the underage accounts didn't follow anyone or search for anything, so as to avoid any activity that might influence Instagram's recommendations. Even so, the platform still ended up recommending sexually suggestive content.

To address these dangers, the 2022 document proposed building a separate recommendation algorithm for teenagers. The social networking giant did not adopt the measure. – AFP Relaxnews