Instagram’s algorithm serves videos to adults whom it thinks might have a prurient interest in children, according to an investigation by The Wall Street Journal.

The social media app, which is owned by Meta, uses an algorithm to show users Reels, short videos on topics they have shown an interest in, most often sports, fashion, comedy or cooking.

But when the Journal looked at what the algorithm would recommend to test accounts that followed only pages featuring teen and preteen influencers, such as gymnasts and cheerleaders, it found that the social media platform "served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands."

The paper reported that it set up the test accounts after noticing that the thousands of followers of such influencers’ accounts included "large numbers" of adult men. In addition, "many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults," according to the report.

When the Journal’s test accounts followed those adult men, the outlet said, the Instagram algorithm "produced more-disturbing content interspersed with ads." The paper said the child-protection group Canadian Centre for Child Protection ran similar tests independently and got similar results.

Specifically, the Journal found that in one series of videos recommended by Instagram, an advertisement for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting her shirt to expose her midriff. An ad for Pizza Hut was followed by a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.

Other companies whose ads appeared next to inappropriate content in the Journal's investigation included Disney, Walmart, online dating company Match Group, Hims, which sells erectile-dysfunction drugs, and The Wall Street Journal itself, the outlet reported. When notified by the paper, several of the companies said Meta told them it was investigating the report and would pay for brand-safety audits from an outside firm. 

Samantha Stetson, Meta’s vice president of client council and industry trade relations, told Fox News Digital that the company doesn’t want this kind of content on its platforms and that brands don’t want their ads to appear next to it.

"We continue to invest aggressively to stop it - and report every quarter on the prevalence of such content, which remains very low," she said. Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions."

 "These results are based on a manufactured experience that does not represent what billions of people around the world see every single day when they use our products and services," she added. "We tested Reels for nearly a year before releasing it widely - with a robust set of safety controls and measures. In 2023, we actioned over 4 million Reels per month across Facebook and Instagram globally for violating our policies."

In June, The Wall Street Journal, working with researchers at Stanford University and the University of Massachusetts Amherst, reported that Instagram promoted a "vast network of accounts openly devoted to the commission and purchase of underage-sex content." After the article was published, a Meta spokesperson said the company had set up a task force and was expanding its automated systems for detecting users who behave suspiciously, taking down tens of thousands of such accounts each month.