Instagram, the photo-centric social media platform owned by Meta, helps link and promote a "vast" network of pedophile accounts that openly advertise illicit "child-sex material for sale," The Wall Street Journal reported.
Investigations by researchers at Stanford University, the University of Massachusetts Amherst and the Journal found that Instagram does not merely host illicit content: Its algorithms actively promote it. The platform connects pedophiles to content sellers through a niche recommendation system that enables people to search explicit hashtags. It then connects them to accounts that use those terms to promote "child-sex material for sale."
Instead of openly publishing illicit sex material, Instagram accounts that offer such content often post "menus" inviting buyers to commission specific acts, including videos of children harming themselves or committing acts of bestiality, Stanford Internet Observatory researchers found.
According to the Journal, some accounts make children available for in-person "meet ups" for the right price.
A Meta spokesperson told the New York Post the company has restricted the use of "thousands of additional search terms and hashtags on Instagram" since the Journal's report and said Meta is "continuously exploring ways to actively defend against this behavior."
The company said it has also "set up an internal task force to investigate these claims and immediately address them."
"Child exploitation is a horrific crime," the spokesperson said in a statement. "We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it."
Meta pointed to the 27 pedophile networks it has dismantled over the past two years and said it is working to prevent its systems from recommending that potentially pedophilic accounts connect with one another or interact with each other's content.
Alex Stamos, the head of the Stanford Internet Observatory who served as Meta's chief security officer until 2018, told the Journal a sustained effort would likely be needed to throttle even clear instances of abuse.
"That a team of three academics with limited access could find such a huge network should set off alarms at Meta," he said, adding the company has much more effective tools to find pedophile accounts than outsider researchers do. "I hope the company reinvests in human investigators."
The buyers and sellers of underage sex content are just one part of the sexualized child content landscape. Other Instagram accounts that seemingly belong to the platform's pedophile community gather pro-pedophilia memes or talk about their access to children, according to the Journal.
Current and former Meta employees who have worked on child-safety initiatives at Instagram told the Journal that the number of accounts existing mainly to follow child-sex content is at least in the hundreds of thousands, if not millions.
In January, Meta took down 490,000 accounts for violating its child safety policies, a company spokesperson said.
"Instagram is an on ramp to places on the internet where there's more explicit child sexual abuse," Brian Levine, director of the UMass Rescue Lab, which studies online child victimization and builds forensic tools to fight it, told the Journal.
While the Stanford researchers found some similar sexually exploitative activity on other social media platforms, they said the problem on Instagram is especially alarming.
"The most important platform for these networks of buyers and sellers seems to be Instagram," they wrote in a report set to be released June 7.