Instagram algorithm boosted ‘vast pedophile network,’ alarming report claims

Instagram’s recommendation algorithms linked and even promoted a “vast pedophile network” that advertised the sale of illicit “child-sex material” on the platform, according to the findings of an alarming report published Wednesday.

Instagram allowed users to search by hashtags related to child-sex abuse, including graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw – the latter being an acronym meaning “minors not safe for work,” researchers at Stanford University and the University of Massachusetts Amherst told the Wall Street Journal.

The hashtags directed users to accounts that purportedly offered to sell pedophilic materials via “menus” of content, including videos of children harming themselves or committing acts of bestiality, the researchers said.

Some accounts allowed buyers to “commission specific acts” or arrange “meet ups,” the Journal said.

When reached for comment by The Post, a Meta spokesperson said the company has since restricted the use of “thousands of additional search terms and hashtags on Instagram.”

The Thinking Conservative Editor’s Question: Why is this information coming out now? Is TikTok trying to take down Instagram?

The spokesperson added that Meta is “continuously exploring ways to actively defend against this behavior” and has “set up an internal task force to investigate these claims and immediately address them.”

“Child exploitation is a horrific crime,” a Meta spokesperson said in a statement. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”

Meta pointed to its extensive enforcement efforts related to child exploitation.

The company said it disabled more than 490,000 accounts that violated its child safety policies in January and blocked more than 29,000 devices for policy violations between May 27 and June 2.

Meta also took down 27 networks that spread abusive content on its platforms from 2020 to 2022.

The Journal noted that researchers at both Stanford and UMass Amherst discovered “large-scale communities promoting criminal sex abuse” on Instagram.

When the researchers set up test accounts to observe the network, they began receiving “suggested for you” recommendations for other accounts that purportedly promoted pedophilia or linked to outside websites.
