Instagram’s Algorithms Connect ‘Vast Paedophile Network’; Meta Creates Task Force

The popular photo-sharing application Instagram promoted and connected a “vast network of paedophiles” seeking illegal underage sexual content on the platform, according to a recent report.

The Wall Street Journal (WSJ) reported that Instagram’s recommendation algorithms connected buyers and sellers of self-generated child sexual abuse material (SG-CSAM), and that paedophiles openly advertised the sale of illicit “child-sex material” on the platform.

A joint investigation by the WSJ and researchers at Stanford University and the University of Massachusetts Amherst found that Instagram recommended the accounts of paedophiles to buyers seeking child sex abuse material. According to the WSJ report, more than 1,000 accounts were advertising self-generated child sexual abuse material on Instagram. The report said, “The platform’s recommendation algorithms effectively advertise SG-CSAM.”

In response, Meta, the American multinational technology conglomerate that owns Facebook, Instagram, and WhatsApp, set up a task force to investigate the report’s claims.

A Meta spokesperson said in a statement, “We’re continuously exploring ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them.” 

Reportedly, around 31 accounts that had been referred to the National Center for Missing and Exploited Children were still active on Instagram.

Elon Musk, one of the richest men in the world, also weighed in, calling the findings about Instagram “extremely concerning.”

The researchers found 128 accounts offering to sell child sex abuse material on Twitter, less than a third of the number they found on Instagram.

Over the past two years, Meta has taken down dozens of abusive networks from its platforms and disabled more than 490,000 accounts for child safety violations.