REPORT: Instagram Algorithm ‘Connects’ P*dophiles With Kids, Promotes Their Networks

A new investigation of Instagram has found the social network’s algorithm facilitates connections between pedophiles and sellers of illegal content dedicated to fulfilling their twisted desires.

The Wall Street Journal, in partnership with researchers at Stanford University and the University of Massachusetts Amherst, has exposed the vast reach of pedophilia purveyors who use hashtags to connect buyers and sellers offering a menu of options — anything from children in sexual situations with animals to offers for “meet-ups” between children and buyers offering the “right price,” writes the Journal.

Exposure on Instagram brings pedophiles beyond the confines of obscure file-sharing networks and onto one of the world’s most popular social media networks, one that has been criticized in the past for its negative effects on teenagers. Researchers were able to uncover vast networks of pedophiles without the aid of public safety surveillance systems or assistance from Instagram.

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta. I hope the company reinvests in human investigators,” said Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018.

Promotion of such underage content violates both parent company Meta’s terms of service and federal law.

Since receiving the Journal’s media inquiry, Instagram says it has removed thousands of pedophilia-associated accounts and blocked hashtags such as #preteensex and phrases such as “little slut for you” from being used to direct participants toward illicit content.

“Child exploitation is a horrific crime. We’re continuously investigating ways to actively defend against this behavior,” Meta said in a statement.

As part of the team’s reporting, Stanford researchers produced a quantitative analysis of the Instagram features that help niche networks find and connect users, while the UMass Rescue Lab built a framework for mapping how pedophile networks on Instagram fit into the larger online ecosystem. Together, the two teams were able to quickly identify large-scale communities promoting criminal sex abuse.

Meta maintained that it is constantly battling the forces of pedophilia on Instagram and removed nearly 500,000 accounts in January alone for violating its child safety policies. The social network has more than 1.3 billion users.
