Instagram lets monsters search for sick child abuse images on app, says probe

INSTAGRAM does not block searches for sick child abuse images, an investigation has found.

Its algorithms allow paedophiles to seek out accounts which commission and sell underage sex content.


A probe by the Wall Street Journal and university researchers found Insta allows searches for terms linked to underage sex.

Paedos can then connect to users offering depraved material.

Meta, which owns Instagram, has removed 27 paedophile networks in the past two years.

But Meta’s ex-chief security officer Alex Stamos, who co-led the investigation, said the findings should be of grave concern.

He said: “That a team of three academics with limited access could find such a huge network should set off alarms at Meta.”

Former health minister Lord Bethell, who is campaigning for tougher social media regulation, slammed the tech titan Meta, previously known as Facebook.

He said: “This is shocking and there have to be consequences. If they were a shop selling magazines with child abuse images, the police would be there in a heartbeat.

“If that means jail for bosses who fail to stamp this out then so be it.”

Former digital minister Damian Collins said: “This is appalling and Instagram needs to be picking this kind of thing up.”

A Meta spokesman said: “Child exploitation is a horrific crime.

“We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.

“Predators constantly change tactics and that’s why we have strict policies and tech to prevent them from finding or interacting with teens on our apps.”
