Even after receiving complaints, Instagram has failed to remove accounts that publish photos of partially dressed children or children in swimsuits. Such pages attract hundreds of sexualized comments, The Guardian reports.
Meta, Instagram's parent company, insists it has zero tolerance for child exploitation. Yet automated review found no violations in the accounts reported through Instagram's internal complaint tool, so they remain active.
In one case, a page published photos of children in sexualized poses. After a researcher reported it to the support service, he was told the same day that the complaint could not be reviewed "due to the high volume" of reports, but that "technology has found that the account probably does not go against community guidelines." As of Saturday the account was still live, with more than 33,000 followers. Similar pages can also be found on Twitter.
The effectiveness of the complaint tools is a matter of concern. Critics say such pages may be left up because their content does not cross the criminal threshold, even though it is linked to suspected illegal activity. The accounts are often used as bait: technically legal images are posted publicly, while meetings are arranged in private chats to distribute other material.
Andy Burroughs, head of online child safety policy at Britain's National Society for the Prevention of Cruelty to Children (NSPCC), described such accounts as a showcase for pedophiles. Because social networks do not treat them as a threat, he called on lawmakers to finalize a bill on internet safety that would regulate the conduct of social media companies. It is due to be debated in the British Parliament on April 19.
Imran Ahmed, head of the British Center for Countering Digital Hate, says: "Relying on automated detection, which is known to struggle with even simple hate speech, let alone cunning and determined child-exploitation groups, is an abdication of the fundamental duty to protect children."