Last year, Apple announced its intention to scan users’ photos for child sexual abuse material (CSAM), with the scanning to take place directly on users’ devices.

The feature was to scan individual users’ images and, if matching photos were detected, flag them for human review, after which the company could contact law enforcement.
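The matching step described above is typically built on perceptual hashing: an image is mapped to a short bit string such that visually similar images produce similar hashes, and a photo is flagged when its hash is close to an entry in a database of known material. Below is a minimal illustrative sketch of that general technique, using a toy 16-bit hash and a hypothetical distance threshold; it is not Apple’s actual system, which used a proprietary algorithm (NeuralHash) wrapped in a cryptographic matching protocol.

```python
# Illustrative sketch of perceptual-hash matching (NOT Apple's NeuralHash).
# A perceptual hash maps an image to a short bit string so that visually
# similar images yield similar hashes; matching compares Hamming distance
# against a database of hashes of known material.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_match(image_hash: int, known_hashes: set, threshold: int = 4) -> bool:
    """Flag the image if its hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

# Hypothetical example: a database of known hashes and a slightly altered image.
known = {0b1011010011010010, 0b0001111000111100}
altered = 0b1011010011010110  # differs from the first known hash by one bit
print(is_match(altered, known))  # True
```

The reliance on a distance threshold is exactly what critics pointed to: the same machinery flags whatever hashes are loaded into the database, regardless of content type.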

Unsurprisingly, the announcement provoked a wave of criticism. Security and privacy experts warned that the scanning feature could be repurposed to hunt for other types of content, and argued that it could become a “slippery slope to broader surveillance abuses” and a loophole for police.

At the time, Apple pushed back against these claims, but eventually announced that it would delay the rollout. Now, alongside the introduction of end-to-end encryption, the company has made clear in a statement to Wired that it is taking a different path:

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company said in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

The proliferation of CSAM is a serious problem that only grows worse over time. Apple’s attempt to address it was clearly well-intentioned, but the chosen tool was hardly the right one.