Apple Explains Pullback from CSAM Photo-Scanning

Originally published at: Apple Explains Pullback from CSAM Photo-Scanning - TidBITS

In a letter responding to a child safety group, Apple has outlined its reasons for dropping its proposed scanning for child sexual abuse material (CSAM) in iCloud Photos. Instead, the company is focusing on its Communication Safety technology, which detects nudity in transferred images and videos.


They pretty much came to agree with the points that a lot of folks have been making, both generally and in earlier TidBITS threads. From the letter:

It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories. How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution? Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole. Also, designing this technology for one government could require applications for other countries across new data types.


And Nick Heer has some thoughts about how this intersects with the UK's efforts to require back doors for end-to-end encryption technologies.


Interesting article – I do think Nick is misinterpreting what the British minister said. Translated from political-speak, he’s essentially saying that 1) the British government won’t make companies do something they can’t; 2) it will consult with them to see whether building the capability is possible; and 3) if a company convincingly argues that it’s not possible, then the government can’t hold it liable for not doing it.

It’s a definite retreat for the British government and it’s actually not a bad final policy to settle on.


With a lot of these AI applications, no one talks about false positives. These will generate a lot of manual inspection of users’ photos, possibly including looking at their full libraries. Then they have to inform the police, who will decide whether to investigate.
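
A rough back-of-the-envelope sketch (in Python, with made-up numbers – Apple never published a per-image error rate for the abandoned system) of why even a tiny false-positive rate adds up to a real review workload at iCloud Photos scale:

```python
# Illustrative only: how a small per-image false-positive rate turns into
# a steady stream of images a human reviewer would have to inspect.
# Both figures below are assumptions, not Apple's numbers.

photos_scanned_per_day = 1_500_000_000   # assumed daily upload volume
false_positive_rate = 1e-6               # assumed 1-in-a-million per-image error rate

false_flags_per_day = photos_scanned_per_day * false_positive_rate
print(f"Expected false flags per day: {false_flags_per_day:,.0f}")
# -> 1,500 innocent images per day queued for manual review,
#    before counting any true positives at all.
```

The exact numbers don’t matter; the point is that multiplying a very small error rate by a very large scan volume still leaves humans looking at a lot of ordinary people’s photos.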