I’m beginning to think a remedial reading course should be a requirement for posting on the internet. The hysteria being displayed by everyone is quite insane.
Here’s what I read on the subject:
Software on your iPhone will create hashes for photos being uploaded to iCloud photo storage. Those hashes will be compared against a database of hashes of known, widely distributed illegal images reported to the NCMEC (National Center for Missing and Exploited Children).
If a sufficient number of those hashes match, the photos whose hashes match will be decrypted and examined by a human. If those photos do contain CSAM (Child Sexual Abuse Material), the account will be frozen and the matter will be reported to the NCMEC.
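If it helps, here's a rough sketch of the threshold idea in Swift. It's purely illustrative: the real system uses a perceptual hash (NeuralHash) plus cryptographic machinery like private set intersection and threshold secret sharing, not a plain SHA-256, and every name and number below is a made-up placeholder, not Apple's actual API.

```swift
import Foundation
import CryptoKit

// Stand-in for a perceptual hash. A real perceptual hash tolerates resizing and
// re-encoding; SHA-256 is used here only to keep the sketch self-contained.
func fingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of fingerprints of known, reported images.
let knownHashes: Set<String> = [
    fingerprint(Data("known-bad-example-1".utf8)),
    fingerprint(Data("known-bad-example-2".utf8)),
]

// Hypothetical upload queue: only photos headed to iCloud photo storage are checked.
let uploadQueue: [Data] = [
    Data("vacation.jpg".utf8),
    Data("known-bad-example-1".utf8),
]

// Illustrative threshold: nothing is decrypted or looked at until this many matches.
let reviewThreshold = 30

let matches = uploadQueue.map(fingerprint).filter { knownHashes.contains($0) }.count

if matches >= reviewThreshold {
    // Only at this point would the matching items be decrypted for human review.
    print("Threshold reached: escalate \(matches) matches to human review")
} else {
    print("Below threshold (\(matches) of \(reviewThreshold)); nothing is reviewed")
}
```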
Nowhere does it say it will be scrubbing all images on your device - it will only be checking that files being uploaded to iCloud photo storage don't match the fingerprints of widely distributed, known CSAM. Google, Microsoft, and every other organization hosting image data already do this to comply with US law - Apple's just catching up on iCloud photo storage. They didn't even say whether they'd be doing this for photos already in iCloud photo storage, though to be in compliance they may have to do that too.
The whole purpose of this exercise is to ensure that pedophiles aren't passing around photos, brazenly putting them in their iDevice photo libraries, and uploading them to iCloud photo storage.
This is not content scanning of your images - so your sexts or shared adult photos won't be affected. This is only being done for known, widely distributed CSAM reported to the NCMEC, and it uses only image-level fingerprinting.
We all have secrets to hide, and deserve our privacy - but we also have to realize that some things go beyond the pale and should be stopped.
If I were Apple, I’d want to make sure that none of that trash sat on my servers too.
This is separate from the other initiatives protecting your children under 13 from possible adult content - those do involve scanning inbound or outbound images using AI and notifying the parent.
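For what it's worth, that feature is conceptually just a local classifier plus a notification rule. Here's a purely hypothetical sketch - the classifier, names, and threshold are all placeholders I made up, not Apple's API:

```swift
import Foundation

struct IncomingImage {
    let data: Data
}

// Hypothetical on-device classifier returning a 0...1 "explicit" score.
// A real implementation would run a local ML model; this stub returns 0.
func explicitScore(_ image: IncomingImage) -> Double {
    return 0.0
}

func handleIncoming(_ image: IncomingImage, parentNotificationsEnabled: Bool) {
    let threshold = 0.9  // illustrative cutoff
    if explicitScore(image) >= threshold {
        print("Blur the image and warn the child before viewing")
        if parentNotificationsEnabled {
            print("Notify the parent that a sensitive image was received")
        }
    } else {
        print("Deliver the image normally")
    }
}

handleIncoming(IncomingImage(data: Data()), parentNotificationsEnabled: true)
```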
Yes, Apple could be forced to do things beyond the scope of the current initiatives, but I’m sure Apple would fight those changes with every lawyer they could muster.