New CSAM Detection Details Emerge Following Craig Federighi Interview

See my rather detailed summary of Apple’s technical summary document.

This is the definition of how Threshold Secret Sharing (TSS) works: a shared secret is split into multiple cryptographically encoded pieces such that anyone holding at least the threshold number of pieces can reconstruct the secret, but anyone holding fewer cannot.

The threshold is arbitrary, but it must be determined at the time the secret (in this case, the per-device encryption key for the security voucher content) is generated and split into pieces.

It would be a completely useless technology if someone without the secret (meaning any software not running on your phone) could suddenly change the threshold.
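To make that concrete, here’s a minimal sketch of the standard construction behind threshold secret sharing (Shamir’s scheme): the secret becomes the constant term of a random polynomial whose degree is one less than the threshold, each share is one point on that polynomial, and you need at least threshold points to interpolate the constant term back out. To be clear, this is not Apple’s implementation; their document only says a threshold scheme is used, so the tiny prime field, the share layout, and the numbers below are purely illustrative.

```swift
// Toy Shamir-style threshold secret sharing over a small prime field.
// Any `threshold` shares reconstruct the secret; fewer reveal nothing about it.
let p = 2_147_483_647  // prime modulus (2^31 - 1); a real scheme uses a far larger field

// Horner evaluation of a polynomial (constant term first) modulo p.
func evaluate(_ coefficients: [Int], at x: Int) -> Int {
    var result = 0
    for coefficient in coefficients.reversed() {
        result = (result * x + coefficient) % p
    }
    return result
}

// Split `secret` into `shareCount` shares. The threshold is fixed here, at
// generation time: it is one more than the degree of the random polynomial.
func split(secret: Int, threshold: Int, shareCount: Int) -> [(x: Int, y: Int)] {
    var coefficients = [secret]                     // constant term is the secret
    for _ in 1..<threshold {
        coefficients.append(Int.random(in: 1..<p))  // random higher-order terms
    }
    return (1...shareCount).map { x in (x: x, y: evaluate(coefficients, at: x)) }
}

// Modular exponentiation, used for modular inverses via Fermat's little theorem.
func modpow(_ base: Int, _ exponent: Int) -> Int {
    var result = 1, b = base % p, e = exponent
    while e > 0 {
        if e & 1 == 1 { result = result * b % p }
        b = b * b % p
        e >>= 1
    }
    return result
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret)
// from any `threshold` or more shares.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, share) in shares.enumerated() {
        var numerator = 1, denominator = 1
        for (j, other) in shares.enumerated() where j != i {
            numerator = numerator * (p - other.x) % p                    // (0 - x_j) mod p
            denominator = denominator * ((share.x - other.x + p) % p) % p
        }
        let weight = numerator * modpow(denominator, p - 2) % p
        secret = (secret + share.y * weight % p) % p
    }
    return secret
}

// Split a toy secret into 30 shares, any 10 of which reconstruct it
// (numbers chosen arbitrarily, not Apple's actual parameters).
let shares = split(secret: 123_456_789, threshold: 10, shareCount: 30)
print(reconstruct(Array(shares.prefix(10))))           // 123456789
print(reconstruct(Array(shares.shuffled().prefix(9)))) // some value unrelated to the secret
```

The point to notice: the threshold is determined by the degree of the polynomial chosen at the moment the shares are generated. A server holding fewer than that many shares can’t lower it after the fact; the only way to change the threshold is to go back to the device that knows the secret and have it produce an entirely new set of shares.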

Of course, Apple could change the threshold, then push out an iOS update that forces your phone to re-generate and re-upload every security voucher.

In the same way, they could choose to ignore all this cryptographic pretense and simply upload everything to a government database without telling anyone. Or they could save themselves a lot of bad PR, change nothing at all, and simply grant governments secret backdoor access to all the photos they already store on their servers (which are not end-to-end encrypted, so Apple holds the keys).

If you’re so concerned about how Apple could change these algorithms in the future, why aren’t you even more concerned about how they could do all this, and much, much worse, with the data that is already stored on their servers without any of this protection?

Ah, but in this case, no “backdoor” was ever necessary, because all of the photos in question are already stored on Apple’s servers in a form Apple can decrypt. Apple can and does grant law enforcement access to this data when presented with a warrant.

If you’re afraid Apple will choose to (or be forced to) become evil, they can do what you’re afraid of without any of this incredibly complicated cryptographic machinery.

It’s nothing like prior claims that a device’s SSD encryption key cannot be extracted from the Secure Enclave without knowing the device’s passcode.

Yes, there is a concern that Apple may change the software to start scanning and reporting images that aren’t uploaded to iCloud. To that I’ll just add that Apple has been doing on-device photo scanning for many, many years already. How do you think Photos automatically makes albums based on who is in each photo, and generates “Moments” from your library?

If you’re concerned about Apple abusing their CSAM-scanning technology, are you equally concerned about all of the other scanning that takes place on-device? Apple could just as easily repurpose that code for government surveillance, but nobody has even mentioned that possibility.

Once the discussion goes beyond “how can this software be abused” into “what could it do in the future if Apple changes it”, then you’re calling into question whether any part of any operating system can ever be trusted. That’s a completely separate discussion whose validity has not been changed by any of Apple’s recent announcements.
