New CSAM Detection Details Emerge Following Craig Federighi Interview

The 9to5 article also states: “Other child safety features announced by Apple last month, and also now delayed, include communications safety features in Messages and updated knowledge information for Siri and Search.” So yes, it looks like it will.

2 Likes

Yes, I think so. The scenario I initially found most likely is that the NSA (or whatever other alphabet agency works with or for the NSA) subverts the CSAM database and Apple’s verifiers. This could be done without Apple executives’ knowledge; all they’d have to do is find out who these Apple employees are, tell them they’d be heroes if they cooperated, and that they’d go to jail if they refused (the average person commits three federal crimes a day, so it wouldn’t be hard to trump something up, especially given the information the NSA has on every on-the-grid American). We already know they use this sort of tactic; some of the surveillance Snowden revealed was happening without company executives’ knowledge. Now that we know another organization besides NCMEC is involved, this scenario becomes more difficult, but not impossible; the US government just hacks, subverts, or demands that the other organization add its chosen images to its database. Unless that organization is in Russia or China, it will likely submit.

But then I realized that there’s a far, far easier route than all this cloak-and-dagger stuff: the FBI presents Tim Cook with a National Security Letter that simply orders Apple to report whatever images the FBI wants, and makes clear that failing to comply, or revealing the letter’s existence, will result in prison time.

What would they be looking for? Terrorist imagery. Memes shared by known terrorist or subversive groups. Anything that would help them identify terrorists, or terrorist sympathizers, or criminals, or subversives.

1 Like

Could you provide a source for this claim? Even this very article says that this is Apple’s “initial” match threshold, which implies that they could change it at any time.

1 Like

Aah, but there’s a crucial difference: Prior to this, Apple’s response to such efforts was, “we can’t and won’t build such backdoors into our system.” Now we know that they can and will.

2 Likes

See my rather detailed summary of Apple’s technical summary document.

This is the definition of how Threshold Secret Sharing (TSS) works. A shared secret is distributed as multiple cryptographically encoded pieces such that anyone with at least the threshold number of pieces can reconstruct the secret, but anyone with fewer cannot.

The threshold is arbitrary, but it must be determined at the time the secret (in this case, the per-device encryption key for the security voucher content) is generated and split into pieces.

It would be a completely useless technology if someone without the secret (meaning any software not running on your phone) could suddenly change the threshold.
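For the curious, here’s a minimal sketch of the idea in Python, using Shamir-style secret sharing over a prime field. This is not Apple’s actual construction; the field size, share count, threshold, and “voucher key” value below are made-up illustrations. But it shows why the threshold is baked in at the moment the secret is split: the shares are points on a polynomial whose degree is fixed by the threshold.

```python
# Toy Shamir-style threshold secret sharing. Illustrative only; the prime,
# parameters, and share format are assumptions, not Apple's implementation.
import random

PRIME = 2**127 - 1  # a large prime; all arithmetic is done modulo this field


def split_secret(secret: int, n_shares: int, threshold: int):
    """Split `secret` into n_shares points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, poly(x)) for x in range(1, n_shares + 1)]


def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0; recovers the secret only with >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


if __name__ == "__main__":
    voucher_key = 0x1234_5678_9ABC_DEF0          # stand-in for a voucher encryption key
    shares = split_secret(voucher_key, n_shares=10, threshold=3)
    print(reconstruct(shares[:3]) == voucher_key)  # True: 3 shares meet the threshold
    print(reconstruct(shares[:2]) == voucher_key)  # False: 2 shares reveal nothing useful
```

Note that changing the threshold after the fact is meaningless here: the polynomial was already chosen, so a new threshold only takes effect if the secret is re-split and new shares are issued, which is exactly the “re-generate and re-upload every voucher” scenario below.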

Of course, Apple could change the threshold, then push out an iOS update that forces your phone to re-generate and re-upload every security voucher.

In the same way, they could choose to ignore all this cryptographic pretense and simply upload everything to a government database without telling anyone. Or they could save themselves a lot of bad PR, change nothing, and simply grant governments secret backdoor access to all the photos they already have stored (without any encryption) on their servers.

If you’re so concerned about how Apple could change these algorithms in the future, why aren’t you even more concerned about how they could do all this, and much worse, with the data that is already stored on their servers without any protection whatsoever?

Ah, but in this case, no “backdoor” was ever necessary because all of the photos in question are already stored without encryption on Apple servers. Apple can and does grant law enforcement access to this data when presented with a warrant.

If you’re afraid Apple will choose to (or be forced to) become evil, they can do what you’re afraid of without any of these incredibly complicated cryptographic security algorithms.

It’s nothing like prior claims of being unable to extract a device’s storage encryption key from the Secure Enclave without knowing the device’s passcode.

Yes, there is a concern that Apple may change the software to start scanning and reporting images that aren’t uploaded to iCloud. To that I’ll just add that Apple has been doing on-device scanning for many, many years already. How do you think it automatically makes albums based on who is in each photo, and generates “Moments” from your library?

If you’re concerned about Apple abusing their CSAM-scanning technology, are you equally concerned about all of the other scanning that takes place on-device? Apple could just as easily subvert that code into government surveillance, but nobody has even mentioned that little bit.
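To make concrete what “scanning against a hash database” involves at a technical level, here’s a toy sketch in Python. It uses an ordinary average hash, not Apple’s NeuralHash, and the hash size, distance threshold, and local blocklist are illustrative assumptions; Apple’s published design also blinds the database so the device never learns whether a match occurred, a step this sketch deliberately omits.

```python
# Toy perceptual-hash matching. Illustrative only; NOT Apple's NeuralHash
# pipeline. Hash size, threshold, and the plain local blocklist are assumptions.
from PIL import Image

HASH_SIZE = 8  # 8x8 grid -> 64-bit fingerprint


def average_hash(path: str) -> int:
    """Downscale to grayscale 8x8 and set a bit for each pixel above the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_blocklist(path: str, blocklist: set, max_distance: int = 5) -> bool:
    """True if the photo's fingerprint is near any known hash in the blocklist."""
    h = average_hash(path)
    return any(hamming(h, known) <= max_distance for known in blocklist)
```

The point of the sketch is simply that “scanning” means computing a compact fingerprint of each image on-device and comparing it to a list of fingerprints, and that nothing about the fingerprinting step itself constrains what that list contains.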

Once the discussion goes beyond “how can this software be abused” into “what could it do in the future if Apple changes it”, then you’re calling into question whether any part of any operating system can ever be trusted. That’s a completely separate discussion whose validity has not been changed by any of Apple’s recent announcements.

2 Likes

Aren’t those albums and “moments” generated locally, on your device? Why do you think Apple downloads all the applicable data, processes it, then generates and pushes the albums and “moments”?

Any other specific examples you can provide of Apple doing “on-device scanning for many years,” where that scanning is not controlled and managed by local software, would be greatly appreciated.

Apple has made this clear for many, many years: there’s no expectation of privacy when you store your data in iCloud. They should have just started complete CSAM scans, but instead they are trying to monetize this new process under the guise of privacy, leaving their commitment to solving the big-picture problem open to question while reversing their privacy position circa 2016.

Please explain how on earth Apple is “trying to monetize this new process.” Whatever you think about how Apple has developed its CSAM detection, they have not mentioned charging users an extra fee for it. They will not be using it to harvest audience data to sell advertising, or to sell access to users. They will not be eliminating App Tracking Transparency. But they will be introducing Private Relay in Safari, which will make Apple devices much more private than VPNs. And the new Mail Privacy Protection will eliminate tracking pixels in Mail.

The sky is not falling, and there will be no revenue stream flowing from CSAM detection.

1 Like

I am concerned about that, but this thread is about the CSAM feature.

2 Likes

OK, you got me. I admit to some overblown rhetoric here; I have no specific knowledge of Apple’s inside machinations.

But the timing of the announced release of the two backdoors was interesting. I saw absolutely no news reported by the media about consumers demanding these “features,” but there has been plenty of press about the pressure being applied to Apple by the government (Congress) in general, pretty well across the board. It doesn’t take much imagination to envision Apple’s upper-level management sitting at the table with Congressional staffers, horse-trading future “capabilities” (backdoors) in exchange for some relaxation of legislation affecting the App Store, for example. And that’s how I would expect any additional revenue stream to flow.

If the sky is not falling, it is because enough people are concerned about Apple’s actions in this area that Apple has backed off to some extent.

1 Like

I am. It’s why I’m ready to turn off iCloud. It’s why I’m seriously considering switching to Linux. It’s why I’m looking into other phones.

If you can’t understand the difference between answering government warrants and going through your papers, calling up the government, and tipping them off that they might want to get a warrant for this guy, then I don’t think there’s much I can say to explain it to you.

Now that Apple has created a method to collect that type of information and report its users to the government, obviously the rest of the on-device scanning is concerning. It’s so far the opposite of privacy that every one of their other privacy mitigations becomes meaningless.

1 Like

Apple is refusing to do this scanning on their own servers and is instead forcing our devices to do it, with our processing power, using our electricity, and adding wear and tear to our batteries. This is clearly a large financial benefit to Apple at your expense and mine.

1 Like

Precisely. The point is that by adding this feature, they’ve tipped their hand. That’s why there’s such a big uproar.

1 Like

This is because complaints about child pornography initiated by parents, teachers, and other parties need to be reported to law enforcement, regulatory agencies, and nonprofits before action is considered. That’s why the National Center for Missing & Exploited Children, government agencies, etc. exist. The powers that be need to be involved before journalists can cover a story, or they could face libel action.

https://www.firstcoastnews.com/article/news/crime/terry-parker-high-school-teacher-arrested-for-distributing-child-porn/77-606107432

https://netsanity.net/apple-ios-12-parental-controls-getting-better/

https://www.govinfo.gov/content/pkg/CHRG-111shrg66284/html/CHRG-111shrg66284.htm

Given that Apple has said it is delaying the launch of these technologies, there’s no point in continuing to debate what may or may not have been true of the earlier announcement.