New CSAM Detection Details Emerge Following Craig Federighi Interview

I think that there is an enormous difference between Apple refusing to create a special version of iOS that would bypass security gates and expose the entire iCloud contents of two dead terrorists' iPhones, and this. Apple is setting up an opt-in service, not a special iOS, that will screen for the hash codes of pornographic photos that match those in a respected database of known pornography and that are being sent to the iPhones of living and vulnerable kids under the age of 13. They are two totally different scenarios and issues, and parents have the opportunity to implement it for their children if they think it's a good idea.

This is just about CSAM and giving parents the opportunity to communicate with their children about potentially dangerous situations; nothing else. Apple swears that no government agency is involved, and that the pornographic images of children that match are only those that reside in a highly respected database. This might be a backdoor, but China or 1984 it ain't.

I'm not sure why all the concern is about China, but given it keeps coming up…

Except that this system adds absolutely nothing to the Chinese government's monitoring capabilities. Why would they pass a list to Apple to eventually add to iOS so that certain images can be flagged and Apple can notify them… when they can just directly scan everyone's iCloud Photo Library on the servers they already have access to?


If they don't know about these dissidents, how are they going to somehow get lots of known pictures of them into the NCMEC and multiple other child safety databases, where the pictures are highly vetted to be known CSAM? Remember, NeuralHash only says "The hash of this iPhone photo that's being uploaded to iCloud Photos matches a hash in Apple's intersection-created hash database." It does NOT do machine-learning type facial recognition.
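To make the distinction concrete, here is a minimal sketch of the only operation this kind of matching performs: an exact membership test of a hash against a fixed set. It's my own illustration, not Apple's code; all hash values are made up, and the real on-device copy of the database is blinded so the device never sees entries in the clear.

```swift
import Foundation

// Hypothetical stand-ins for perceptual hashes; real NeuralHash values
// are opaque, and the on-device copy of the database is blinded.
let knownHashes: Set<String> = [
    "9c0d1f4a22b83e67",
    "4e77aa01d5c2b9f3",
]

// Answers exactly one question: is this exact hash in the set?
// There is no facial recognition and no semantic analysis of the photo.
func matchesKnownImage(_ photoHash: String) -> Bool {
    knownHashes.contains(photoHash)
}

print(matchesKnownImage("9c0d1f4a22b83e67"))  // true
print(matchesKnownImage("ffffffffffffffff"))  // false
```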

That's absolutely true and not the goal of the system. It is designed to identify quantities of known CSAM being uploaded to iCloud Photos.

That would fail the third-party audit of how Apple creates its database of intersections from multiple child safety databases. So to assume it's still possible is to go back to the state actor having complete power over Apple, at which point we have independent security researchers to identify abuses.

As with @neil1's suggestion above, that will fail the third-party audit of how the Apple database is created from the intersection, unless Apple is going to say that "Oh, this is from the Chinese NCMEC and it's legit; look, here's their root hash." But any independent organization doing the auditing would throw a flag on that.

And again, what's the surveillance win in exact matching of known images? What this technology tells you is if an iPhone user has and is uploading existing known images to iCloud Photos; that's all.

OK, that's fair, though pretty specific, and it would seem easily changed by the political movements. It's in essence a code, and whenever codes are known to be broken, the people using them create new ones.

It would if it were a radically new capability, but it doesn't seem any different than the iPhone in general. If China can compel Apple to do anything it wants, why aren't our iPhones already reporting everything we say and do back to Beijing? China may be able to compel Apple to do some things within the country, but clearly hasn't been able to make the company do whatever it wishes.


I'm not sure that's relevant, since the technology will be present in all versions of iOS. Presumably Apple has some way of enabling it for US users only, but in the extreme subversion scenario that is being suggested, the company could theoretically be compelled to enable it for another country too.

Personally, I'm unconvinced by the extreme subversion scenario. Aside from the fact that it seems like a really hard secret to keep, I think the US government would take a very dim view of US companies being compelled to participate in activities sponsored by another state actor. And I have to assume that Apple has sufficient ties to the US government that such communication would take place.


This is true, and I should have made it clear that I'll bet Apple has probably been talking to the equivalents of the National Center for Missing and Exploited Children in some of its target countries, probably the EU as well. But the big kerfuffle in the press might have thrown a monkey wrench into the works. And there might be countries that don't have an equivalent database.

It isn't the dissidents' images they're interested in… it's a list of people who have pictures that dissidents are likely to have… Tank Guy at the square, for instance, and other images they don't like. I did not suggest that they would get images of the dissidents or that they needed to get them into NCMEC's database. China can simply pass a law that requires Apple to scan against their database in addition to the NCMEC one… and also that the threshold be changed to 1 from whatever it is for the current plan… and that Apple report to Chinese authorities. They can also require that Apple scan every photo on the phone against the Chinese database as well… even if iCloud upload is disabled… and report the results.

Nation states can mandate just about anything… and since the technology to do this already exists, Apple would have no choice but to comply.

I'm going to try… again… well, for the 4th or 5th time actually… to not reply any more to this or related threads… as I said, nobody is going to change minds and Apple is gonna do what they are gonna do.


Apple doesn't have to, and I'll bet won't, implement this program in any country that will not meet its standards. Like I said, I'll bet there are countries that don't yet have an equivalent database.

I don't think putting the weight of effort on the dissidents rather than the authoritarian government is really the argument you want to make.

Also, and as politely as I can make it, part of the issue with this discussion is that one side has very little expertise on the situation in China and thus makes what I would call frankly silly comments about it. The idea that there aren't common images that China might hunt for made me shake my head in horror, and I'm only marginal (one book on historical China and various conference papers coming closer to the present) in my understanding of current China. The vast majority of the voices so far have been from the tech side, and the naïveté has been really remarkable so far (e.g., Doug Miller's post two below this one).

Because China has a set of priorities, budgetary limits, and resource allocation fights, like every other bureaucracy, and that limits them in what they can do. Dedicating Apple's resources and expertise in a way that makes up for those limitations is not a good idea. I said it earlier, but I'll repeat: don't hand the genocidal sociopaths a new weapon, please.

Again, the question is not why would they, but why wouldn't they? Ready-made surveillance, created by a company that has already caved to them, gives useful information about precisely the images they're already looking for. Why not?


Why hasn't China already passed such a law requiring raw uploads of every image on everyone's iPhones? Wouldn't that be a lot easier than relying on this technology?

It seems that China would gain a lot more, and have a lot more control over the result, if they did something like that. For example, if Apple couldn't "refuse" a special neural hash "request" from China, why couldn't they write a "bug" in the code that doesn't report hash matches at all for China's special hash? How would Chinese authorities know?

But they can already do this, as all files in iCloud Photo Library for Chinese users are on servers the Chinese government controls.

Stepping back, I really don't understand all this hand-wringing about China from the tech community. Has anyone who actually has experience of being targeted by China, or at least a foreign policy expert with actual knowledge of how the CCP carries out its surveillance, weighed in on this issue? All I've seen (from both sides, including myself) is people who have expertise in the tech world making guesses about what may or may not happen.

I've not seen anything indicating that people actually affected by Chinese surveillance see Apple's new system as a concern. It would be useful to understand abuses that this system could be used for, not hypotheticals that only seem realistic to people (like me!) who don't really understand the domain they would operate in.


With a little quick research, it appears that David is indeed far more knowledgeable in this area than I am, and probably more so than anyone else here. So I'll defer to his opinions in that regard.

But I do think that @jzw's question is still a good one. @silbey, what's your thinking on this?


Howard Oakley has posted another insightful article relevant to this topic.


First off, this assumes that dissidents are merely trading the same photos with each other and that they aren't constantly generating new content (which China would have to intercept and add to the database before this algorithm could detect it).

Second, it rests on a massive misunderstanding about how Apple's system works. The threshold isn't some arbitrary number that phones look at; it's an integral part of the cryptographic system. You can't change it without breaking the entire database, which means it can't be done without everybody else in the world finding out.
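For readers unfamiliar with threshold cryptography, here is a toy sketch of why a threshold can be a property of the math rather than a setting someone flips. It uses Shamir secret sharing over a tiny prime field; this is my own illustration under that assumption, not Apple's actual construction (which combines threshold secret sharing with private set intersection), and every number in it is made up for the demo.

```swift
import Foundation

let p = 2087  // tiny prime field, fine for a demo

// Modular inverse via the extended Euclidean algorithm (valid since p is prime).
func modInverse(_ a: Int, _ m: Int) -> Int {
    var (oldR, r) = (a % m, m)
    var (oldS, s) = (1, 0)
    while r != 0 {
        let q = oldR / r
        (oldR, r) = (r, oldR - q * r)
        (oldS, s) = (s, oldS - q * s)
    }
    return ((oldS % m) + m) % m
}

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial of degree threshold-1 whose constant term is the secret.
    var coeffs = [secret]
    for _ in 1..<threshold { coeffs.append(Int.random(in: 1..<p)) }
    return (1...count).map { x in
        var y = 0, xPow = 1
        for c in coeffs {
            y = (y + c * xPow) % p
            xPow = (xPow * x) % p
        }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = (num * (p - sj.x)) % p                  // (0 - x_j) mod p
            den = (den * ((si.x - sj.x + p) % p)) % p     // (x_i - x_j) mod p
        }
        secret = (secret + si.y * num % p * modInverse(den, p)) % p
    }
    return secret
}

// Think of each match as releasing one share of a per-account secret:
// below the threshold the shares reveal nothing, and the threshold itself
// is fixed by the degree of the polynomial, not by a config value.
let shares = makeShares(secret: 1234, threshold: 3, count: 5)
print(reconstruct(Array(shares.prefix(3))))  // 1234
print(reconstruct(Array(shares.prefix(2))))  // a useless, unrelated value
```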

You also assume that the matching can be done without Apple's servers. Again, read the technical description. The algorithms prevent any code (whether on the phone or the server) from learning anything about the content of the security vouchers without Apple's secret decryption keys.

Finally, you assume that this is somehow going to be easier or more useful than using China's existing surveillance software (which is constantly scanning live video from thousands of cameras nationwide) to scan through a database of unencrypted photos.

Additionally, the algorithm requires the server-side software to have Apple's secret encryption key. I doubt they will ever deploy this system to a country where someone else runs the server, since it would require divulging that key, undermining the entire system.

If they do deploy something in China, it will be a different system. But it makes no sense (for Apple or China) to do so. It is far more likely that China will order Apple to not enable E2E encryption for the Chinese iCloud databases, and they will do their own scanning with their own existing surveillance software.

But the tech for this has already existed for quite a long time. The system Apple developed is far, far more complicated than what would be necessary to satisfy a court order, especially since iCloud photo libraries are not encrypted.

Law enforcement already orders Apple to turn over the contents of suspects' iCloud backups and photo libraries, and Apple complies (when there is a warrant, of course). If the government chose to compel Apple to scan everybody's photos for any arbitrary image provided by the FBI, the tech to do so has already been present. So we are already trusting Apple's policy in this area.


Two things I've not seen much discussion of but which do concern me about these upcoming CSAM detection systems:

  1. What stops a government from requiring Apple to scan all photos on an iPhone, regardless of iCloud Photo Library? There seems to be no technical barrier; everything's in place, and it is only policy that keeps them from doing this. As clever as their convoluted encryption system is, it doesn't seem to protect against this kind of mission creep/abuse.

  2. I'm surprised there isn't more concern about the iMessage feature for children. The feature itself seems fine to me, but the principle of scanning messages before/after E2E encryption is dangerous. I can see the CIA and China being much more interested in this than the photo library stuff.

It seems that with all the focus on the list of hashes, these issues are being lost, and I think they have the potential to significantly undermine privacy.


These two concerns were always present, long before Apple's announcement.

Any app that has access to your photos can scan your on-device photos for anything it wants. This could be for benign reasons (e.g. Photos creating "Moments" for you to review) or hostile reasons. Your only safeguard is Apple's App Store policy.
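To make that concrete, here is a sketch of what such access looks like through the ordinary PhotoKit surface once the user grants permission. This is a generic illustration, not any particular app's code; the analysis step is a placeholder comment because that part is entirely up to the app.

```swift
import Photos
import UIKit

// Any app with Photos permission can enumerate the whole library and run
// arbitrary analysis on each image. Nothing below is special or private API.
PHPhotoLibrary.requestAuthorization { status in
    guard status == .authorized else { return }

    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    assets.enumerateObjects { asset, _, _ in
        PHImageManager.default().requestImage(
            for: asset,
            targetSize: CGSize(width: 360, height: 360),
            contentMode: .aspectFit,
            options: nil
        ) { image, _ in
            // An app could hash, classify, or upload `image` here.
            // Only App Store review policy constrains what it does.
        }
    }
}
```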

Apple could always (going all the way back to the very first iPhone) design and deploy new software and secretly push it to your phone. There is no technological mechanism that can prevent this, short of hacking the phone to permanently disable all capability for software updates. You have always had to trust Apple's intentions here.

Ditto for all other smartphones. Google, Samsung, Motorola, LG, and wireless carriers all push updates to their various phones and devices. When you use them, you have to trust them not to push objectionable software to your phone.

Ditto for your computer. Unless you disconnect from the Internet and refuse all updates, Apple (or Microsoft, or Dell, or Google) will push updates for their respective software. Although you can (usually) configure your computer to not automatically install updates, if you don't trust the company with that ability, then you probably shouldn't be trusting the disable switch either.

Even if you go open source, you can't eliminate that problem. Sure, the Linux community audits major packages all the time, but how many normal users actually know anything about the updates that they get from the standard Debian (or Red Hat or Ubuntu or whatever) distribution server? Again, you are trusting your distribution to do what they say they are doing.

In other words, unless you take all your electronics completely off-grid, you are always implicitly trusting someone with matters of privacy and security.

For most of us the answer to "how can I ensure that my system will remain secure and private against direct action from governments" is "forget it, kid". But it may still be useful to ask the question "who is most likely to give in and who is most likely to resist", knowing full well that everybody will fold if the pressure gets high enough.


To add to @Shamino's good summary, your phone is already scanning all photos and has been for years. That's how you can search for pictures of cats or oak trees: it has to scan all photos and analyze their content.
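For the curious, here is a minimal sketch of that kind of on-device analysis using Apple's Vision framework. VNClassifyImageRequest is a real API (iOS 13/macOS 10.15 and later), but the function name and the 0.5 confidence cutoff are just choices made for this demo.

```swift
import Foundation
import Vision

// Classify an image on-device, the same general technique that lets
// Photos answer searches like "cat" or "oak tree".
func labels(forImageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    // Keep only reasonably confident labels.
    return observations
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```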


Why would you explain it to her again? They are scanning content on my device. That is a huge step down the slippery slope, no matter what auditability they claim.

I don't see how auditability changes anything regarding the slippery slope. All they have to do is change their mind. Sure, we might be able to find out that they changed their mind and expanded what they are scanning and what they are scanning for, but if the government demands that they scan encrypted content on a billion people's phones, Apple cannot say "impossible" anymore. Apple just built an encryption back door into its devices. And it can change its capabilities at any time.

By the way, I don't know what auditable means, but being able to see whether the phone claims it is doing something is not auditing in my mind. And I didn't see anything else in their report that seemed like the ability for me to audit this system.


Surely the difference between providing on-device search vs. government reporting isn't meaningless to you.

The question was related to the government.

There is a reason people don't mind, and in fact prefer, on-device scanning for search, but not for government reporting.


I think the problem Apple is facing is that the most effective solutions to the problem of online child porn distributed by messaging services would have to involve scanning Messages. And the company that singlehandedly, repeatedly, and over decades dropped nuclear bombs on the digital media industry over privacy did not handle this news release effectively, and is now dealing with a s***storm, especially from the companies that are losing vast amounts of revenue because of Apple's recently upgraded anti-tracking initiatives.

If Apple had created a route that did not involve on-device scanning but would be much less effective, like Facebook's, it would have gotten criticized for not doing the same thing as everyone else sooner. Apple will not be scanning everyone's iOS devices; it is only matching images against the NCMEC database. And Apple has a unique and enviable history of not backing down when faced with government requests to break encryption.

Because new information has arisen, and I need to update my thinking to accommodate it. If an opinion is based on incomplete information, or an incorrect understanding of the information, it should be reevaluated with new details in mind.

Certainly. The point I was making is that "the iPhone is scanning your photos" has been true for a long time. A more precisely worded question might be "What prevents a government from requiring Apple to report matches for all photos on an iPhone, regardless of iCloud Photos?" And that then gets into the questions of what database of photos those matches would be against, and how Apple would create that database, and so on. It's quite different.

We put a lot of effort into using words precisely in TidBITS, and I feel that in the context of CSAM detection "scanning" is an ambiguous term that doesn't lead to informed understanding. In the context of Communication Safety in Messages, it's more accurate.
