Apple to install backdoors in iOS, macOS this year

I see people are already being misled by this.

The App Store and iTunes streaming supposedly both involve “human curation,” and we all know how well that guarantees 100% accuracy and quality.

But much more importantly, Apple claimed analysis would always happen on device and photos would never leave the device. If that’s true, the human at Apple supposedly vetting this will never get to see an actual image. What they will see is the result of a hash comparison. But that’s not human review. Human review is somebody looking at my pic of the Mona Lisa and determining, yes, that’s art, not kiddy porn. OTOH, if the person at Apple could somehow see an actual image (perhaps only those where the hashes appear to match), that means every once in a while harmless images will also end up going to Apple (I dare somebody to tell me the algorithm works so well that this will never occur), and that is of course a privacy violation. Either way around, Apple’s suggestion just doesn’t work.


Do child pornographers actually use Messages? It beggars belief that they would use anything other than tools like Telegram to communicate and share.

Anyway, I tend to agree with Simon that this all comes under ‘slippery slope’. So while my alarms go off, I also think that, slippery as the slope may be, slippage may not necessarily occur. I’d like to hear more from Apple about how this works and see if there are concrete limits on how far this can go.


Messages has end-to-end encryption, which is why many child pornographers have been using it. Encryption is why Apple will do the scanning on the child’s device.

Why aren’t they scanning the pervert’s device??

“The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.”


Because they need to identify criminal behavior and the criminal, and notify the authorities. Habeas Corpus.

“backdoor” is hyperbole. I think EFF is a great organization, but that doesn’t stop them from overstating things from time to time.

A backdoor gives unfettered access to much or all of an operating system to allow things such as keyloggers, stealth access to the cameras and microphones, detailed location data regardless of your settings, and collection of data that even you don’t have access to. Nation states and a few others can already do this by taking advantage of bugs, e.g. Pegasus.

Apple’s CSAM detection has a mechanism more akin to downloading Google’s Safe Browsing database so it can block you from inadvertently going to a site that might be hosting malware. I’m a little surprised that so many people on the net who claim to be security aware would be using iCloud Photos or iCloud backup anyway, because they’ve never been end-to-end encrypted. If the feds want to see your photos in iCloud Photos, all they need is a subpoena. [PhotoSync is a great way to back up your photos to your desktop or to many other destinations. Don’t send them to Dropbox though–they scan for CSAM…]
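The Safe Browsing analogy can be made concrete with a small sketch. This is an illustration only, not Google’s or Apple’s actual protocol; the `blocklist` contents and the `is_blocked` helper are invented stand-ins for a locally downloaded database that is consulted entirely on-device:

```python
import hashlib

# Hypothetical local blocklist of known-bad URL hashes, downloaded
# periodically -- loosely analogous to the Safe Browsing database.
blocklist = {
    hashlib.sha256(b"http://malware.example/payload").hexdigest(),
}

def is_blocked(url: str) -> bool:
    # The lookup happens entirely on-device: only a hash of the URL
    # is compared against the local database; nothing is sent anywhere.
    return hashlib.sha256(url.encode()).hexdigest() in blocklist

print(is_blocked("http://malware.example/payload"))  # True
print(is_blocked("https://tidbits.com"))             # False
```

The point of the analogy is that the matching itself is a local operation against a vendor-supplied database, which is quite different from giving anyone remote access to the device.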

There is no slippery slope here. Apple has always had the ability to use a software update to change how it accesses your data because they wrote the whole system. They make changes we do and don’t like all the time. At least for CSAM you can opt out of having your photos checked on device at all by opting out of iCloud Photos. Many other services don’t let you opt out of it at all.

Whether it’s a good/bad/indifferent idea for Apple to be doing this, or whether CSAM detection is effective enough to be worth the trouble, are different issues and I don’t know enough to have an opinion. It’s certainly brought them bad press. Of course the bad press that they really need to have aimed at them is that they reneged on their promise of end-to-end encryption for most iCloud services, but for some reason the press just shrugged ‘who cares’ at the far bigger sin.

Examining photos via messages with ML is a whole 'nuther thing unrelated to CSAM except by proximity of announcement. Since it’s completely opt-in by the parents, I fail to see why that would be alarming to most people. But I’m no longer a kid, and I’ve never been parental.


Of course there’s a slippery slope here. In fact, arguing that Apple hasn’t taken advantage of the ability to access your data until now, but now is, is the very definition of a slippery slope. “We could do it but won’t” to “we could do it and we will for this one completely justified case” is moving down the slope. Next up is “here are these other completely justified cases…” and so on.

“Opting out” is a generous way to describe this. Yes, if I don’t use a major feature of iOS that Apple pushes quite extensively, I can avoid this particular surveillance. I can also do that by turning the phone off and never using it, so I guess that’s opting out as well.

Does Google’s safe browsing database report people to the police at any point? No? Then I think they’re not really similar.


So much for Mr Cook’s spouting about privacy being in Apple’s DNA. A back door is a back door.

From now on, when any government says “ban this”, “report that”, or “decrypt that”, it will point to this ‘best of intentions’ feature and call balderdash when Apple says it can’t. Apple has just negated every “no can do” argument it’s ever made.

Mr Cook has successfully destroyed any illusion I harboured about him and Apple having any principles and any vestige of brand loyalty has just disappeared.

To all intents and purposes, from a government agency technical perspective an iMessage is now fundamentally no different to a Facebook post.


Gruber has just published his take on the subject with the title “Apple’s New ‘Child Safety’ Initiatives, and the Slippery Slope”. He provides a readable account of both the Messages app component and the iCloud Photo Library component. The article made clear a few points that many commentaries and facile takes have missed:

  1. The Messages component currently applies only to messages involving minors under Family Sharing plans.

  2. The photo component only involves photos in the iCloud Photo Library. Furthermore, the first step does not involve actually examining photos, but rather matching photo hashes against hashes of known bad photos. It does occasionally happen that a hash from an unrelated image might match one in the suspect database. However, further action requires multiple photos to match in this way. (I may have made a mess of this; go to the link for a clearer explanation.)

  3. The current limits on what is being checked are OK, but the possibility that the checking expands into more problematic areas is a potential future issue.

  4. It’s possible that setting up this system is a precursor to Apple providing full encryption for iCloud backups and photos.
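The matching-with-a-threshold idea from point 2 can be sketched roughly as follows. This is an illustration only, not Apple’s actual NeuralHash system; the hash values, the `perceptual_hash` stand-in, and the threshold are all invented for the example:

```python
import hashlib

# Illustrative sketch of threshold-based hash matching (NOT Apple's
# actual NeuralHash; the hash values and threshold here are invented).
KNOWN_BAD_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the database
MATCH_THRESHOLD = 2                          # multiple matches required

def perceptual_hash(photo: bytes) -> str:
    # Stand-in only: a real perceptual hash maps visually similar
    # images to the same digest; SHA-256 does not.
    return hashlib.sha256(photo).hexdigest()[:4]

def account_flagged(photo_hashes: list[str]) -> bool:
    # A single stray collision is not enough; human review is triggered
    # only once the number of matches reaches the threshold.
    matches = sum(1 for h in photo_hashes if h in KNOWN_BAD_HASHES)
    return matches >= MATCH_THRESHOLD

print(account_flagged(["a1b2", "zzzz"]))          # False: only one match
print(account_flagged(["a1b2", "c3d4", "zzzz"]))  # True: threshold reached
```

The threshold is what makes the occasional stray hash collision survivable: one false match does nothing, and only an accumulation of matches escalates to review.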


Further from Gruber:

"But the “if” in “if these features work as described and only as described” is the rub. That “if” is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future.

What happens, for example, if China demands that it provide its own database of image fingerprints for use with this system — a database that would likely include images related to political dissent? Tank man, say, or any of the remarkable litany of comparisons showing the striking resemblance of Xi Jinping to Winnie the Pooh.

This slippery-slope argument is a legitimate concern. Apple’s response is simply that they’ll refuse.

Mr. Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government. “We will inform them that we did not build the thing they’re thinking of,” he said."

Just like Apple “rejected” China’s demands that the personal data of Chinese customers must be stored on servers run by the Chinese state.


Speaking about Apple, Google and privacy, Google finally did kind of, sort of, maybe, don’t look too carefully, cave in the wake of Apple’s App Tracking Transparency initiative:

But Apple’s stringent requirements are yielding better results for iOS users:

“According to early data from the ad-measurement firm Branch Metrics Inc, seen by The Journal, less than 33% of iOS users have permitted apps to track them across other apps. The remaining 67% of iOS users opted not to permit apps to track their activity. As a result, the amount of advertiser spending on Apple’s mobile platform has fallen by about one-third between June 1 and July 1, while spending on Android rose over 10% for the same month, according to ad-measurement firm Tenjin Inc.”

Ad dollars are what Google needs to survive and prosper.

And I’m happy for Apple to hash away on anything uploaded to their servers…but hash on the server.

Hashing on device lets China demand that China control the hash database…and allows LEO to ask for more…and Apple can no longer say “we can’t do that”.


This is the core of the problem. The cat is now out of the bag. It doesn’t matter either if Apple now claims they will only run this in certain countries. And even if they now backtracked and didn’t roll it out at all, it would be too late.

The next time they are told to “activate” their system to check this or that, they won’t have a defense as in the past, where they could say “there is no such system” and just walk away.

Now the system exists and the whole world knows it. So eventually they will be strong-armed into using it to the benefit of authoritarian thugs. And if anybody seriously believes Apple will then bravely resist in the name of freedom or human rights or whatever, I suggest checking how Apple so far has stood up to the Chinese dictatorship. :joy:

I seriously hope Apple has learned its lesson after having essentially publicly burned iCloud Photos within about 50 hrs.


I appreciate this discussion and the diversity of concerns.

That being said, my wife (a clinical social worker) and I (a retired clergyperson) spent decades working with children and families impacted by abuse. I spent a number of years working in the red light district of a major city reaching out to young teens who had been exploited (and now properly termed “trafficked” and “enslaved”) into prostitution. We both have seen face to face the horror of what is done by vicious and dehumanizing people, made only worse in this digital age where the exploitation can continue on long after the child or young person’s situation has changed. On a more personal note, I had my life threatened and even my family threatened while I was working on the streets. So this whole topic is a very deep one.

Your well taken cautions are appreciated. I am simply saying let us not lose sight of the human cost that comes with the use of these photos. Technology obviously does not exist in a vacuum! We need to find ways to protect privacy and still root out those who would exploit and seriously damage our children and young people.

Sorry to go on but this is important to me/us, and to those we worked with.


Glenn Fleishman and Rich Mogull have written a detailed analysis of this.


I sympathize with the human cost…but the simple fact is that you either have privacy…or you don’t. The Constitution guarantees us the right to privacy…and like encryption this is pretty much a binary thing.

I hate kiddie porn and terrorists and druggies and criminals as much as the next guy…and I don’t have any of that stuff on my phone anyway…but as a retired military guy I’ve got a healthy respect and devotion to that constitution thing that pretty much trumps everything.


No, sadly, the Constitution does NOT give us a right to privacy. That said, I’ve seen multiple people insinuate that people are downplaying child sexual abuse, but absolutely no one in this debate is doing that.


The Supreme Court disagrees with you, specifically in Griswold v. Connecticut and succeeding cases. Though the Constitution does not explicitly talk about such a right, SCOTUS has decided that a number of other rights would not function without a right to privacy and that the 14th Amendment creates one.


I find that quite annoying too. Nobody gets to insinuate we don’t take child abuse seriously, just because we’re not willing to sacrifice our privacy in the name of “doing something”. This is not an either-or issue.