Apple to install backdoors in iOS, macOS this year

“According to the report, citing a source within the Ministry, Apple struck a deal with the government that will show users a prompt when first configuring a device in Russia to pre-install apps from a list of government-approved software. Users will have the ability to decline the installation of certain apps.”

This is as close as Apple has come to capitulating to Russia on its privacy principles, and it’s not a backdoor. If you can find any actual evidence that Apple has caved in, or plans to cave in, to Russia or China regarding a backdoor, please provide links. Apple has always had the ability to create backdoors for Russia, China, or any other government or entity, but it has not used it. I have found no hard evidence that Apple intends to do so for anything other than this US child protection initiative, and no evidence that Apple plans to allow backdoors for anything else.

And it appears some of Apple’s own employees, even those dealing with user security, harbor the same concerns.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.


To be fair, the article is pretty clear about how it’s not the security and privacy people at Apple who are saying this:

Though coming mainly from employees outside of lead security and privacy roles, the pushback marks a shift for a company where a strict code of secrecy around new products colors other aspects of the corporate culture.

Core security employees did not appear to be major complainants in the posts, and some of them said that they thought Apple’s solution was a reasonable response to pressure to crack down on illegal material.


You’re absolutely right, @ace. In my head I must have merged the original Reuters article with what MR wrote about it. I see they’ve now added a sentence to clarify.

Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Thanks for pointing that out. I’ve modified my post.


Joanna Stern of the Wall Street Journal has a good interview with Craig Federighi of Apple about the CSAM Detection and Communications Safety in Messages features. I strongly recommend that everyone watch it.


3 posts were merged into an existing topic: FAQ about Apple’s Expanded Protections for Children

And if it is that material, then Apple has violated US law by uploading the photo to itself for whatever verification is done before sending it to NCMEC…if the hash matches, then Apple is pretty sure it’s an illegal image…which is expressly prohibited from being sent anywhere but NCMEC…so sending it via Apple is illegal.


There isn’t anywhere else for Apple to move the manufacturing to. The labor cost, the lax regulations on overtime and all the OSHA kinds of things, and the billions of dollars in facilities don’t exist anyplace else. Yes…India and Thailand and others have some manufacturing capability for electronics…but nothing remotely resembling the scale that Apple needs.


My understanding is that the law requires them to report it once they see it – they have to upload it in order to see it, so as long as they report it to the authorities (and presumably delete it after locking the user’s account), there’s no violation.


Although a large percentage of Apple’s manufacturing happens in China, Apple has manufacturers and suppliers across the world. Here’s Apple’s report for 2020, and I was surprised at the wide variety of locations:


There’s an exception in the law for sending the material to NCMEC. There isn’t one for sending it to Apple.


From the page that was referenced, transmission to anyone other than NCMEC is illegal…so it is possible/probable/who knows how likely that law enforcement could interpret uploading to yourself a picture you highly suspect is illegal (since it passed your supposedly really smart NeuralHash voodoo algorithm) as transmission of said illegal material. The author of that page, who is apparently more familiar with this arena than I am, thinks it could be…and there is likely some DA out there who might decide to make a name for himself by carving a notch for Apple into his belt. Apple can argue the other way…but that’s what juries are for…and a persuasive argument can be made that being 99.9999% sure (or however many 9s it takes to get to one in a trillion) that the image is illegal constitutes “a reasonable person should have known” and “beyond a reasonable doubt.” I really don’t understand why they opened this can of worms.


If they had better PR people, or even better, a PR firm that is focused on child advocacy, it would have been like opening boxes of delicious desserts at frequent and regular intervals for years and years. Apple totally blew this introduction. I hope it won’t deter parents from signing up for their kids.

Going back to the human element in all this, NPR has posted an excellent article that covers the issue but also brings the important perspective of a family impacted by this whole issue. It is worth a glance:

Survivors Laud Apple’s New Tool To Spot Child Sex Abuse But The Backlash Is Growing by Bobby Allyn

We have two eyes, so keep one on all the security and PR issues and one on life-long victims. Both are important, and that is why I have appreciated all the discussion on all the threads here at TidBITS. I especially applaud the civility of most of the responses and the expertise represented here. My raising the human issue comes from my own expertise.


What I haven’t noticed in all of this sturm und drang is the hardware and software requirements for this. On-device scanning implies to me only those devices equipped with “neural network” chips, and probably will require iOS 15 or later.


It’s definitely iOS 15 and later.

I don’t think the CSAM NeuralHash requires much beyond what any phone running iOS 15 can provide, but the Communication Safety in Messages feature uses machine learning to scan for explicit images. I wonder if that will require a phone with an ML core, so iPhone XS and XR and later? Apple hasn’t said anything about that yet, though. Just that it’s coming with iOS 15.
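The CSAM matching flow Apple has described (hash each photo, compare against a database of known hashes, count matches toward a threshold) is computationally cheap, which is part of why any iOS 15 device can likely run it. Here’s a minimal sketch of that flow, with SHA-256 standing in for NeuralHash (a real perceptual hash tolerates resizing and recompression; SHA-256 does not) and a made-up database; the threshold value and function names are illustrative, not Apple’s actual implementation:

```python
import hashlib

# Hypothetical database of known-image hashes. In Apple's system the
# database is blinded and matching happens via private set intersection;
# a plain set lookup is used here only to show the overall shape.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image").hexdigest(),
}

# Apple has said a threshold on the order of 30 matches must be reached
# before any human review; the exact handling here is an assumption.
MATCH_THRESHOLD = 30


def scan_photo(photo_bytes: bytes, match_count: int) -> tuple[bool, int]:
    """Check one photo; return (threshold_reached, updated_match_count)."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        match_count += 1
    return match_count >= MATCH_THRESHOLD, match_count
```

The point of the threshold is that no single match reveals anything; only an account that accumulates many matches ever surfaces for review.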

Doubtful. It would completely undermine all of the child protection laws if the government started prosecuting people and businesses who are capturing images for the purpose of forwarding them to law enforcement.

If you were looking for a reason to convince all service providers to immediately stop scanning for CSAM, I don’t think you could come up with a stronger argument than this one.


WaPo op-ed by two Princeton researchers who have published a peer-reviewed paper on a CSAM-scanning system similar to Apple’s:


Governments aren’t always good at resisting short-term gain despite long-term damage. The fake vaccination campaign that the US ran in Pakistan to help find Osama bin Laden has caused vaccine resistance in those areas to this day.

A DA interested in running for the next office up might not be particularly thoughtful about problems caused after they moved on.


That ain’t no lie…considering there are currently bills in Congress to legally mandate law enforcement back doors and to eliminate Apple’s ability to run the sole store for iOS apps. The trouble is that the morons we have in Congress listen only to the FBI and others who complain about “going dark”…and while I sympathize with their wanting access, any halfway competent terrorist or criminal can easily use offline encryption or book page/word reference encryption schemes to frustrate law enforcement. In addition…our congress critters don’t understand, and the FBI ignores, the fact that any law enforcement back door would inevitably be released on the web for bad people to use.

Unless Apple maintains the only copy of governmentOS and every law enforcement agency sends their iPhones to Apple to crack (which isn’t what law enforcement wants; they want the OS so they can crack the device themselves), and Apple is allowed to charge law enforcement for the cost of developing, maintaining, and installing it and cracking the phone (which, again, law enforcement doesn’t want)…I’m quite sure that Sheriff Bubba Smith from Gos County WY will have good old boy deputies who just might sell governmentOS to criminals, or who might not have the requisite computer security skills to prevent somebody from stealing it.

Just like most corporations…short term gain at the expense of long term damage is plenty fine for the government…they’re happy to either have tax reductions or social program entitlements (depending on your political point of view) for political points today regardless of the economic impact on future generations.
