Apple to install backdoors in iOS, macOS this year

People in China want to buy iPhones and other Apple hardware and services.

And many of Apple’s products are manufactured in Chinese factories, creating a very large number of jobs there. Foxconn is probably assembling iPhone 13s, etc. as we type.

My guess is that it matters that TSMC, which now makes Apple’s chips and is building factories in the US, is Taiwanese, and Taiwan isn’t exactly China’s BFF. It could be that Apple is playing one off against the other; otherwise they could have partnered with a Chinese chip manufacturer.

I do concern myself with what happens to Chinese citizens - they are part of humanity on this planet, and we should all work for the welfare of everyone. We are not alone here, and I hope we’re a community that cares about everyone.

3 Likes

Apple was reluctant to place servers in China, but had to due to CCP law (which is capricious and can be changed at any time). Those servers hold only data from devices in China.

I’ve long held that only devices used in China should be manufactured there - but I guess that too much manufacturing know-how is located there, as well as suppliers in the iDevice supply chain.

Apple is trying to diversify away from mainland China, but the problem is that no one else offers the same fast turnaround for manufacturing changes (not to mention all those Apple-owned CNC machines based there).

Once Apple has installed backdoors into iPhones and Macs, we’ll eventually receive proof that we’ve essentially handed keys to China, Russia and who knows how many freelance black hats. None of those parties have our interests at heart, and they don’t care about our children.

1 Like

In the other thread @ace posted this very interesting link. It basically laid out that this 1-in-a-trillion claim of Apple’s is at best marketing BS. It’s definitely not an actual probability resulting from real technical considerations. But he does offer a guess for how they arrived at that big number. Spoiler alert, it’s a bit disingenuous.

  1. Apple claims that there is a “one in one trillion chance per year of incorrectly flagging a given account”. I’m calling bullshit on this.

Facebook is one of the biggest social media services. Back in 2013, they were receiving 350 million pictures per day. However, Facebook hasn’t released any more recent numbers, so I can only try to estimate. In 2020, FotoForensics received 931,466 pictures and submitted 523 reports to NCMEC; that’s 0.056%. During the same year, Facebook submitted 20,307,216 reports to NCMEC. If we assume that Facebook is reporting at the same rate as me, then that means Facebook received about 36 billion pictures in 2020. At that rate, it would take them about 30 years to receive 1 trillion pictures.
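
A rough sketch of that arithmetic in Python (the 523, 931,466, and 20,307,216 figures are the ones in the paragraph above; everything else is derived from them):

```python
# Back-of-the-envelope check of the Facebook picture-volume estimate.

fotoforensics_pictures = 931_466      # pictures FotoForensics received in 2020
fotoforensics_reports = 523           # reports it submitted to NCMEC in 2020
facebook_reports = 20_307_216         # reports Facebook submitted to NCMEC in 2020

report_rate = fotoforensics_reports / fotoforensics_pictures   # ~0.056%
facebook_pictures = facebook_reports / report_rate             # ~36 billion pictures
years_to_a_trillion = 1_000_000_000_000 / facebook_pictures    # ~28 years

print(f"{report_rate:.3%}  {facebook_pictures:.2e}  {years_to_a_trillion:.0f} years")
```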

According to all of the reports I’ve seen, Facebook has more accessible photos than Apple. Remember: Apple says that they do not have access to users’ photos on iCloud, so I do not believe that they have access to 1 trillion pictures for testing. So where else could they get 1 trillion pictures?

  • Randomly generated: Testing against randomly generated pictures is not realistic compared to photos by people.

  • Videos: Testing against frames from videos means lots of bias from visual similarity.

  • Web crawling: Scraping the web would work, but my web logs rarely show Apple’s bots doing scrapes. If they are doing this, then they are not harvesting at a fast enough rate to account for a trillion pictures.

  • Partnership: They could have some kind of partnership that provides the pictures. However, I haven’t seen any such announcements. And the cost for such a large license would probably show up in their annual shareholder’s report. (But I haven’t seen any disclosure like this.)

  • NCMEC: In NCMEC’s 2020 summary report, they state that they received 65.4 million files in 2020. NCMEC was founded in 1984. If we assume that they received the same number of files every year (a gross over-estimate), then that means they have around 2.5 billion files. I do not think that NCMEC has 1 trillion examples to share with Apple.

Perhaps Apple is basing their “1 in 1 trillion” estimate on the number of bits in their hash?

  • With cryptographic hashes (MD5, SHA1, etc.), we can use the number of bits to identify the likelihood of a collision. If the odds are “1 in 1 trillion”, then it means the algorithm has about 40 bits for the hash (see the short calculation after this list). However, counting the bit size for a hash does not work with perceptual hashes.

  • With perceptual hashes, the real question is how often do those specific attributes appear in a photo. This isn’t the same as looking at the number of bits in the hash. (Two different pictures of cars will have different perceptual hashes. Two different pictures of similar dogs taken at similar angles will have similar hashes. And two different pictures of white walls will be almost identical.)

  • With AI-driven perceptual hashes, including algorithms like Apple’s NeuralHash, you don’t even know the attributes so you cannot directly test the likelihood. The only real solution is to test by passing through a large number of visually different images. But as I mentioned, I don’t think Apple has access to 1 trillion pictures.
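
A minimal numeric check of the “about 40 bits” figure from the first bullet above; this is just collision arithmetic for an idealized uniformly distributed hash, not anything specific to NeuralHash:

```python
import math

# For an idealized uniform hash of n bits, the chance that a random input
# matches one specific target hash is 1 / 2**n. Solving 2**n = 1 trillion:
odds = 1_000_000_000_000
bits = math.log2(odds)
print(f"1 in {odds:,} corresponds to roughly {bits:.1f} hash bits")  # ~39.9

# As the bullets above note, this reasoning only holds for cryptographic
# hashes. Perceptual hashes deliberately map visually similar images to
# similar values, so their false-match rate cannot be read off the bit count.
```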

What is the real error rate? We don’t know. Apple doesn’t seem to know. And since they don’t know, they appear to have just thrown out a really big number. As far as I can tell, Apple’s claim of “1 in 1 trillion” is a baseless estimate. In this regard, Apple has provided misleading support for their algorithm and misleading accuracy rates.

https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

1 Like

And as the article @ace linked to (I quoted it above) laid out, that claim is bogus. The fact that Apple’s privacy head doubled down on a bogus claim is a bit sad. It’s one thing for marketing to say something; we all know those guys lie 99% of the time. But the privacy head is another issue. Not sure how much credibility this Erik Neuenschwander has left after a stunt like this.

3 Likes

Uh yes, which is why Apple remains particularly vulnerable to Chinese pressure. I appreciate you supporting my point.

2 Likes

Apple remains an important employer in China - and the CCP would never attempt to pressure Apple in terms of global policy, lest they lose this big employer.

Nothing would chase Apple manufacturing out of China faster than a CCP attempt to force changes in global policy - especially affecting global data.

That’s not the concern.

The concern is that if they find enough of the hashes they’re looking for, they will start looking at your images and report you to the cops. Perhaps we’re fine with that here in the States because child porn is horrible and child porn owners deserve the worst. But that’s beside the real concern too.

The actual worry is that now the Chinese dictatorship can give them more hashes to check against. It’s not about finding who has a certain pic or what that pic is. It’s about having enough of the wrong kind of pics on your device. It’s not just about finding a pic of the Tiananmen tank man on an iPhone; it’s about finding the people who have that image plus one of Uyghurs in a camp, another of Hong Kong protestors, and perhaps another of a subway car and drowning people. Voilà, 4 naughty images is one more than the threshold (forget the exact numbers; they’re irrelevant), and so those pics get sent to Apple, which then has to lock that user’s iCloud and report them to the Chinese thugs. Now they have the name and account of somebody who’s likely ‘subversive’.
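
To make the mechanics of that scenario concrete, here is a toy sketch of threshold-based hash matching. This is not Apple’s actual NeuralHash/private-set-intersection protocol, and the hash values and threshold below are invented; the point is only that the same matching code flags whatever the supplied database tells it to flag.

```python
# Toy illustration only: NOT Apple's real protocol. Hashes and threshold invented.

MATCH_THRESHOLD = 3  # illustrative; the exact number is irrelevant to the argument


def library_gets_flagged(photo_hashes: set, blocked_hashes: set) -> bool:
    """Return True when enough of a user's photo hashes appear in the database."""
    return len(photo_hashes & blocked_hashes) >= MATCH_THRESHOLD


# The code is identical whether the database came from NCMEC or from a regime:
csam_database = {"csam_hash_1", "csam_hash_2", "csam_hash_3"}
dissident_database = {"tank_man", "uyghur_camp", "hk_protest", "subway_car"}

user_photos = {"tank_man", "uyghur_camp", "hk_protest", "subway_car", "birthday"}

print(library_gets_flagged(user_photos, csam_database))       # False
print(library_gets_flagged(user_photos, dissident_database))  # True -> account flagged
```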

In the past, if the Chinese had demanded something like this of Apple, they could have simply said they don’t have that type of scanning tech, so sorry, but they can’t do that (their defense in San Bernardino). But now they have published for the whole world to see that they do have this tech. So the Chinese can turn scanning against their hashes into a “local law” that Apple then, of their own accord, will follow. You want to tell me they will refuse to follow that law and exit China to make a point? Of course they won’t. 95% of their production is there.

This all happened because Apple built tech they shouldn’t have. But now they did. And they advertised it to the whole world. Now they can be blackmailed and coerced. They can never stuff that cat back into the bag.

4 Likes

That’s certainly what Apple marketing wants you to believe. But as pointed out above, that probability figure is baloney.

And let’s note here that every time Apple looks at an image and determines it’s not CSAM, they have just violated somebody’s privacy. That’s why they need to bamboozle folks with such a big figure. If they cannot guarantee that there are no false positives (which they likely can’t), they are admitting that they will be invading innocent people’s privacy, the only crime being that those people used iCloud Photos. Now sure, you can tell people to get around that by simply not using iCloud Photos, and that’s fine. But you will surely agree that’s not the message Apple wants to see spread. Hence the BS odds figure. Doesn’t change the BS part though.
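
One way to see why the unknown error rate matters: the expected number of innocent accounts whose photos get looked at scales linearly with whatever the real per-account false-positive rate turns out to be. A minimal sketch, where the account count and the candidate rates are illustrative assumptions, not published figures:

```python
# Expected wrongly flagged accounts per year = accounts * per-account false-positive rate.
# The account count and rates below are assumptions for illustration only.

icloud_photo_accounts = 500_000_000

for per_account_rate in (1e-12, 1e-9, 1e-6):
    expected = icloud_photo_accounts * per_account_rate
    print(f"rate {per_account_rate:.0e}: about {expected:g} accounts/year")
# 1e-12 -> 0.0005, 1e-09 -> 0.5, 1e-06 -> 500
```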

1 Like

"Apple remains an important employer in China - and the CCP would never attempt to pressure Apple in terms of global policy, lest they lose this big employer.

“Nothing would chase Apple manufacturing out of China faster than a CCP attempt to force changes in global policy - especially affecting global data.”

Folks believing the above are likely to be very disappointed when they learn what Cook’s and Xi’s real motivations are, and what they’re capable of doing.

And what is “global data” anyway? And where is it stored? And who’s in charge of “global policy”?

3 Likes

They’ve already done pretty much exactly that, by insisting that Apple store Chinese iCloud data on Chinese servers and have the keys controlled by a government-owned firm. Apple caved. So not only is my point not ludicrous or insane, it’s already happened.

To think that they won’t continue to pressure Apple to do what they want, not just with their own citizens but on things that work globally, is wildly naive.

1984 this ain’t. Apple has announced that, so far, it will only be implementing the image testing in the US. They have not given any information about whether or when, if ever, they will do image testing in China or anywhere else. And since China’s iCloud data already stays in China, it’s possible the government doesn’t need or want Apple to screen for child porn at all; they can do it themselves. And the app and algorithms are only about child porn; there is no indication that Apple will check images for anything else. Apple has said they will never use this technique for any other purpose, and they’ve stood their ground about privacy vs. government before.

Again, there is currently no indication whatsoever that Apple has plans to implement a hash system like this in China, Afghanistan, or anywhere else to date.

In 2016, Tim Cook said that installing a back door on the iPhone was like “inserting a cancer.” Now, in 2021, Apple is all-in on installing back doors (and not just on iPhones). They have lost their credibility in this area.

3 Likes

Messages will screen for any type of “inappropriate material,” not just CSAM, unless parents opt out.

What people fail to understand here is that it’s not about what Apple says they will do or where they release a feature.

The point is that they have demonstrated they can do it: they have the tech, and they are implementing it somewhere.

That means entities like the Chinese junta or autocrats like Putin can now force Apple to use this tech (which they have openly admitted to having) in their own countries and against databases they choose.

I don’t believe Tim or Apple is lying about their intentions, but I also don’t believe that when Apple is told by Xi Jinping to do this for him, they will say nope and just pack up and leave. They can’t. They are in a far too exposed position, and that means they will be subjected to coercion.

The only way to insure yourself against that kind of threat is to not develop the tech in the first place, so that you’re left with a San Bernardino defense. But now they’ve blown that option. And that’s really an inexcusable strategic mistake on their part. :roll_eyes:

4 Likes

You’re absolutely right. Hopefully we don’t look back at this week as the beginning of the end of Apple.

1 Like

Agreed it’s not a backdoor. A backdoor allows someone other than the device’s owner to log in without permission, and/or to view decrypted content, again without permission. It’s, as you say, a documented feature. That doesn’t mean it’s desirable, but it’s certainly not a backdoor. I’m surprised the EFF used that term in this case.

4 Likes

“According to the report, citing a source within the Ministry, Apple struck a deal with the government that will show users a prompt when first configuring a device in Russia to pre-install apps from a list of government-approved software. Users will have the ability to decline the installation of certain apps.”

This is as close as Apple has come to compromising its privacy standards in Russia, and it’s not a backdoor. If you can find any actual evidence or proof that Apple has caved in, or plans to cave in, to Russia or China regarding a backdoor, please provide links. Apple has always had the ability to create backdoors for Russia or China or any other government or entity, but they have not done so. I have not found any hard evidence that Apple intends to build one for anything other than this US child protection initiative, and no evidence that Apple has plans to allow, or will allow, backdoors for anything else.