Apple to install backdoors in iOS, macOS this year

It inspired less loss of trust, sure. If they’d tried to do this secretly and it came out, I would have been much more distrustful. What would have kept my trust level at the same height would be if they didn’t try to do this at all.

We’ve sort of gotten to that point in internet threads where people are mostly telling other people how they should be reacting to the news. I’m not sure how useful that is.

4 Likes

Ben Thompson’s take here:

3 Likes

Time to start posting articles about alternatives, I think.

Messaging is easy. There are numerous alternatives in that space.

But photos? Automatic uploading? Apple has all but removed the ability to keep photos on your device in much the same way that they’ve done so for music. What’s the alternative? And how do I convert?

1 Like

I’m beginning to think a remedial reading course should be a requirement for posting on the internet. The hysteria being displayed by everyone is quite insane.

Here’s what I read on the subject:

Software on your iPhone will create hashes for photos being uploaded to iCloud photo storage. Those hashes will be compared against a database of hashes of known, widely distributed illegal images reported to the NCMEC (National Center for Missing and Exploited Children).

If a sufficient number of hashes match, the matching photos will be decrypted and examined by a human. If those photos do indeed contain CSAM (Child Sexual Abuse Material), the account will be frozen and the matter will be reported to the NCMEC.
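For those who find code clearer than prose, here’s a minimal sketch of that “match against a list of known hashes, act only past a threshold” idea. Everything in it is a hypothetical stand-in: Apple’s real system uses a perceptual NeuralHash and cryptographic safety vouchers, not a plain SHA-256 lookup, and the threshold value and database entries below are made up.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-ins: the real database, threshold, and hashing scheme
// are not public in this form.
let knownHashes: Set<String> = ["hash-of-known-image-1", "hash-of-known-image-2"]  // placeholder entries
let reviewThreshold = 30  // made-up number, only to show the idea of a threshold

/// Stand-in fingerprint: a SHA-256 digest rendered as hex.
/// (Apple's NeuralHash is a perceptual hash, not a cryptographic one.)
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Count how many photos queued for iCloud upload match the known-hash list.
func matchCount(for photosToUpload: [Data]) -> Int {
    photosToUpload.filter { knownHashes.contains(fingerprint(of: $0)) }.count
}

/// Nothing is surfaced for human review until the per-account match count
/// crosses the threshold.
func needsHumanReview(_ photosToUpload: [Data]) -> Bool {
    matchCount(for: photosToUpload) >= reviewThreshold
}
```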

Nowhere does it say it will be scrubbing all images on your device - it will only be checking that files being uploaded to iCloud photo storage don’t match fingerprints of widely distributed, known CSAM. Google, Microsoft, and other organizations hosting image data already do this under US law - Apple’s just catching up on iCloud photo storage. They didn’t even say whether they would be doing this for photos already in iCloud photo storage, though to be in compliance they may have to do that too.

The whole purpose of this exercise is to ensure that pedophiles aren’t passing around photos and brazenly putting them in their iDevice photo libraries and uploading them to iCloud photo storage.

This is not image scanning - so your sexts or shared adult photos won’t be affected. It is only being done for known, widely distributed CSAM reported to the NCMEC, and it uses only file-level image fingerprinting.

We all have secrets to hide, and deserve our privacy - but we also have to realize that some things go beyond the pale and should be stopped.

If I were Apple, I’d want to make sure that none of that trash sat on my servers too.

This is separate from other initiatives protecting your under-13 children from possible adult content - those do involve scanning inbound or outbound images using AI and notifying the parent.

Yes, Apple could be forced to do things beyond the scope of the current initiatives, but I’m sure Apple would fight those changes with every lawyer they could muster.

4 Likes

You mean for those people who are unwilling or unable to use their own computer for storing their music and photos?

My music collection is several dozen GB of files on my Mac, mostly ripped from CDs, and some purchased from Apple, Amazon and other sources.

My photos are similarly all stored on my Mac, via the Photos app.

Neither of the above are sync’ed to any cloud services.

I transfer photos from my phone and camera via USB - the phone using a Lightning cable, the camera using a USB SD card reader. The Photos app on the Mac does the importing in either case.

I transfer photos to my phone using the Finder’s sync window. I have it configured to sync specific albums and photos taken in the last 6 months. Again, the images transfer over USB when I do my regular backup/sync.

I use the same mechanism to sync photos and my music library to an iPod Touch.

I don’t understand why you consider this so difficult. Your computer (whether Mac or Windows) can easily act as the “hub” that all your devices sync against. No need to use Apple’s servers for any of this if you don’t want to.

2 Likes

The fact that a human can decrypt and examine your photo is the bit that has folks concerned. That’s new. In the past, they couldn’t.

And in the future, anyone for any reason could “decrypt” anything they hold. This is the opposite of end-to-end encryption. It’s more akin to a sign on the door saying “private” than it is to any sort of lock or security measure.

1 Like

I’ve been looking into this even before this announcement, and I’m planning to write more about it, but the short answer is some kind of Network Attached Storage + the PhotoSync app.

3 Likes

Actually, the new system is more restrictive than before.

There is no “in the past, they couldn’t” here, because your iCloud photo library was never end-to-end encrypted. Apple could very easily have just started scanning it without even telling you, but they chose not to.

Their system, which generates hashes on-device and then uses an algorithm so Apple doesn’t even see the raw hashes until a significant number of hits occur (at which point they go take a look), is clearly designed to work with some future system where the iCloud Photos library actually is E2E encrypted, since it’s far more complicated than necessary for today’s system.

I used to. I can’t anymore. Apple has deliberately broken most of those processes in recent years. Apple software simply can’t do it anymore. Not for music I ripped myself. Not for music I created myself. Not for photos taken on iOS. And not for the stuff I perhaps foolishly allowed Apple to store in the past with their past cloud music offerings.

I can do that with most of my non-Apple still and video cameras. And I can share using Google Drive (something I can’t do with Apple’s offerings).

If I don’t let Apple upload my iOS photos, I have no way of accessing them short of emailing myself copies. And that’s untenable. I haven’t been able to USB sync any phone to any Mac since they eliminated iTunes.

If you need help with this, please start a new topic. It’s really not a big deal.

USB and Wi-Fi sync is now done by the Finder. Music, Photos and Books are used to manage and organize your library, and the Finder syncs them.

1 Like

I agree that having the photos be decrypted and human-viewable opens privacy concerns…

…but can you think of another reasonable, effective way to check against false positives?

By definition, hashes are not unique: a hash maps arbitrarily large inputs to a small, fixed-size output, so different inputs can produce the same hash. If hashes were unique, they could be logged one-to-one and matched back to the original data. This means that a matching hash is not a guarantee of matching data.
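The counting argument behind that, made explicit (a general fact about fixed-length hashes, nothing specific to Apple’s scheme): a b-bit hash can take at most 2^b values, while the number of possible images is unbounded, so some distinct images must share a hash.

$$2^{b} = \#\{\text{possible } b\text{-bit hashes}\} \;<\; \#\{\text{possible images}\} \;\Longrightarrow\; \exists\, x \neq y \text{ with } h(x) = h(y)$$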

Which would you rather have:

  1. having your account locked and reported for CSAM trafficking because of a coincidentally matching hash; or,

  2. having a small potential privacy risk enabled in order to prevent innocent people from being accused of being pedophiles?

There is no immediately available option 3. Federal law requires Apple to take steps to limit the potential spread of CSAM. There is currently no machine algorithm that can reliably detect false positive matches with the accuracy of a human eye.

Yes, the future potential implications of option 2 are not insignificant. But they are things that can be prepared for and guarded against as technology advances and awareness increases. The personal risks of option 1 are real, current, and devastating for the person falsely accused.

Don’t go all in on fighting privacy risks without considering the ramifications of the alternatives.

(Personally, I think the real problem here is the ham-handed, technologically-challenged law, but that’s not something that can be changed quickly.)

1 Like

Apple has confirmed that all images on iCloud will be checked over time: Apple confirms CSAM checks will be carried out on photos already in iCloud | iMore

Since technically all of your photos in the Photos app are in iCloud if you use iCloud Photos, all of the photos on your device will be scanned (and of course we already know that if you don’t have iCloud Photos turned on, none of your photos will be scanned). Apple hasn’t said, though, that the scans will happen on devices for photos already in iCloud Photos - it may be that Apple will scan those on its servers over time.

Apple has said that human review will only happen if there are a sufficient number of hash matches, and I thought I read that the threshold means there is approximately a one in one trillion chance that false positives would result in an account being flagged.

The one in a trillion chance is per account - I’m sure the false match rate is pretty low, and the fact that you need multiple matches per account to raise an alert vastly decreases the probability of a false alert.

If the false positive rate were 1 in 10,000 and you needed 3 hits, the chance of three false hits would be 1 in a trillion.
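Spelling out that arithmetic (the 1-in-10,000 rate is purely illustrative, not a figure Apple has published, and it assumes the matches are independent):

$$\left(\frac{1}{10{,}000}\right)^{3} = 10^{-12} = \frac{1}{1\text{ trillion}}$$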

1 Like

Which law? And why is Apple affected but other providers are not?

The obvious option 3 for “Apple” would be to exit the market. That’s pretty trivial: spin off a portion of the company in, say, Finland, and then subcontract back to them. Or even just let customers buy in separately.

Convince a judge to issue a search warrant when reasonable suspicion exists that a crime has been committed.

And when China requires them to add hashes to the database? Apple has already caved a number of times under Chinese pressure.

But more generally, I understood what Apple was doing before you summarized it and was and am still concerned. I don’t believe that’s an insane reaction.

2 Likes

According to Apple privacy head Erik Neuenschwander’s interview on TechCrunch, the “one in a trillion” applies across all accounts:

And so the threshold allows us to reach that point where we expect a false reporting rate for review of one in 1 trillion accounts per year

The way I read this: if 1 trillion iCloud accounts were examined each year, there’d be one false positive. Those are incredibly low odds.

1 Like

Again from Apple Privacy head Erik Neuenschwander’s TechCrunch interview:

But even in the case where we’re talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

Apple has specifically designed this system to be difficult to use for anything other than CSAM scanning.

If China or someone else wanted to, it would be much easier to just force Apple to scan iCloud photos server-side for whatever they sought – vastly more powerful, no threshold setting, much easier to implement and change, and users would never be aware of it.

2 Likes

Apple Privacy head Erik Neuenschwander in his TechCrunch interview (emphasis mine):

The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos. It’s those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.

This says to me that the whole reason Apple designed this system was to avoid having to do server-side scanning, which is far more invasive of privacy and opens Apple up to being forced to search for lots more material besides CSAM.

It’s also important to remember that this on-device “scanning” isn’t actually looking at the content of your photos (i.e. there’s no facial or object recognition being used). It’s simply computing a unique “fingerprint” (a hash) for each photo.
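As a rough illustration of what a fingerprint looks like (with the caveat that Apple’s NeuralHash is a perceptual hash designed to survive resizing and recompression, whereas the SHA-256 used here is a cryptographic hash chosen only because it’s built into Swift’s CryptoKit):

```swift
import Foundation
import CryptoKit

// Hypothetical file path, purely for illustration.
let photoData = try! Data(contentsOf: URL(fileURLWithPath: "/tmp/example.jpg"))

// The fingerprint is a short, fixed-size value. It can be compared for
// equality with other fingerprints, but it reveals nothing about what the
// photo actually depicts.
let fingerprint = SHA256.hash(data: photoData)
    .map { String(format: "%02x", $0) }
    .joined()

print(fingerprint)  // 64 hex characters, the same shape for every photo
```

Two byte-identical copies of a photo produce the same fingerprint; the point of Apple using a perceptual hash instead of a cryptographic one like this is that resized or recompressed copies of the same known image still match.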

I think so many people are freaked out about this because they hear Apple will be “scanning” their photo libraries and that’s not what’s going on at all.

I still like the metaphor I came up with earlier:

A more apt comparison might be checking the serial number of the handgun you’re selling against a database of guns used in crimes (where serial number = image hash, database = CSAM database, and selling = uploading an image to iCloud).

1 Like