Apple to install backdoors in iOS, macOS this year

While there are major articles posted to both the NYT and WaPo, this EFF article not only provides more technical details but isn’t behind a paywall:

This appears to be a huge philosophical shift by Apple with regard to user privacy (or the risk to it). A political reaction intended to blunt potential anti-tech legislation?

2 Likes

The Times article

The WaPo article

For those with subs.

1 Like

Another article in Ars Technica, with over 400 user comments so far…

1 Like

I’m not sure I agree with the use of “backdoor” here. This sounds like a documented feature that runs locally and sends a message when it finds something. It does not provide any additional way to access the phone or its contents, or to decrypt any encrypted files outside of the context of having an authenticated user on the phone. (That is, the only context in which the files are decrypted is the same context in which the user accesses the files.)

Honestly, what it sounds like to me is an attempt to avoid political pressure to require an actual back door that would allow the encryption to be broken centrally, or devices to be accessed remotely or without the appropriate passwords/authentication.

Dave

7 Likes

To me this falls under slippery slope. In the past they could truthfully tell a court (and they actually did so in the San Bernardino shooter case) that they can’t break in because they do not have such a backdoor. But now they are creating this backdoor. They can be asked to tweak a few ML parameters or to change the database being checked against. In code these could be very small tweaks. They won’t be able to say that technically it’s not available. They made the framework. It’s a question of time until they are pressured to “adjust” it. And once they adjust it for China, I bet others will want in on the action too. Sure, some will be in markets that Apple can exit, others (like the US) in markets they cannot. Tada.

I’m also generally wary of big tech companies like Apple and Google using their fancy tech (especially when they throw around fashionable buzzwords like AI and ML) to “scan” for content and then use that to report behavior to LE. Nobody should find kiddy porn acceptable. But there is a big difference between a human judging that something is unacceptable and then calling the cops, vs. an algorithm that does so and then calls the cops and locks up the iCloud account (which Apple has already said it will do). Sure, they will tell us their ML is super accurate and results are human vetted and yada yada, but my experience tells me that’s not something I would bet my life on.

Doing proper human inspection on a massive scale costs massive money, and these companies didn’t get to where they are by wasting money. Just try talking to a human there or getting them to reverse an automated decision: it’s incredibly difficult, as we’ve read in countless stories where people’s businesses have suffered dearly because of some kind of automated decision. And now we’re talking about shutting down people’s entire digital lives and bringing in the cops and the DA on child porn charges? Heck no. I don’t trust that these companies are willing to spend the money it would actually take to do this right. They’re horny for the immediate PR benefits of “being tough on kiddy porn”; they’re not looking to spend billions just to help out LE. If that were truly the case, a simple start would be to just pay their taxes instead of trying to evade them.

Good thing I don’t use iCloud Photos. My concern is what will be next.

2 Likes

Yep…I’m a big believer in privacy…most retired military folk have a fondness for that constitutional rights thing…and this is a bad precedent. You’re right…once the capability exists in the OS, it will be exploited by government to bypass encryption…and more importantly, if there is a way for Apple to get in, other bad actors will get in as well…just like a ‘law enforcement only’ back door would get hacked.

There’s nothing illegal on my device….but if the cop lights up his blue lights behind me I’m doing the disable FaceID thing to my iPhone and absent a search warrant or being arrested I’m not giving permission to search…it is a principle thing for me.

Unfortunately…the left and law enforcement will keep trying to weaken or bypass encryption…and while I feel for them, and there is some validity to the ‘law enforcement is blind’ argument, there are numerous ways for terrorists or whoever to encrypt communications offline, and any halfway smart bad guys are probably already doing so. However…the fact remains that either you have encryption or you don’t…no such thing as a good-guys-only back door.

3 Likes

“Back door” is hyperbole, and you should examine the motives of anyone who uses that hysterical term to describe this. But this is certainly a surprising move. Though less so when you consider that other companies routinely do the same thing in a less private way. I, for one, didn’t know that Dropbox and OneDrive are scanning for child p0rn. At least give Apple credit for the transparency and technical protections.

1 Like

The EFF clearly disagrees.

2 Likes

Across the globe, child abuse and pornography online are horrendously big problems that continue to grow. Apple is doing something that will hopefully help address what is a big issue of concern for a significant % of its customers by screening predators out. What surprises me is that the people who are whining about this on Ars Technica, etc., seem to be the same ones who were whining that Fortnite and Spotify and other developers should be able to bypass the App Store and its privacy and security policies so they could make extra bucks. Quality control has been a hallmark of Apple services and products, and I think that this fits under that umbrella.

My question…is Apple implementing this across the globe?

Apple has itself claimed it will lock up iCloud and call in LE if content they deem illegal is detected. That is far beyond just “sending a message”.

The backdoor is that
a) What is being checked against is a remote database. Now they say it’s kiddy porn, but what happens when China tells them to also scan against images that contain banners with anti-government slogans? The check happens locally, but the database the scans run against is supplied remotely, and Apple can (and will) be pressured over that database, since they hold it (see the sketch below).
b) You trust that their automated scan is entirely accurate and only cries kiddy porn when the image in question is actually kiddy porn. But what happens when their algorithm cries kiddy porn over a perfectly legitimate pic? (Remember when Google censored images of a famous painting because they thought it was some kind of nipple porn?) In such a case, if they just sent you an email, that would be one thing, but what they say they will do is lock up your entire digital life and call the cops.
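To make point a) concrete, here’s a minimal sketch of why the database is the pressure point. Everything here is hypothetical (Apple’s actual matcher is not public); the point is simply that the on-device code is generic, and only the remotely supplied fingerprint list determines what gets flagged:

```swift
import Foundation

// Hypothetical sketch: the on-device matcher flags whatever fingerprints the
// server-supplied database contains. Swapping CSAM hashes for, say, hashes of
// banned protest imagery would require no change to this code at all.
struct FingerprintDatabase {
    let entries: Set<String>   // delivered and updated remotely
}

func scan(_ imageHashes: [String], against db: FingerprintDatabase) -> [String] {
    // The device can't tell what the entries *mean*; it only knows they match.
    imageHashes.filter { db.entries.contains($0) }
}

let remoteDB = FingerprintDatabase(entries: ["hash-of-known-image-A", "hash-of-known-image-B"])
print(scan(["hash-of-vacation-photo", "hash-of-known-image-B"], against: remoteDB))
// ["hash-of-known-image-B"]
```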

2 Likes

My understanding is that for child porn, the checks are for checksums/hashes that identify specific, known illegal image files. And there seems to be a threshold involved: while by chance you might have one or two files whose hash matches a known illegal image, the odds are steeply against anyone having a dozen or more files that just happen to have fingerprints matching known illegal files. So it’s not a process that could do things like detect lettering on a banner; it would only capture files that are bit-perfect copies of known illegal images.
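For illustration, a minimal sketch of that threshold logic (hypothetical names and numbers; reports suggest Apple actually uses a perceptual “NeuralHash” rather than exact checksums, so this is only the shape of the idea):

```swift
// Hypothetical fingerprints of known illegal images (stand-ins, not real hashes).
let knownHashes: Set<String> = ["fp-001", "fp-002", "fp-003"]

// Number of matches required before anything is flagged for review.
// Apple has described a threshold; the exact value here is made up.
let reportingThreshold = 30

func matchCount(in deviceHashes: [String]) -> Int {
    // Count how many of the device's image fingerprints appear in the database.
    deviceHashes.filter { knownHashes.contains($0) }.count
}

let matches = matchCount(in: ["fp-002", "fp-xyz", "fp-123"])
if matches >= reportingThreshold {
    print("Threshold reached (\(matches) matches): flag account for human review")
} else {
    print("Below threshold (\(matches) of \(reportingThreshold)): nothing is reported")
}
```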

I would also assume that Apple might have substantial legal liability for inaccurate reporting to the authorities. (Think about how many other providers currently check against these databases, and the fact that there have been, as far as I can tell, essentially no reports of people being falsely accused to law enforcement because innocent images were mistakenly fingerprinted as illegal ones.)

The actual content-based detection (as opposed to digital fingerprinting) seems focused on the messaging features, and that reports issues to the child’s parents rather than to the authorities.

Having a feature like this built into the OS sounds a lot better to me than having a back door that would allow arbitrary outside access to my devices and data, which is what law enforcement is really after.

Dave

4 Likes

From the EFF article:

" We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content."

2 Likes

“As part of its Expanded Protections for Children, Apple plans to scan images on iPhones and other devices before they are uploaded to iCloud. If it finds an image that matches one in the database of the National Center for Missing and Exploited Children (NCMEC), a human at Apple will review the image to confirm whether it contains child pornography. If it’s confirmed, NCMEC will be notified and the user’s account will be disabled.”
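Read as a process, the quoted flow has three gates: database match, human review, then report and account disabling. A minimal sketch of that flow (hypothetical types; the actual pipeline is Apple-internal and not public):

```swift
// Hypothetical sketch of the quoted flow: database match → human review → report.
enum ScanOutcome {
    case noMatch            // image not in the NCMEC database: nothing happens
    case falsePositive      // matched, but a human reviewer rejected it: no report
    case confirmed          // reviewer confirmed: NCMEC notified, account disabled
}

func process(matchesDatabase: Bool, reviewerConfirms: Bool) -> ScanOutcome {
    guard matchesDatabase else { return .noMatch }
    return reviewerConfirms ? .confirmed : .falsePositive
}

// The human review step is the only thing standing between a bad hash match
// and a disabled account, which is exactly the point of contention upthread.
print(process(matchesDatabase: true, reviewerConfirms: false))  // falsePositive
```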

1 Like

I don’t think it is hyperbole at all…what else would you call it? They’re taking it on themselves to examine photos on my iPhone for content. Yes…right now it is only for photos going to iCloud, but switching the scan to all photos, period, would be trivial…and the user has no ability to turn it off. This allows Apple to invade a user’s privacy…period…and the fact that I don’t have kiddie porn on mine is irrelevant. Once this door into our devices is opened…government will jump in and try to expand the door to iMessages or whatever else is on the phone…and Apple can no longer say it just isn’t possible.

Even if we trusted government and law enforcement not to abuse the power…and I think we’ve seen sufficient evidence already that no government of either party is trustworthy…the capability, once it exists on the phone for government, will get discovered and abused by bad actors.

The only practical solution…and I admit it is tough and law enforcement won’t like it…is to encrypt everything on the iPhone…period…and then Apple can legitimately say ‘we can’t do that’. Take a stand on privacy…this is backing away from what they’ve historically done.

3 Likes

Google has been screening and blocking kiddie porn for years:

Apple does not automatically load this new feature as a default; it’s an option that must be turned on by parents.

So let me see if I understand.

Hypothetically, my precious 16-year-old daughter sends explicit nudes to her 19-year-old boyfriend. Not in any database; parents never find out.

She subsequently breaks up with said boyfriend, and he posts her nude photos to a porn site. Do the photos end up in any law enforcement database?

How can said porn site know if she is 16 or 18?

Is the government coming after the 19-year-old for his extensive collection of teenage girls engaged in sexual acts? Should he be charged with statutory rape for consensual sex? Should he be branded for life as a Sex Offender and be in a national database?

Isn’t there a difference between teenage sex and kiddie porn? Is the age of consent, at 18, too high for our current society?

I am all for locking up real perverts who get off on looking at or touching prepubescent children and abusing them sexually. They are too young to understand. It is a horrible problem, probably not nearly as extensive as some in the media contend, but it nonetheless exists and causes damage to the victims.

So, if Apple is going to find the real perverts with photos of real babies and real children being exploited, that’s a very good thing.

But my hypothetical daughter at 16 might be ready for sex and intellectually able to consent. Maybe her parents already know and helped her get birth control?

Maybe your 18 year old daughter isn’t ready. Every child is different.

Do I want Apple poking around in the lives of teenagers? This is kind of a slippery slope.

Why not start searching for people on the FBI’s Most Wanted List by looking at millions of users’ photos? Pretty soon, they’ll sell it to Experian and TransUnion to find people who walked away from their debts.

Apple’s looking at photos is just an extension of facial recognition. And frankly, it troubles me, and I wonder why Tim Cook chose to do something so controversial. How about just knowing who’s vaccinated and who isn’t? That is a much bigger problem.

1 Like

Though she was over 19, remember what happened to Jennifer Lawrence and how traumatized she was?

The hacker was caught and had to serve time, but she still feels the effects to this day. A huge % of adult and kiddie porn distributed online is nonconsensual.

As a wise person once said to me:
“All will be revealed!”

David

The FBI has been doing this for years:

https://www.fbi.gov/wanted/seeking-info

And the FBI has a publicly available database:

https://vault.fbi.gov/

I couldn’t find any evidence of Experian or TransUnion stockpiling photos.

The fact that they will be having a human review the photos before reporting them is both reassuring and disturbing. Facebook has had people traumatized by looking at these kinds of photos, as well as killings and other things they have to remove. I hope Apple does this in a manner that lets the reviewers keep their humanity and mental health.