FAQ about Apple's Expanded Protections for Children

My main questions haven’t been answered: Why is Apple doing this? Why are they risking losing my trust?

Go read my new article on this and the first comment. I think @xdev hits it on the head. Apple didn’t intend to risk anything—they just completely failed to anticipate how it would be received.

Now that we know the details explained in that article, let’s discuss the state actor concerns over there.

2 Likes

Don’t get carried away by Apple’s sophisticated implementation.

My question remains: Why does Apple do CSAM detection at all? CSAM is awful, but there are many similarly awful things in our societies.

Perhaps because Apple sees it as the morally right thing to do? It could be nothing more than that. There might also be legal reasons or political pressure, but Craig Federighi was pretty clear in saying that Apple did it now because they finally figured out how to do it in a way they felt addressed the awful problem of CSAM while maintaining user privacy.

4 Likes

You can repeat this as many times as you want, but that doesn’t make it true. Apple is not “installing a back door in the iPhone”. That is what the FBI wanted: a way to circumvent the phone’s encryption and access all the data (in 2016 they specifically wanted the limit on passcode attempts disabled so they could brute-force the passcode).

The CSAM detection is nothing like that. Are there concerns that a state actor could co-opt CSAM detection to identify people holding other, non-CSAM images? Yes. Is this a reversal of Apple’s previous position that it will not allow phones to be unencrypted by anyone other than the owner? No. Apple is scanning images that are being uploaded to its servers to ensure they don’t contain content it does not want to host. And as @Shamino has extensively documented, there are many safeguards in place to ensure that Apple cannot view the images unless they are extremely likely to contain illegal content. (As a side note, anyone who is using iCloud Photo Library is already storing all their images on Apple’s servers without end-to-end encryption, which is much more of a privacy risk.)

Making broad generalisations that any sort of notification of illegal content being uploaded is the same as allowing the FBI to bypass a phone’s encryption only undermines the genuine concerns that exist around this topic.

3 Likes

Give it up, dude. I agree with you, but nobody here is going to change their mind no matter what we debate or how convincing the other side thinks their position is. Besides, Apple is going to do what Apple is going to do. Even if the hash database is signed by Apple and shipped as part of an iOS update so it can’t be maliciously replaced, when China passes a law requiring Apple to include hashes of the images the Chinese government cares about, Apple will follow local law and add the Chinese database alongside the NCMEC database. Apple isn’t going to leave the Chinese market, and it really can’t move its manufacturing elsewhere either, so it will submit to that coercion. You’re right that it isn’t about not trusting Apple; they’re still better and more secure than Android. But implementing this tech can and will have all sorts of unknown consequences. I’m sure they’ve debated it to death at the executive level and made their decision, and users are just stuck with it or can switch to a platform with a worse security model.

2 Likes

@neil1, I was initially going to argue that it’s not quite that bad: plenty of lively back and forth is still going on, and there’s always a chance Apple could come around. But recent events have proved me naive.

There is a level of vitriol being spewed at those who dare openly challenge the party line that I find unprecedented. As others have put it much better, “we’re so sorry that you’re so stupid that you don’t understand how right we are.” I’m tired of this. I don’t understand why liking Apple products has to be equated with unequivocal approval of everything the company does, with a constant urge to excuse its every mistake, or with pointing out supposed silver linings to divert attention from BS some people are rightly upset about.

I’m tired of this “PEZ head” attitude, and I won’t submit to it or be bullied by its proponents either. Plenty of people with actual credentials (take Howard Oakley or the EFF) have come down on the critical side of this, so why should I take any **** from anonymous posters with, as far as I can tell, zero credentials, just because they don’t like Apple facing scrutiny? Nobody would behave this way on the campus where I work, so why should I tolerate it online? Well, I won’t. I’m done with this “debate”.

2 Likes

In addition to what Adam said, Apple has been criticized for years for not doing anything to protect children from predators and abuse. The NCMEC database of known CSAM is acknowledged to be one of the best, if not the best, resources for identifying child pornography, and it also has a good reputation for its security.

If Apple can help prevent the distribution, and maybe the production, of harmful images, it’s AOK with me.

1 Like

I think most of us here value the issues being fully stretched out for examination, which cannot happen without Apple’s line being thoroughly questioned. The issues are complex; trade-offs, workarounds, and absolute positions all need elaboration on a nexus of topics as important as this one. Folks will end up making their own call on this; it’s not so much a matter of persuading anyone, or for that matter berating anyone, which shouldn’t happen. The last few days here have been very informative, and while some of you in particular have been in the fray, be assured that others reading have valued your contributions.

3 Likes

I understand, and feel similarly. I’m really tired of being preached at and patronized. Henceforth, I will generally limit my comments to links to folks like Mr. Oakley or the EFF.

2 Likes

First, I’ve seen supposed quotes from Apple’s announcement saying that Apple is targeting “inappropriate child sexual abuse material”. That raises the question of why ANY CSAM could be considered “appropriate”!

Second, in the recent “Kibbles & Bytes” (#1164) from Small Dog Electronics, Don Meyer points out the major danger of this move by Apple. I quote his third paragraph:

“This scanning raises questions in my mind regarding Apple’s commitment to privacy. Yes, we can all agree that sexual exploitation of children is evil and that there are some that should be locked up. But my concern is first Apple scans for this and, next, is it scanning for political images in the search for “terrorists”, scanning for undocumented workers or other uses. The pressure from government to do this scanning will expand, it is inevitable. Perhaps there are other ways to catch and prosecute those that perpetrate this heinous crime.”

If you’ve seen “supposed quotes” using the term “inappropriate,” you should direct your criticism at whoever used that word, because Apple did not, at least in its published materials, as is easily checked with a search. I don’t have transcripts of all their executive interviews, so it’s possible someone slipped and used the word.

This concern has been addressed already, multiple times. CSAM detection can only be used to match photos being uploaded to iCloud Photos against known images, only once there are at least 30 matches, and only after the matches pass an Apple human reviewer who is looking for CSAM images and nothing else. It could conceivably be subverted at the state-actor level to identify people holding known photos of something like Tank Man in China, but it cannot be used for surveillance in any effective way.
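To make that gate concrete, here is a minimal sketch with invented type and function names (this is not Apple’s code): matches against known images accumulate as vouchers, nothing is actionable below the 30-match threshold, and even past the threshold a human reviewer looking only for CSAM decides whether a report goes to NCMEC.

```swift
// Hypothetical sketch of the reporting gate described above. All names are
// invented; only the 30-match threshold and the human-review step come from
// Apple's published description.
struct MatchVoucher {
    let photoID: String
    let matchedKnownImageHash: String
}

let matchThreshold = 30  // Apple's stated threshold before any review occurs

func shouldEscalateForHumanReview(_ vouchers: [MatchVoucher]) -> Bool {
    // Below the threshold, no individual match is visible or actionable.
    return vouchers.count >= matchThreshold
}

func disposition(vouchers: [MatchVoucher], reviewerConfirmsCSAM: Bool) -> String {
    guard shouldEscalateForHumanReview(vouchers) else {
        return "no action: threshold not reached"
    }
    // The reviewer looks only for CSAM; anything else is treated as a false positive.
    return reviewerConfirmsCSAM ? "report to NCMEC" : "no action: false positive"
}
```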

The Communication Safety in Messages feature does scan images for sexual content, but it does so entirely on the iPhone and alerts the user and potentially the parents, and does not report anything to Apple or any other party.
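A hedged sketch of that flow, with invented names and a simplified rule set (the under-13 condition for parental notification reflects Apple’s original announcement): every outcome stays on the device, and nothing goes to Apple or any other party.

```swift
// Simplified, hypothetical model of Communication Safety in Messages.
// The classifier runs on-device; nothing here reports to Apple.
enum CommunicationSafetyOutcome {
    case deliverNormally
    case blurAndWarnChild
    case blurWarnChildAndNotifyParents
}

func handleIncomingImage(classifierFlagsSexualContent: Bool,
                         childIsUnder13: Bool,
                         childChoosesToViewAnyway: Bool) -> CommunicationSafetyOutcome {
    guard classifierFlagsSexualContent else { return .deliverNormally }
    if childIsUnder13 && childChoosesToViewAnyway {
        // Parents of younger children can be notified; Apple is not.
        return .blurWarnChildAndNotifyParents
    }
    return .blurAndWarnChild
}
```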

2 Likes

It may indeed, but that’s true of all technology and all change in the world in general. There’s no stopping the train.

1 Like

True; however, I believe Don’s point is that if software can be written to scan for CSAM, then it could very easily be modified in the future to search for material of the other types he mentioned.

Or locking the barn door after the horses have escaped.

Software can be written to do anything, but the CSAM detection feature does NOT scan for CSAM. It matches hashes of images being uploaded against a database that Apple created from the intersection of hash lists of known CSAM images from the National Center for Missing and Exploited Children and at least one other child safety organization in another country. It’s an image matching system for known CSAM images, not an image scanning system.
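As a rough illustration of “matching, not scanning,” here is a sketch with an invented toy hash function and names; Apple’s real system uses NeuralHash and a blinded matching protocol, but the structural point is the same: an image that isn’t already in the intersected database can never match.

```swift
import Foundation

// Toy stand-in for a perceptual hash. A real perceptual hash (e.g. NeuralHash)
// is robust to resizing and re-encoding; this trivial digest is not.
func toyPerceptualHash(_ imageData: Data) -> String {
    let digest = imageData.reduce(UInt64(0)) { ($0 &* 31) &+ UInt64($1) }
    return String(digest, radix: 16)
}

// Only hashes present in BOTH organizations' lists are used, per Apple's description.
func buildMatchDatabase(ncmecHashes: Set<String>,
                        otherOrgHashes: Set<String>) -> Set<String> {
    return ncmecHashes.intersection(otherOrgHashes)
}

// No content analysis happens here: the check is pure set membership.
func matchesKnownCSAM(_ imageData: Data, against database: Set<String>) -> Bool {
    return database.contains(toyPerceptualHash(imageData))
}
```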

3 Likes

Apple has now delayed this feature and promises to make improvements.

1 Like