FAQ about Apple's Expanded Protections for Children

Apple has now posted its own FAQ. We’re pondering how to integrate it into our coverage.

4 Likes

I don’t agree. Apple has responded (often in significant ways) to pushback on policies. But the reason to have a discussion here isn’t to get Apple’s attention. It’s to understand what’s going on and hash out opinions about what we think.

3 Likes

I strongly suspect that coverage like ours and discussions like this prompted Apple’s FAQ this morning. If they’d been planning to release it all along, it would likely have come with the original materials.

So yes, Apple does pay attention. Not necessarily to anything specific or any particular complaint, but to the general tenor of reactions and discussions. I wouldn’t be surprised if Apple’s FAQ expands as well, as the people in charge feel the need to respond in more detail or to other concerns.

4 Likes

I’m reminded of the performer who is trying to keep multiple plates spinning atop poles. As he dashes from pole to pole to adjust the spin, our eyes are drawn to the plates most in danger of falling, not the ones successfully spinning.

That seems to be our nature – looking for the worst to happen.

And yes, the worst does sometimes happen, but if we resist trying to make something better because our actions might have consequences we did not intend, then we will never get better.

I applaud Apple for making the attempt, and have confidence that they will do what they can to minimize unintended consequences. Will it be enough? Only time will tell.

1 Like

Ben Thompson’s Stratechery column hits some of the same issues we’re talking about here: the difference between capability and policy. In the past, Apple said it had no capability to compromise the privacy of your device; now it’s saying that it has no policy to do so.

https://stratechery.com/2021/apples-mistake/

2 Likes

Absolutely. I guarantee you that Tim Cook gets regular briefings on the general tenor of press coverage, forum discussions, social media, complete with quotes.*

*Multiple family members in PR and PR-adjacent industries as references.

1 Like

It’s disturbing. I think this paragraph from the article gets to the heart of the problem:

Given that only a very small number of people engage in downloading or sending CSAM (and only the really stupid ones would use a cloud-based service; most use peer-to-peer networks or the so-called “dark web”), this is a specious remark, akin to saying, “If you’re not guilty of possessing stolen goods, you should welcome an Apple camera in your home that lets us prove you own everything.”

3 Likes

Child sexual exploitation IS a current problem.
The “slippery slope” issue is a potential problem.

It comes down to balancing the right to privacy with the rights of the victims of unspeakable crimes. In this case I agree with Apple’s decision to do what it can to protect children.

2 Likes

Folks, I agree with both points of view stated in this comment section, and we should help protect kids who can’t really protect themselves. BUT…

In my belief, Apple doing what it has proposed is tantamount to making everyone who owns an Apple device “guilty until proven innocent.” In my estimation this is effectively akin to an “unreasonable” search: warrantless. The perverts who trade in or engage in such heinous activities should be dealt with forcefully, but that doesn’t give anyone the right to invade my privacy, papers, property, and things, no matter how well intentioned the process.

Whatever happened to “being secure, in your person, papers, things…”?

Amendment IV

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

2 Likes

I’m viewing the iCloud Photos NeuralHash feature less as keeping individuals safe, or persecuting baddies, and more as an attempt to keep such material off iCloud Photos servers.

Framed as “if you want to pay to copy your files onto our server, then it’ll be NeuralHashed first,” it seems… almost acceptable-ish.

And over time, if the hashes are on the iOS devices, I would not be surprised if Safari and other programs that can upload data also NeuralHash data before uploading. Not saying that I am ok with that.
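For anyone trying to picture the mechanics being discussed, here is a deliberately simplified sketch of the general “hash before upload” idea. Everything here is hypothetical: the function names are made up, and a cryptographic hash (SHA-256) stands in for the real thing. Apple’s actual NeuralHash is a perceptual hash computed by a neural network, and the real system uses blinded matching with a threshold rather than a plain client-side lookup.

```python
import hashlib

# Hypothetical blocklist of known-image hashes (placeholder values only).
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def illustrative_hash(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. SHA-256 is used purely for
    illustration; a real perceptual hash tolerates small image edits,
    which a cryptographic hash does not."""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_blocklist(photo_bytes: bytes) -> bool:
    """Sketch of a client-side check run before a photo is uploaded."""
    return illustrative_hash(photo_bytes) in KNOWN_HASHES
```

In the announced design, notably, the client never learns whether a photo matched; the match result is encrypted and only becomes readable on Apple’s servers after a threshold number of matches is reached.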

That applies only to the government. With a private company like Apple, you’d be agreeing to the system by virtue of the legal agreements you enter into when using the device.

1 Like

Of course in this case it’s a bit more nuanced, because Apple is implementing technology and company procedures that amount to an extended arm of government (law enforcement and prosecution).

Several people have already pointed out that Apple could well be implementing this in order to preempt possible government action. If that is indeed the case, this IMHO moves us even closer to 4th Amendment territory.

1 Like

Here’s another interesting take on the situation:

https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

1 Like

That doesn’t make it right.

“Right” is a different question. The point is that the Fourth Amendment explicitly refers to the government. @Simon is suggesting there may be more of a connection because NCMEC has special status within the government, and that may be true, but that’s an issue for experts in Constitutional law to hash out. And we aren’t. :-)

Yes, private companies are not bound by the Bill of Rights.

Unless they are acting on behalf of or at the request of a government agency.

This is one of the big issues in the news today regarding social media companies censoring what their customers post. They have been saying that as a private company, they can block or allow anything they want for any reason they want. But when news recently broke implying that government agencies are working with them in order to determine what “misinformation” should be blocked, that changes the game.

It’s no longer a clear-cut case of a private company doing what it wants on its own systems, but is now a case of the government ordering (strongly suggesting?) that they do so. Which means the Constitution now applies. The government can’t dodge its Constitutional obligations by telling a private corporation to do what it isn’t allowed to do.

2 Likes

With a warrantless government search, you are helpless (with protection only from the court system, you hope). With this action by a private company, you still have choices: don’t use iCloud Photos, use another mobile device that doesn’t search all of your photos, or don’t use a connected device at all. And, of course, use your voice to make your displeasure known.

Then Apple should just have said so. As I said before, I’m quite fine with them hashing anything on their servers and even turning matches over to the cops, as long as they tell users that they’re going to do that and either grandfather in images uploaded before the announcement or offer users an opportunity to delete them from iCloud Photos before the hashing starts. Either of those would protect the privacy users thought they had before the new hashing.

1 Like

Then they should say that this is part of a long-range plan, or at least leak it in some not-to-be-attributed fashion, as they probably already do a lot. I know that they don’t talk about unannounced products, but informing users of something coming down the pike would be nice, even if they couched it in a “this is our current plan, no firm date and no firm commitment because it might not work out” vein.

1 Like

And therein lies the problem…policies can be changed a whole bunch easier than capabilities.

1 Like