Apple to install backdoors in iOS, macOS this year

You had me totally in agreement until the second to last word.

Whatever are people who play the card game Bridge going to do to come up with a new name for a wild card suit?

Do some due diligence on the right to privacy in this country before making such broad statements.

1 Like

But the app is not turned on by default. A parent needs to activate it, and they can choose whether they will or they won’t. The app is looking for images of nude and physically violated children that match what has been loaded by authorities into a respected database. And I don’t see how this violates privacy, or that it is a harbinger of privacy violations to come.

This is very different from Facebook scanning images to better target advertising. IMHO, Apple is protecting children while still protecting privacy. And Apple is sending a warning to child abusers that they should not even think about using Messages to send porn to kids.

That’s the Messages feature. I was talking about the scanning of photos for child porn, which is not activated by a parent.

Would it bother you if someone from Apple came into your house to look for analog pictures of child porn without asking your permission? This is exactly the same.

2 Likes

No, because Apple only performs the check if you’re trying to load the pictures onto Apple’s servers.

Dave

1 Like

So they say now. They can easily modify the backdoors they’re installing.

2 Likes

That’s not the same at all. Apple isn’t actually looking at the pictures. It’s all done on your phone and it is just computing a hash of the photo, not analyzing the content. And it’s only comparing against specific illegal photos, not looking for pics of your kids in their birthday suits or any random porn.

A more apt comparison might be checking the serial number of the handgun you’re selling against a database of guns used in crimes (where serial number = image hash, database = CSAM database, and selling = you uploading image to iCloud).
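
If it helps to make that concrete, here’s a minimal sketch of the membership-test idea. SHA-256 stands in for Apple’s actual perceptual NeuralHash, and names like `knownHashes` are invented for the example, not taken from Apple’s code.

```swift
import Foundation
import CryptoKit

/// Compute a fingerprint of the photo's bytes. Apple's real system uses a
/// perceptual hash (NeuralHash); plain SHA-256 is used here only to keep the
/// sketch simple and runnable.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
}

/// The device never "looks at" the picture in any human sense; it only asks
/// whether the fingerprint appears in the list of known illegal images.
func matchesKnownDatabase(_ photoData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: photoData))
}

// A photo of your own kids produces a fingerprint that simply isn't in the set,
// so nothing about its content is examined or reported.
let knownHashes: Set<String> = ["placeholder-hash-1", "placeholder-hash-2"]
let photo = Data([0x01, 0x02, 0x03]) // stand-in for real image bytes
print(matchesKnownDatabase(photo, knownHashes: knownHashes)) // false
```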

1 Like

If you read Apple’s documentation, it wouldn’t be easy at all to modify this system. It would have to be entirely redesigned.

For example, your own phone cannot even tell if you hit the threshold and uploaded the flagged images – it happens automatically and can’t be started or stopped by Apple or the user.

Yes, Apple could rewrite this to work differently, but why would they go to this much trouble to produce an extremely complicated, private, automatic system just to change to something simpler later? That makes no sense.
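
For the curious, here’s a toy illustration of what “threshold” means here. It’s textbook Shamir secret sharing over a small prime field, not Apple’s actual protocol, and every name in it is invented for the example. The only point is that a secret split this way can be reconstructed once enough shares exist, and not before.

```swift
import Foundation

let p = 2_147_483_647 // prime modulus (2^31 - 1); real systems use far larger fields

// Modular exponentiation, used below for modular inverses via Fermat's little theorem.
func modPow(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1, b = base % mod, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<p) }
    return (1...count).map { x in
        let y = coeffs.reversed().reduce(0) { ($0 * x + $1) % p } // evaluate polynomial at x
        return (x, y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret, but only with at least
// `threshold` distinct shares; fewer shares yield an unrelated value.
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * ((p - sj.x) % p) % p        // (0 - x_j) mod p
            den = den * ((si.x - sj.x + p) % p) % p // (x_i - x_j) mod p
        }
        secret = (secret + si.y * num % p * modPow(den, p - 2, p)) % p
    }
    return secret
}

let shares = makeShares(secret: 424242, threshold: 3, count: 5)
print(reconstruct(from: Array(shares.prefix(3)))) // 424242 (threshold met)
print(reconstruct(from: Array(shares.prefix(2)))) // an unrelated number (below threshold)
```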

1 Like

It’s more like Apple coming into your house to look through all your pictures without asking your permission.

This is analogous to warrantless review of your location data, your emails (content or metadata), your browsing history, your app usage, etc. Stopping this is supposedly the basis for the privacy protections we have in this country.

1 Like

They write the OS. They can easily do anything they want without saying much about it. If you aren’t willing to trust a company that writes an OS, you’re pretty much stuck with Linux.

The guarantee that Apple is not scanning all local folders is the same as the guarantee that your phone isn’t sending plaintext copies of your messages to the government, or the guarantee Apple isn’t keeping a database of all of our bank passwords. At some point you either have to trust the OS vendor or significantly limit your computer capabilities.

Dave

6 Likes

We’re not likely to know the real answer on this, as Apple is unlikely to allow outside inspectors to look at the code. I was referring to the implanted capability itself (downloading database updates and using device CPUs to process user inputs without user control), as distinct from the CSAM database hash comparisons themselves.

What makes no sense is Apple fighting the FBI tooth and nail to maintain user privacy, only to apparently switch philosophical positions completely a few years later. Unless, of course, they’re doing it because they deem it to be in their own financial interest. They’ve never been shy about alienating portions of their user base when they’ve seen fit.

1 Like

Excellent suggestion!

1 Like

Nope. Not at all in fact. You can repeat this as many times as you want, but that still doesn’t make it true.

Makes perfect sense. Easier to get broad acceptance now. Helps encourage people to go on boards to do what is essentially free PR for Apple: convince everybody that having our privacy threatened is in our own best interest. Then enjoy watching more people update to iOS 15 without asking too many questions. They don’t call it a slippery slope for nothing.

3 Likes

Exactly right. You’re not trying to sell anything, you’re using a major feature that Apple pushes at you, and they’re rummaging through your possessions looking for illegality. It’s precisely like coming into your house to look through your photos for something illegal. People would be up in arms if they tried to invade your house, and yet are remarkably blasé about them doing it to your phone.

“At some point you either have to trust the OS vendor or significantly limit your computer capabilities.”

Trust in a vendor comes from what they say and what they do. Apple is now saying and trying to do untrustworthy things and I’m making a judgment based on that.

2 Likes

Apple will be comparing photos in Messages to and from an existing and respected external database of objectionable and harmful material. It will only do so if one or more parents of vulnerably aged children have opted in. It’s not at all the same. I consider child sexual abuse to be extremely objectionable and dangerous, so I am focusing on that and on what Apple is doing to make Messages safer for children. Anything else, such as claims that the CSAM matching will be wildly inaccurate, or that foreign governments will demand scanning of text Messages, etc., is speculation. And Apple has made it clear that they will not be scanning or searching anyone’s apps or photo libraries, and that they will not be colluding with the US or any foreign government in ways that would require breaking Apple’s privacy initiatives.

Apple also emphasized that they will not be breaking into anyone’s photo libraries and, I think it’s safe to assume, anyone’s home. They will only be checking photos in Messages of participating kids whose parents signed up for this, to find any matches to a digital hash of acknowledged porn from a highly respected database. Any and all matches will be scrutinized and verified by a trained human.

Personally speaking, I got my two Covid shots as soon as I could make a reservation, although a very significant percentage of the US population considers the shots to be an extremely slippery and dangerous slope. Though I am not a parent, if I were, I’d consider this new feature to be similar to a vaccine that would help protect my kid from sexual abuse.

Actually this makes a lot of sense. This is a way for Apple to monitor for one specific type of illegal content, without Apple or the government having the ability to decrypt all of your conversations and other data.

That is—it may well be that Apple felt that the alternative to having a feature like this was going to be facing a legal requirement to give the government the ability to read all users’ encrypted data. This is arguably a way to monitor for CSAM that provides a lot more user privacy than what law enforcement would really like to have.

Dave

2 Likes

Uh, MM, the Messages thing is only one part of it.

The second part is that Apple will go through the photos of anyone using iCloud photo sync, looking for known illegal material involving child sexual abuse. They will do this without permission from the owner of the phone, and if illegal material is detected and verified, they will notify the police.

You need to reread the announcements.

2 Likes

The 4th Amendment has routinely been interpreted as protecting a right to privacy… it is how abortion became legal.

2 Likes

This is not true: refer to Apple’s own document announcing these initiatives. To quote: “Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit.” This has nothing to do with comparisons to existing CSAM databases, as it also covers perfectly legal pornography, the totality of which, in my opinion, cannot be contained in any database. While I’m all in favor of preventing exposure of children to either legal or illegal porn, I have to wonder how well my phone’s “machine learning” is going to handle pictures of naked winged cherub statues pouring water, or photos of my relative’s young children, or even teenagers in bikinis. Or adults holding hands. Perhaps Apple will ensure no more than a one-in-a-trillion “false positive,” because they are such coding and quality control aces.
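
To make that distinction concrete, here’s a rough sketch (all names and the cutoff value are invented; nothing here is Apple’s actual code or API) of why a classifier-based check is a different animal from a hash lookup:

```swift
enum ScreeningResult { case flagged, clear }

// Mechanism 1: the iCloud CSAM check is an exact membership test against known
// hashes. A photo either is in the database or it isn't; there is no "close enough."
func hashCheck(photoHash: String, knownHashes: Set<String>) -> ScreeningResult {
    knownHashes.contains(photoHash) ? .flagged : .clear
}

// Mechanism 2: the Messages feature is a machine-learning classifier. The model
// returns a confidence that an image is explicit, and a cutoff turns that into a
// yes/no. Images scoring near the cutoff (statues, beach photos, bath-time shots)
// are exactly where misclassification can happen. The 0.9 cutoff is made up.
func classifierCheck(explicitConfidence: Double, cutoff: Double = 0.9) -> ScreeningResult {
    explicitConfidence >= cutoff ? .flagged : .clear
}

print(classifierCheck(explicitConfidence: 0.87)) // clear, but only just
```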

1 Like

I could not disagree more. Refer to the EFF document I quoted when starting this thread, and/or Glenn’s excellent (and extremely moderate) discussion elsewhere.

1 Like