That makes the situation worse, not better. Resistance to the Chinese government is not limited to people in China. Apple is essentially handing China access to a tool that currently surveils American citizens and, as it expands, will surveil people in other countries. The Chinese government already does a lot of this in the US. You really want to risk handing them this ability?
As to the servers, China is likely already doing this with the iCloud servers located in that country, but it’s much more difficult to do it on servers elsewhere. Why should they go to the effort when Apple is creating a useful tool for them?
Your metaphor about the gun database fails because selling a gun is not parallel to uploading an image to iCloud, nor do the buyers of the gun come to your house and check the serial numbers of all the guns you own.
It’s not “your” photo; the app generates a hash for a photo that was sent to a child’s iPhone on your family plan and checks whether it matches a CSAM image in the National Center for Missing and Exploited Children’s database. A human intervenes only if and when an element of doubt arises about the match. Only then does a human look at the suspicious photo that was sent to your child to determine whether it is a valid match. If there is a match, they raise flags for the parent and the kid.
Does it bother you that Google tracks and stores your location history from your iPhone, builds profiles of you and everyone in your family across many of its services, and uses the data it accumulates to sell ads?
If you, your child or other family members or friends have Google apps on your iPhone, or use Google services via the web, you are being tracked.
Again, you are conflating two separate programs that Apple announced together. Starting sometime in iOS 15, all US accounts that use iCloud Photos - adults and children alike - will have the hash fingerprint comparison done. Please read something like Josh’s article: FAQ about Apple's Expanded Protections for Children - TidBITS
There is a separate, opt-in program for family accounts that has nothing specifically to do with CSAM material. It allows a parent/guardian in a family plan to enroll the plan’s children in a program that uses on-device ML to scan all photos received in the Messages app, and photos they attempt to attach to messages, to see if a photo may be sexually explicit. None of these photos will be seen by anyone at Apple or reported to the police. Instead, the feature warns the child and lets them choose not to attach or view the photo, and alerts the parents if the child chooses to attach it (or view a received photo) anyway. The parents will not see the photo. (It also prevents those questionable photos from being deleted from the child’s device, presumably so that a parent/guardian can see what the material is.)
“Apple will be rolling out new anti-CSAM features in three areas: Messages, iCloud Photos, and Siri and Search. Here’s how each of them will be implemented, according to Apple.
Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
Siri and Search: Siri and Search will provide additional resources to help children and parents stay safe online and get help with unsafe situations.
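To make the iCloud Photos step above concrete, here is a minimal sketch of “fingerprint the image, then check it against a known list before upload.” Note the assumptions: Apple’s real system uses a perceptual NeuralHash plus a private set intersection protocol (so neither the device nor Apple learns individual match results); this stand-in uses a plain SHA-256 lookup and made-up sample bytes purely to show the shape of the check.

```python
import hashlib

# Hypothetical stand-in for the database of known CSAM fingerprints.
# The real system ships blinded NeuralHash values, not raw SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"example known image").hexdigest(),
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Derive a fingerprint for an image (SHA-256 here for illustration;
    a perceptual hash would survive resizing/recompression, this won't)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_list(image_bytes: bytes) -> bool:
    """Check the fingerprint against the known list before upload."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The key difference in Apple’s published design is that a simple local lookup like this would reveal each match immediately, whereas their protocol is built so no one learns anything until a threshold number of matches is crossed.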
My thinking is that the matching process takes place before it is stored in iCloud Photos to keep it from being saved prior to review. It makes a lot of sense to me. And it’s parents who have opted in for themselves and to monitor their children, not the iPhone owning public at large.
Actually…just disabling iCloud sync on your photos easily solves the immediate problem. One thing that disturbs me…well, another thing…is what’s the next iteration of this move to look good compared to the FBs, Twitters, and other upright corporate citizens helping to rid the world of the scourge of these evil images…and that gets back to exactly why Apple is doing this. They’re at least as smart as I am, probably smarter in total since they have many people thinking about it…and it is obvious that this is a mostly meaningless feel-good PR objective unless they have a grand plan of which this is the first step.

So what happens next? Dunno…but speculating here…they could decide to broaden their hashing criteria and scan all photos on the iPhone for such material to increase the catch rate from very small to small, regardless of the user’s permission…or they could be forced by the government to do so. They could enable full encryption of all iCloud data…maintaining user privacy and legitimately being able to tell the government “sorry, no can do”…but then the small portion of said material that might get caught by the AI/hash filter (as opposed to all of the homegrown images that won’t be, since they haven’t been verified illegal and hashed) would no longer be caught. Or they could be taking what a significant portion of users and privacy people or organizations think is a step onto that slope we’ve been talking about…which may be at the behest of government arm-twisting or a wink-wink “we won’t try to outlaw unbreakable encryption”…or a whole series of other possibilities.
As I said…they’re smarter than I am, and hopefully they’ve thought it all the way through from both the short-term PR perspective and the privacy-is-in-our-DNA perspective…and have determined for themselves how far they’re willing to go. As it is…they’re breaking their previous policy and word that back doors are a no-no…and I don’t like it one bit. I said way back at the beginning of this debate that they should be doing the hash on their end, and I would be happy with that…since as long as iCloud isn’t E2E encrypted I personally have no expectation of privacy for my stuff on their servers…and I think doing it on their end provides better privacy for my device.

In addition…the link that Adam (I think) posted indicated that Apple would be transmitting a highly suspect illegal image to themselves for verification, and that such a transmission is expressly prohibited by US code, as such images can only legally be sent to NCMEC. IANAL, of course…but the quoted portion of the law seemed pretty clear and unambiguous to me…so how does Apple get around that legal requirement? If they’re doing so at the behest or suggestion of LEO and have immunity…then they’re acting as an agent for the government…and the 4th Amendment applies…making this a warrantless and unconstitutional broad search.
I dunno…I appreciate the concern they’re trying to show…but it still smacks of lowering their standards for a mostly meaningless gesture.
I also keep thinking I should stop commenting…they’re going to do what they’re going to do regardless of posts on TidBITS…they might read them and post some more wishy-washy FAQs…but are very unlikely, IMO, to change course because of them.
Not really. People won’t be willing to give up on smart phones altogether and the only alternative is Android which is already worse.
I know there are a few open source mobile operating systems, but I don’t know of any modern phones that can boot them. (A few seem to be runnable as a layer over Android, but it’s very unclear if or how you could install them on a modern Android phone or if it will actually be free of Google’s ecosystem).
I’m sure some people will be willing to switch back to dumb feature phones, but I don’t think there will be enough for Apple and Google to even notice.
What happens in Android stays on Google’s humongous tracking database that links all their ad sales services together. And even though Google tracks everything and anything, people don’t seem to mind.
No one is coming into your house to check the serial numbers: you’re taking the action of offering the gun for sale, so then the serial number is checked. If you don’t want to sell the gun/upload the photo, then no serial numbers are checked and no one goes into your house looking for anything.
That’s exactly like not using iCloud Photos and not uploading photos to the cloud.
China’s economy is second only to that of the US. But for quite some time I’ve been reading that China is likely to eclipse the US economy in the not-distant future. If Apple wants to maintain its position as the world’s most valuable company, it cannot afford to alienate or ignore China. Microsoft is nipping at Apple’s heels:
That makes even more sense then…my 1-in-10,000 figure seemed awfully high. After all, hashes are used in enterprise dedup storage arrays to determine whether it’s even worth copying data to the server, and enterprises hate losing data.
So that means that pretty much every photo decrypted to check for CSAM will actually contain CSAM - and maybe sometime in the next millennium we’ll see a falsely flagged account.
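To put a rough number on that intuition, here is a back-of-the-envelope binomial sketch. The per-image false-match rate, photo count, and flagging threshold below are illustrative assumptions of mine, not Apple’s published figures (Apple’s stated design target was reportedly about a one-in-a-trillion chance per account per year).

```python
def prob_account_flagged(p_false: float, photos: int, threshold: int) -> float:
    """Probability that an account uploading `photos` innocent images
    accumulates at least `threshold` false matches, assuming each image
    independently false-matches with probability `p_false`.
    Binomial upper tail, computed as the complement of the lower tail
    using plain floats to avoid huge intermediate integers."""
    q = 1.0 - p_false
    term = q ** photos          # P(exactly 0 matches)
    lower = term
    for k in range(1, threshold):
        # ratio of successive binomial pmf terms: P(k) / P(k-1)
        term *= (photos - k + 1) / k * (p_false / q)
        lower += term
    return max(0.0, 1.0 - lower)

# Illustrative assumptions: 1-in-a-million per-image false match rate,
# 10,000 photos uploaded, flagging threshold of 30 matched images.
# Expected false matches is only 0.01, so reaching 30 is vanishingly rare.
p = prob_account_flagged(1e-6, 10_000, 30)
```

With these (assumed) numbers the result underflows to effectively zero, which is the point: a multi-image threshold makes a falsely flagged account astronomically unlikely even if individual hash collisions do occur.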
You can do a lot of projections about China, but most of them will be wrong because you’re not considering the stability of the country.
China gets away with a lot now, but even countries which are indebted to China for things like the Belt and Road Initiative are beginning to turn away from the craziness of the CCP.
The population of China is falling because the one-child policy is practically institutionalized, and Chinese society has been formed around its core principles. Most foreign manufacturers have begun to deeply mistrust manufacturing in China, and foreign investors are becoming disenchanted with China’s on-again/off-again support of technological industries. China’s civil rights are non-existent, and things like suppression of religions and minorities, up to and including forced organ harvesting, have made even the most hard-core capitalists squeamishly uncomfortable.
Which is why it would be a very bad idea to give them access to this kind of surveillance system on the wildly naive belief that the CCP would only use it within its borders.
That part of the Macworld article is incorrect (or perhaps badly worded).
From the same article, further down:
What you said earlier:
The communication safety in messages feature for minor family plan accounts is not using the same neural hashing against the NCMEC hashes as the iCloud Photos feature.
Except that if Apple creates a system whose scope is global, they might well.
I mean, that’s without even mentioning that the Tibetans would not agree with your comment, nor would the neighbors of any great power in history, including literally every single Latin American nation. The phrase “banana republic” literally comes from the US (when it was a rising power threatening the global superpower - sound familiar?) imposing its choices on non-American consumers.