Apple Announces Siri Privacy Reforms

Originally published at: https://tidbits.com/2019/08/29/apple-announces-siri-privacy-reforms/

After a whistleblower revealed that Apple contractors were listening in on Siri conversations, Apple shut down the program and promised improvements. Here they are.

I would guess that the “listening” involves a random sample out of millions if not billions of Siri voice streams. We users are always the beta testers finding bugs in software, so why not Siri? As long as Apple is using the data streams without any DNS or metadata attached, and improving Siri (and does it need improving!), why should I care?

Apple shouldn’t have been doing this, and it would be nice if the company followed Adam’s suggestion of automating the process so users could teach Siri. However, the audio snippets are anonymous. To be honest, this whole thing smells of a strategy of making Apple take the blame for what every other company also does… and does more egregiously. It looks like they found a way to make it so that whenever any user thinks about any voice assistant snooping on their audio, they instantly think “Apple and violation of privacy.” That obfuscates the fact that most of the other major privacy-violating companies are doing the same thing, only not anonymized and perhaps even aggregated with other data. Recent reports suggest that, like Android, Alexa exists primarily as a surveillance tool.


De-anonymization of large data sets is quite practical, as has been known for some time. Audio snippets would be even simpler, as the content is likely to leak significant information (names, locations, phone numbers, appointments) without even considering available technologies like voiceprint matching.
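To make the leakage concrete, here’s a toy sketch (entirely hypothetical data, nothing to do with Apple’s actual pipeline) of how quasi-identifiers spoken in an “anonymous” transcript can be joined against an outside data set to re-identify the speaker:

```python
# Toy illustration: re-identifying "anonymous" transcripts by matching
# quasi-identifiers (names, places) against an external record set.
# All names and records below are invented for the example.

# Transcribed snippets, stored with no account ID attached:
snippets = [
    {"id": "a1", "transcript": "remind me to call Dr. Alvarez about the Maple Street closing"},
    {"id": "a2", "transcript": "what's the weather like today"},
]

# A record set the attacker already holds (e.g., public property listings):
public_records = [
    {"name": "J. Smith", "address": "12 Maple Street", "agent": "Dr. Alvarez"},
]

def reidentify(snippets, records):
    """Link snippets to identities via quasi-identifiers leaked in the audio."""
    hits = []
    for s in snippets:
        text = s["transcript"].lower()
        for r in records:
            # Naive match: the snippet mentions both the agent and the street.
            if r["agent"].lower() in text and "maple street" in text:
                hits.append((s["id"], r["name"]))
    return hits

print(reidentify(snippets, public_records))  # [('a1', 'J. Smith')]
```

Real attacks use far more sophisticated linkage (and voiceprints), but the principle is the same: the content itself carries the identifiers.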

–Ron


Well, Apple has painted a large target on its back by saying that privacy is a fundamental human right. So it feels reasonable to point out when the company is failing to live up to that stance. Apple deserved the criticism and has apparently responded appropriately.

And there has been plenty of coverage of Amazon and Google doing exactly the same thing, some of it even here. Those companies also deserve the criticism and will hopefully adjust their practices as well.


I don’t disagree with your assessment, Adam. They tout privacy, they screwed up, they should be held accountable, and you outlined a much better way they could choose to proceed. I’m not sure I agree with the other statements about de-anonymization, though. While that may be true of genetic data and many other forms of data, an isolated audio snippet is unlikely to present a de-anonymization risk, depending on what metadata is associated with it. I suspect that Apple likely had little to none associated with it… but I don’t know that.

My main point was more about the suspicious timing of this announcement and how Apple’s enemies seem to have found a way to muddy the waters around privacy, so that Apple (the company that seems to care at least a little about privacy) is on everyone’s lips, rather than themselves, whenever the risks of digital assistants are mentioned. Little that Apple does at this point will change that in the minds of the general public, and I find that unfortunate and unfair. I find it unfair because I believe that what those other companies do is 1000x more invasive and intentional than Apple’s mess-up.


The original report came from a whistleblower talking to the Guardian, so I don’t know that there’s any suspicious timing in play. The Guardian had no reason to sit on the story and risk losing the scoop. And, with Apple making major announcements a few times per year, nearly any negative news is likely to land somewhere near an Apple announcement.

The question of whether Apple gets more negative press than other tech giants with worse privacy records is an interesting one. I’d love to see data on that. I suspect we feel that Apple gets singled out through selection bias—we’re paying attention to Apple-related sites and Apple in the mainstream news. But there are also vast ecosystems of sites that cover Android and Amazon, and we may simply not pay as much attention when they’re criticized in those spots or in mainstream media.


Google has now tightened the privacy surrounding its voice assistant grading program too.

https://www.wired.com/story/google-assistant-human-transcription-privacy/