Siri can be activated with the trigger phrase, “Hey Siri”. However, there are times when we say something that merely sounds like “Hey Siri”, accidentally triggering it in the process. The report goes on to claim that Apple sends some of our voice recordings to contractors for analysis, including recordings captured by these accidental activations.
In response to the Guardian’s report, Apple issued a statement that reads, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
However, accidental or not, some of these recordings can still be rather sensitive. A whistleblower working for one of Apple’s contractors said, “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
For a company that prides itself on privacy, this is definitely not a good look.