Apple contractors regularly listen to your recorded Siri conversations

They listen to confidential medical information, drug deals and couples having sex

Apple pays contractors to regularly listen to users' conversations with its virtual assistant Siri, according to an anonymous whistleblower who currently works with the firm. The revelation, reported by The Guardian, is a huge setback for Apple and its pro-privacy claims, which have been the company's most powerful weapon against Android phones.

According to the report, Apple contractors around the world listen to confidential medical information, drug deals, and even recordings of couples having sex. These contractors are tasked with "grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate".

However, Apple, in its statement to The Guardian, said the data is used to help Siri better understand and recognise what users say. "A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." It claimed that only a very small random subset, less than one per cent of daily Siri activations, is used for this purpose, and that these are "pseudonymised recordings".

Compounding the matter, Apple's users are kept in the dark about this sharing of recordings: the company does not explicitly disclose it in its consumer-facing privacy documentation. The whistleblower raised concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri activations

What is alarming in the whole episode is that Siri can be activated accidentally, often without the user's knowledge. For instance, Siri can wake up when it mistakenly hears its "wake word", the phrase "hey Siri".

According to the anonymous whistleblower, even "the sound of a zip" can trigger Siri. Worse, an Apple Watch automatically activates Siri whenever it detects it has been raised and then hears speech. In fact, Apple Watch and the company's HomePod smart speaker are the most frequent sources of mistaken recordings.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Not just Apple

With the whistleblower's revelations, Apple joins the likes of Google and Amazon, which were also found to have humans listening to voice assistant recordings. In separate instances, both companies admitted that their contractors were listening to recordings of conversations between users and their voice assistants, Google Assistant and Alexa.

However, as The Guardian notes, there is a difference: Google and Amazon let users opt out of some uses of their recordings, while Apple offers no similar choice short of disabling Siri entirely.