Those who work on quality control for Apple’s Siri voice assistant “regularly hear confidential details” about users, according to a contractor paid to grade Siri’s responses to recorded queries.
The news: The contractor told the Guardian that these workers routinely hear sensitive material, including drug deals, confidential medical details, and people having sex.
Why are they listening in the first place? Like Amazon and Google, Apple employs people to listen to a sample of users’ conversations with Siri, transcribe them, and grade Siri’s responses against a set of criteria: whether the assistant was activated deliberately, whether Siri could help with the query, and whether its response was appropriate.
However: Apple, again like Amazon and Google, does not explicitly disclose this practice in its consumer terms and conditions (which are virtually unreadable anyway). Apple prides itself on being a privacy-conscious company, so the revelation may prove more damaging for it than for other firms. And unlike the other two companies, Apple offers users no way to opt out of their recordings being used this way, short of not using Siri at all. Apple told the Guardian that fewer than 1% of Siri recordings are used for training and that they are not associated with a user’s Apple ID.
Do consumers care? There has been some online outrage over the practice and the fact that it happens without customer consent (which could make it illegal in the European Union), but adoption of voice assistants shows no sign of slowing.