Apple stops Siri grading program globally, future iOS update to provide opt-in/out controls

In response to privacy concerns raised a few days ago in an article by the British newspaper The Guardian, Apple has temporarily suspended its Siri grading program globally, saying that an upcoming iOS software update will give customers new controls to choose whether or not they want to participate in the grading process.

Apple gave this statement to TechCrunch's Matthew Panzarino:

We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.

The Guardian story shed light on the fact that Apple hires contractors to help with Siri's quality control process, which it calls grading. A tiny slice of the audio recordings of people's interactions with Siri is kept on servers for a short period of time to help Apple improve the service.

Panzarino explains:

The process of taking a snippet of audio a few seconds long and sending it to either internal personnel or contractors to evaluate is, essentially, industry standard. Audio recordings of requests made to Amazon and Google assistants are also reviewed by humans.

Apple says that less than one percent of daily Siri requests may be selected for the grading program.

These audio snippets are stripped of names, locations and the Apple IDs of the individuals involved. In its story, The Guardian cited whistleblowers who claim that some of the audio snippets contractors are tasked with reviewing can contain personal information, audio of people having sex and other potentially identifiable details, such as financial information.
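To make the reported process a little more concrete, here is a minimal sketch of what "select a small random fraction of requests and strip identifying metadata before human review" could look like. This is purely illustrative: the types, names and one-percent sampling rate are assumptions for the example, not Apple's actual pipeline.

```swift
import Foundation

// Hypothetical illustration only; none of these types come from Apple's code.
struct SiriRequest {
    let audioSnippet: Data   // a few seconds of recorded audio
    let userName: String?
    let location: String?
    let appleID: String?
}

struct GradingSample {
    let audioSnippet: Data   // only the audio survives anonymization
}

// Randomly keep roughly one percent of a day's requests for grading.
func sampleForGrading(_ requests: [SiriRequest], rate: Double = 0.01) -> [SiriRequest] {
    requests.filter { _ in Double.random(in: 0..<1) < rate }
}

// Drop the name, location and Apple ID before a snippet reaches a reviewer.
func anonymize(_ request: SiriRequest) -> GradingSample {
    GradingSample(audioSnippet: request.audioSnippet)
}

let todaysRequests: [SiriRequest] = []   // placeholder for a day's traffic
let gradingBatch = sampleForGrading(todaysRequests).map(anonymize)
print("Snippets queued for human grading: \(gradingBatch.count)")
```

As the whistleblowers' claims suggest, stripping metadata this way does nothing about identifying details spoken inside the audio itself, which is the crux of the privacy concern.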

This may happen regardless of the process Apple uses to anonymize the recordings, because it's impossible to control what people will say during their Siri interactions. True, Apple's terms of service clearly state that by using the Siri or Dictation features you consent to "Apple's and its subsidiaries' and agents' transmission, collection, maintenance, processing and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation and dictation functionality in other Apple products and services."

But privacy warriors say that's not enough, not just because no one reads these legal texts but also because Apple's terms fail to state plainly that some live recordings of your Siri conversations may be transmitted and listened to by human reviewers.

The likes of Amazon and Google run similar programs because that's the nature of machine learning: humans have to grade and test the algorithms' output, since that's the only way the assistants become better and more accurate over time. That being said, Apple carries perhaps the biggest burden to be as forthcoming and transparent about its assistant grading program as possible, especially since it has turned privacy into a marketing tool.

Apple probably feels the same, otherwise it wouldn't have temporarily suspended the Siri grading program over The Guardian's article. No matter how you look at it, releasing an iOS software update that brings opt-in/out controls is a step in the right direction, don't you think?

Let us know your reaction to Apple’s statement in the comments down below.