Apple has apologised for allowing contractors to listen to voice recordings of Siri users in order to grade those recordings.
The company made the announcement after it completed a review of the grading programme, which had been triggered by a Guardian report revealing its existence.
According to multiple former graders, accidental activations were regularly sent for review, having recorded confidential information, illegal acts, and even Siri users having sex.
“As a result of our review, we realise we have not been fully living up to our high ideals, and for that we apologise,” Apple said in an unsigned statement posted to its website. “As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users.”
The company committed to three changes to the way Siri is run after it resumes the grading programme:
It will no longer keep audio recordings of Siri users by default, though it will retain automatically generated transcripts of the requests.
Users will be able to opt in to sharing their recordings with Apple. “We hope that many people will choose to help Siri get better,” the company said.
Only Apple employees will be allowed to listen to those audio samples. The company had previously outsourced the work to contracting firms. Over the past two weeks, it has ended those contracts, resulting in hundreds of job losses around the world.
In the past six months, every major producer of voice-assistant technology has been revealed to be operating human-oversight programmes, having run them in secret for years. Many have pledged to change their systems. Amazon, the first to be identified, has built a setting that allows users to opt out of the review process.
Google paused its programme in Europe after a leak of recordings and pledged to review its safeguards. However, the company has not changed its general human oversight practices, instead committing to “improve how we explain our settings and privacy practices”.
Microsoft has made no changes to its programme, which involved listening not only to recordings of its Cortana voice assistant but also to Skype conversations conducted with a translation feature turned on. It did, however, update its privacy policy when the practice was discovered.
Source: CNET