Google and Amazon have both been accused of mishandling recordings from their virtual assistant services. Now Apple's virtual assistant, Siri, has joined that list.
According to The Guardian, contractors working on Apple's virtual assistant Siri are listening to users' conversations and voice commands in order to improve the service.
These contractors have heard conversations involving private medical information, drug deals, and even recordings of couples having sex. The recordings appear to come from Apple devices, including the HomePod and the Apple Watch.
What's Apple saying about this?
In a statement to The Guardian, Apple said, “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri’s responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
According to Apple, all of this is done to evaluate Siri and perform quality checks. The recordings are reviewed to determine whether Siri's activation was appropriate for a particular situation or command, whether Siri should have answered the voice command or query at all, and, if so, whether its reply was actually helpful.
Apple told The Guardian that it stores only a small portion of the recordings (roughly 1%) and that the sole aim of doing so is to improve Siri.
For what it's worth, this is not the first time a virtual assistant service has been found keeping a portion of users' recordings. Amazon has admitted doing the same with Alexa, and Google has officially acknowledged that it stores around 0.2% of Google Assistant recordings to further improve the AI assistant.