Apple's Siri 'listens in on users' intimate moments', whistleblower claims




From Amazon’s Alexa to Apple’s Siri, smart assistants are now a regular feature in many people’s daily lives.


But a new letter by a former Apple contractor indicates that Siri may be doing more than helping you with your day-to-day tasks.


Thomas Le Bonniec claims that Siri listens in on users’ intimate moments, before the recordings are ‘graded’ by contractors.


Mr Le Bonniec claims that during his time working for Apple, he graded recordings of medical discussions, criminal activity, sex and official business talks.


Speaking anonymously to The Guardian last July, he said: “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad.


“It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”





Now, Mr Le Bonniec has revealed his identity, and penned an open letter to European data protection regulators about his concerns.


The letter said: “It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights and continues their massive collection of data.


“I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the EU has one of the strongest data protection laws in the world.


“Passing a law is not good enough: it needs to be enforced upon privacy offenders.”


During his time at Apple’s Cork offices, Mr Le Bonniec claims that he listened to thousands of recordings a day from users’ iPhones, Apple Watches and iPads.












According to Mr Le Bonniec, this included recordings of the device owners, as well as their friends, family and colleagues.


Apple has previously admitted that a small portion of Siri requests are analysed to improve the smart assistant’s dictation.


Speaking in July last year, an Apple spokesperson said: “We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading.


“We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”


Apple has confirmed that Mr Le Bonniec worked as a subcontractor for two months up until July 2019.


However, it said its Siri policies have since been updated, including a new ‘opt-in’ system under which users must actively choose to share their audio samples.


It also highlighted that only Apple employees review audio clips, and not contractors.








