It has been revealed that some Siri recordings are heard by contractors hired to “grade” them as part of Apple’s efforts to improve the voice assistant.
According to a whistleblower working for one of the contractors involved, these recordings include confidential information such as medical history, sexual encounters, and even drug deals. Apple has responded to the report, confirming that a small portion of Siri recordings is indeed used for improvements.
Amazon and Google have also recently admitted to giving third parties access to recordings from their voice assistant services.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower stated.
Accidental activations are most common on the Apple Watch and HomePod, the whistleblower claimed.
“The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on,” the whistleblower added.
Apple states that a small portion of Siri requests are analysed to improve Siri and dictation.