“Siri” — Apple’s automated assistant — apparently has a kinky side. It turns out, according to The Guardian, the AI routinely records people doing all sorts of naughty things: having sex, making drug deals, talking with their doctors. You can thank a hair-trigger recording feature that can be accidentally activated by the sound of a zipper and, of course, whenever it mistakenly hears its wake word, “Hey Siri”. (Anybody been involved in a conversation about Syria recently, or asked for a nice sear on their steak?) The Apple Watch can even be activated when it detects it has been raised and then hears speech.
Siri, which is built into the iPhone, Apple Watch and HomePod smart speaker, often begins operating by mistake, according to a quality-control contractor. Apple hires outside workers around the world to review Siri recordings to improve usability, and they’ve been getting an earful. “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the contractor revealed.
Apple says less than 1 percent of daily Siri recordings get randomly reviewed to improve usability, that the recordings are usually only a few seconds long, and that it takes its customers’ privacy seriously. “User requests are not associated with the user’s Apple ID,” the company said. “Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” We probably wouldn’t be having this conversation if that last part were true, though.
Either way, I have resigned myself to knowing everything is being recorded all the time, and I hope they at least enjoy the show.