
I don’t often use Siri on my iPhone, but I’ve got to admit that when I do it’s really handy.

I’ll be driving the car and thinking “Arrrghh! I forgot to put out the recycling last night. I’d better say sorry to my wife as soon as possible, as she’ll be mad at me.”

I could stop the car on the hard shoulder (which would be dangerous), I could risk waiting until I get to my destination to tell my wife (which would be dangerous), or I could press a button on my iPhone and tell Siri to send her a text message (probably the least dangerous option).

“Hey Siri. Send a message to Caroline. Oops. I forgot to put out the recycling. Don’t dump me.”


Siri whirs away, and before I know it it’s sent my missus a text message. Job done.

I already know that every time I speak to Siri, everything I say gets sent up to an Apple server for “analysis” and is stored for two years. That has been known for a couple of years and, much as I don’t like it, I’ve come to accept it for the convenience that Siri offers me on the occasions when I use it.

I’m not even surprised to hear that the commands we give the personal assistant in our iPhones and iPads are shared by Apple with third parties who specialise in voice recognition.

But what doesn’t appear to have been widely known until now is that complete strangers could also be listening to our Siri messages.

Here is what user Fallenmyst posted on Reddit recently:

I started a new job today with Walk N’Talk Technologies. I get to listen to sound bites and rate how the text matches up with what is said in an audio clip and give feedback on what should be improved.

At first, I thought these sound bites were completely random. Then I began to notice a pattern. Soon, I realized that I was hearing people’s commands given to their mobile devices. Guys, I’m telling you, if you’ve said it to your phone, it’s been recorded…and there’s a damn good chance a 3rd party is going to hear it.

Fallenmyst went on to describe how they even get to hear our steamy sexting messages:

I heard everything from kiddos asking innocent things like “Siri, do you like me?” to some guy asking Galaxy to lick his butt. I wish I was kidding.

Just a heads up Reddit. I’ve heard more text-to-speech sexting than I care to. (You’ve never heard something sexy until you’ve heard a guy with a slight Indian accent slowly enunciate “I want to have sex with you” to his texting app)

Wow. And you thought you had a bad job. Just imagine having to listen to that kind of stuff all day long.

And don’t think that this is just an Apple problem. If you’ve ever found yourself saying “OK Google”, you may find that the search engine giant has also been keeping an archive of your commands. You can check out past recordings in your Google account’s voice activity history.

The likes of Apple and Google are smart enough to have their bases covered, and their terms and conditions will ensure that they’re okay to share your voice messages with third parties.

By using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services.

But, unfortunately, nobody ever properly reads the terms and conditions. And even if they did, I would guess that they would imagine that any analysis of the data would be done by computers rather than humans, sniggering at our sext messages and reminders to put the bins out.

There is no suggestion that the messages being listened to have your mobile phone number or Apple ID associated with them, but you should clearly still be very careful not to share with Siri any information which you would prefer to remain confidential.

This isn’t the first time that Siri has raised security and privacy concerns, and I doubt it will be the last.

As more and more devices integrate voice control, we are likely to see rising concern over how the data is communicated, stored, shared and analysed.

Just last month we saw another privacy storm kick off regarding gadgets which act upon our voice commands, after it became widely known that Samsung Smart TVs were sharing customers’ spoken commands with third parties in order to provide voice control functionality.