One of my weirder hobbies is trying to convince people that the idea that companies are listening to you through your phone’s microphone and serving you targeted ads is a …
Not my title! I do think we are being listened to. And location tracked. And that it's being passed on to advertisers. Is it Apple, though? Probably not, is my takeaway from this article, but I don't trust plenty of others, and Apple still does
A phone reacting to "ok Google" or the equivalent for the other assistants already requires it to listen to what you're saying - and that doesn't seem to affect battery life all that much.
That was once true, but I am now very skeptical of that claim given on-device processing that can log keywords and send them without using much data or power.
Google has an AdSense profile on you, the user. That profile is augmented by metadata from apps and sensors on the phone, then offloaded to Google cloud servers when the phone is charging. No input from the user required.
Here's a GrapheneOS security feature that prevents persistence and breaks the above workflow.
That Google blog also says the same thing except it's written by a human. I'm not disputing that AI can process audio data into ad statistics; I'm disputing that audio data is constantly recorded and sent.
I interpreted Farts as saying that the device listens for key ad words just like it listens for "Hey Siri", and I asked how it decides which words to listen for. Each ad campaign has its own keywords, so if you want to personalize, you'd have to listen for every word from every campaign, which would be equivalent to listening to everything and would severely degrade performance.
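To make that combinatorial point concrete, here's a toy sketch (not how any real assistant works; the campaign names and keyword lists are made up): wake-word detection matches against a tiny fixed set, but "spotting" keywords for every ad campaign means matching against the union of all their keyword lists, which grows toward the whole vocabulary.

```python
# Toy illustration of the scaling argument. A wake-word detector only
# needs a tiny, fixed always-on set; per-campaign ad keywords union
# toward the full vocabulary, approaching full transcription.

WAKE_WORDS = {"ok google", "hey siri"}  # tiny always-on set

def spotted(transcript_tokens, keyword_set):
    """Return which single-word keywords appear in a token stream."""
    return {t for t in transcript_tokens if t in keyword_set}

# Hypothetical campaigns, each with its own keyword list.
campaigns = {
    "airline": {"flight", "vacation", "hotel"},
    "auto":    {"car", "lease", "tires"},
    "pharma":  {"allergy", "headache"},
}

# Personalizing across every campaign means listening for the union:
all_keywords = set().union(*campaigns.values())

tokens = "i need new tires before our vacation flight".split()
print(sorted(spotted(tokens, all_keywords)))
# -> ['flight', 'tires', 'vacation']
```

The wake-word set stays at two entries no matter what; `all_keywords` grows with every campaign added, which is why per-campaign spotting converges on just recording everything.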