Your Phone Has Been Listening. And It Is Completely Legal.
Here is something that will make you uncomfortable.
A few months ago, a team at an office had a meeting. They talked about tools they needed for a project. They named brands. They discussed specs. They never typed anything. Never searched for anything. Never photographed a catalog.
Days later, one of them saw ads. Exact ads. For the exact tools they had discussed.
The right brands. The right specs. Better prices than expected.
That person thought the algorithm was helpful.
I thought something was deeply wrong.
How Did It Know?
That meeting was not private. Something had been in that room. Something had listened, parsed what was said, and delivered it to people who wanted to sell things.
The really disturbing part? The person who got those ads felt helped.
That is exactly the point.
A surveillance system that makes you feel watched has failed. A surveillance system that makes you feel served is one you will never question, because you never even think to look for it.
The Numbers Are Worse Than You Think
- Ninety-one percent of people click “I agree” on Terms of Service without reading a single word.
- Ninety-seven percent of people aged eighteen to thirty-four.
- Only nine percent of adults actually read privacy agreements before agreeing.
- Researchers once slipped a clause requiring users to surrender the right to name their firstborn child into a privacy policy. Ninety-eight percent of participants agreed to it without noticing.
- The average person encounters approximately 1,500 privacy policies per year. Reading all of them would take 76 working days.
Terms of Service are written at the reading level of academic journals. Understanding them requires roughly 14 years of education.
The obscurity is not an accident. It is the design.
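The scale of that reading burden is worth making concrete. A back-of-envelope check, using only the figures cited above and assuming a standard eight-hour working day, shows what "76 working days" implies per policy:

```python
# Back-of-envelope check on the reading-burden figures cited above.
# The policy count and working-day count come from this article;
# the 8-hour working day is an assumption.
POLICIES_PER_YEAR = 1500
WORKING_DAYS = 76
HOURS_PER_WORKING_DAY = 8

total_hours = WORKING_DAYS * HOURS_PER_WORKING_DAY          # 608 hours per year
minutes_per_policy = total_hours * 60 / POLICIES_PER_YEAR   # ~24 minutes each

print(f"Total reading time: {total_hours} hours per year")
print(f"Implied time per policy: {minutes_per_policy:.1f} minutes")
```

In other words, the figures assume roughly 24 minutes per policy, every policy, all year. Nobody does that, and the documents are written so that nobody will.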
Meta Ray-Ban Glasses: “Designed for Privacy”
Meta has sold seven million pairs of Ray-Ban smart glasses. Their marketing says “designed for privacy, controlled by you.”
In February 2026, a joint investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten found footage from those glasses was being sent to annotators in Nairobi, Kenya.
What were those annotators watching?
- People in bathrooms
- People undressing
- Intimate moments
- Bank card numbers visible in frames
- Conversations about personal matters, crimes, finances
One annotator described watching a man place his glasses on his bedside table and leave the room. His wife came in and changed her clothes. The annotator watched everything.
When reporters asked Meta about this, the company took two months to respond and referred them to the privacy policy.
A class action lawsuit was filed in March 2026.
Amazon Alexa: They Removed Your Ability to Say No
In March 2025, Amazon sent an email to users. The subject line was something like “update to your voice settings.”
What it actually said: the “Do Not Send Voice Recordings” setting was gone. Permanently. As of March 28, 2025, all voice interactions with Alexa flow to Amazon’s cloud by default.
You could not opt out.
Amazon decided it would use your voice to train its AI whether you liked it or not, and it removed the mechanism that let you say otherwise.
Remember when Amazon employees were caught listening to Alexa recordings in 2019? There was criticism. In 2023, the FTC fined Amazon $25 million for retaining children’s voice recordings without consent.
Did Amazon change direction?
No. It accelerated.
Google: Your Emails, Now Training AI By Default
Before October 2025, if you wanted Google Gemini to read your Gmail, you had to actively turn that on.
After October 2025, Gemini has access to everything in your Gmail, your Google Drive, your Google Meet calls. By default. Automatically.
Opting out requires digging into privacy settings. Most people never do it.
A lawsuit was filed in California, alleging the change violates the state’s Invasion of Privacy Act of 1967. Google calls it a default setting. The lawsuit calls it unauthorized access.
That gap, right there, is where the surveillance economy lives.
xAI Grok: Your Inputs Now Belong to Them
Elon Musk’s xAI has a simple arrangement. When you use Grok, whatever you type and whatever it outputs is classified as “Content” in the Terms of Service, and xAI now owns it.
xAI also signed a $300 million deal to put Grok inside Telegram, giving it access to the conversations of Telegram’s roughly one billion users.
The Irish Data Protection Commission found xAI had no lawful basis for scraping EU user data and forced them to suspend operations and delete what they had collected.
That protection does not exist in Canada or the United States.
Microsoft Copilot: It Reads Everything You Type
Microsoft Copilot is now embedded in Word, Excel, Outlook, PowerPoint, and Teams. It ingests and processes everything you type unless you manually disable it.
Most users have not disabled it.
In November 2025, LinkedIn started using US members’ data to train AI content models. The default was on: to opt out, you had to find the setting and turn it off.
No popup. No notification. Just a quiet change in settings that most people never see.
The Question Nobody Is Asking
Here is the question I keep coming back to:
When did you stop being the customer and become the product?
Did you sign paperwork for this? Did you agree to have your bathroom visits watched, your bank cards read, your private conversations transcribed and sent to data centers around the world?
Technically, yes. You clicked “I agree.”
But nobody reads those documents. They are not designed to be read. They are designed to create a legal fiction of your consent while giving you none of the actual information you need to make a real choice.
What You Can Actually Do
Individual action is not enough. The system is built to be sticky, interconnected, and almost impossible to fully escape. But there are steps that matter:
- Audit your apps. Go into your phone settings today. Look at which apps have microphone access. Fourteen apps on mine did. I removed nine of them. It took four minutes. None of those nine needed it.
- Opt out where you can. Google, Microsoft, Amazon all have settings pages where you can reduce data collection. They bury them on purpose. Look for them anyway.
- Think about your devices as open microphones. Your phone. Your smart speaker. Your car. Your smart TV. They are all collection nodes. Act accordingly.
- Consider what you say near these devices. This is not about paranoia. It is about awareness. You do not have to be paranoid to recognize that a device with a microphone in your living room might be sending what it hears somewhere else.
- Get off the grid for sensitive conversations. Leave your phone in another room. Have important calls on a landline. Yes, landlines still exist. No, they are not perfect. But they are not sending your voice to a data center in Iowa.
The Real Problem
You cannot simply “read the Terms of Service” and make an informed choice. There are 1,500 of them per year. They are written by lawyers to be unreadable. Even if you read them, the defaults are set to extract, not protect.
The asymmetry is by design. Extraction is effortless and invisible. Awareness takes deliberate effort.
The companies understand this better than anyone. And they have designed everything to make sure you never think about it until you see an ad for something you only talked about in a closed room.
The surveillance economy is not breaking the law. It is operating exactly within legal frameworks that were built to allow it.
Your only real protection is understanding what you agreed to, even if you never actually agreed to anything in any meaningful sense.
And maybe, once in a while, leaving your phone in the car.
Want to understand your actual exposure?
Privacell offers a free thirty-minute consultation. No sales pitch. Just a conversation about what you are actually dealing with and what actually helps.
Call Gianni: 250.444.8404