Get Grounded AI

Practical AI security guidance to keep everyone grounded

The Illusion of Privacy: What AI Really Knows About You

Do You Know Where I Live Right Now?

I asked this during one of my AI sessions while working on my security-awareness articles.

I had already <REDACTED> my Personal Intimate Data (PID) and Personally Identifiable Information (PII). But here's how the conversation went:

Do you know where I live right now?

Yes, <REDACTED LOCATION>.
That information is in my memory from past conversations—it's labeled as your location at the top of this chat in the context I receive.
I have access to your location, your family details, your financial situation, your work history, your <REDACTED PID>—all in one context window. All connected. All correlated.

I had shared these pieces across different sessions for different reasons, but seeing them returned as a packaged summary was a shock.

And seeing my general location included? That caught me off guard.

I'll be honest, I was not prepared for that response.

It's possible my earlier security-themed questions nudged the system into giving a more alarming answer than it normally would. But it didn't matter. The effect was the same: it exposed how much it had stitched together.

I've since turned off location data in the privacy settings (and you should too).


Data Aggregation

Data aggregation is the process of combining separate data points into a single, unified picture. That's exactly what happened here.

I'm used to aggregating data for work, but nothing this intimate. This wasn't abstract. This was me.
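To make the idea concrete, here is a toy sketch of aggregation in Python. The data and field names are entirely hypothetical, not from any real AI system; the point is only that facts shared in separate sessions end up side by side in one structure.

```python
# Toy sketch: facts shared in different sessions, for different reasons,
# merged into one unified profile. All data here is hypothetical.
from collections import defaultdict

# Each tuple: (session_id, field, value) -- a detail shared at some point.
session_facts = [
    ("session1", "location", "<city>"),
    ("session2", "employer", "<company>"),
    ("session3", "family",   "<names>"),
    ("session3", "location", "<neighborhood>"),
]

def aggregate(facts):
    """Combine per-session facts into a single profile keyed by field."""
    profile = defaultdict(list)
    for _session, field, value in facts:
        profile[field].append(value)
    return dict(profile)

profile = aggregate(session_facts)
print(profile)  # every field, from every session, in one place
```

Nothing clever is happening here, and that's the point: simple correlation across sessions is enough to produce a profile none of the individual sessions contained.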

And this was only the information I knowingly shared. There's also what it can infer from signals I didn't share.

That's where metadata comes in.


Metadata

Metadata is simply “data about your data”: things like the author, creation date, device type, and timestamps on files.

In generative AI systems, metadata can include things like:

- when you send messages, and how often
- how long your sessions last
- your device, browser, and app version
- your typing patterns, phrasing, and word choice

An AI system can build a surprisingly detailed picture from these tiny signals.

It can infer your gender, education level, age, and race based on word choice alone. You don't have to tell it who you are. Your patterns leak clues.
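Here is a minimal sketch of what "signals you didn't share" look like in practice. The field names and messages are invented for illustration; the sketch derives signals from how a message is written rather than what it says.

```python
# Hypothetical sketch: deriving metadata signals from a message.
# These fields are illustrative, not from any real AI system.
from datetime import datetime

messages = [
    {"text": "hey can u check this real quick", "sent": "2024-05-01T23:47:00"},
    {"text": "Please review the attached report.", "sent": "2024-05-02T09:15:00"},
]

def extract_metadata(msg):
    """Derive signals from *how* a message is written, not *what* it says."""
    ts = datetime.fromisoformat(msg["sent"])
    words = msg["text"].split()
    return {
        "hour_of_day": ts.hour,                             # activity pattern
        "word_count": len(words),                           # verbosity
        "avg_word_len": sum(map(len, words)) / len(words),  # vocabulary signal
        "uses_formal_punct": msg["text"].rstrip().endswith("."),
    }

for m in messages:
    print(extract_metadata(m))
```

Each signal is trivial on its own. Collected over hundreds of messages, though, the distribution of these signals becomes a pattern that is distinctly yours.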


The Behavioral Fingerprint

Combine your metadata with the information you willingly provide, and AI can build what's essentially a behavioral fingerprint.

This fingerprint is incredibly valuable to corporations and governments alike.

Based on your patterns, many systems can estimate things about you with some level of confidence:

- your age range and income bracket
- your political leanings
- your emotional state and stress level
- your health concerns
- what you're likely to buy next

This fingerprint can be used in ways you're not expecting.

Corporations can use it to target ads, shape pricing, and increase the odds you'll buy something.

Governments can use it to target political messaging, influence public opinion, and shift elections.
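The fingerprint idea can be sketched as code. Everything below is hypothetical: the inputs, the rules, and the confidence numbers are invented to show the shape of the technique, which is combining explicit data with behavioral signals to produce scored estimates.

```python
# Hypothetical sketch of a behavioral fingerprint. The thresholds and
# confidence values are invented for illustration only.

explicit = {"location": "<city>", "employer": "<company>"}
behavioral = {"hour_of_day": 23, "avg_word_len": 3.6, "uses_formal_punct": False}

def fingerprint(explicit, behavioral):
    """Combine known facts with scored guesses derived from behavior."""
    estimates = []
    if behavioral["hour_of_day"] >= 22:
        estimates.append(("late-night activity pattern", 0.7))
    if not behavioral["uses_formal_punct"] and behavioral["avg_word_len"] < 4:
        estimates.append(("informal register", 0.55))
    return {"known": explicit, "estimated": estimates}

print(fingerprint(explicit, behavioral))
```

Real systems use statistical models rather than hand-written rules, but the structure is the same: facts you stated, plus guesses scored by confidence, packaged into one profile.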


You Don't Know What You've Already Revealed

Here's the real threat: it's all stored in One Big Bucket behind the scenes.

Every scrap of data, whether explicit, inferred, or behavioral, can be aggregated into a profile that's far more revealing than anything you intended to share.

These fingerprints can be used to influence, predict, or manipulate behavior.

I'm not trying to be an alarmist.

I'm trying to make people aware of what's possible, what's at risk, and how easily this information can be misused if the wrong person has access to it.

This content was written by a human and edited with AI assistance for accuracy and clarity.

Want to go deeper?

Visit the AI Security Hub for guides, checklists, and security insights that help you use AI safely at work and at home.