When you type a message to ChatGPT, Claude, or Gemini — where does that go? When Grammarly reads your emails, what does it see? When you use an AI image tool, who owns the output?
Most people using AI tools every day have genuinely no idea what's being collected, stored, or used. This lesson won't make you paranoid — it'll make you informed. Informed is different. You can make real decisions when you know the facts.
📌 The core principle
Your conversations with AI are not like conversations with a friend. They're more like conversations in a room with a transcript, a legal team, and a product roadmap. That doesn't mean you shouldn't use AI — it means you should know what you're sharing and with whom.
What the major AI tools actually collect
Each of the major tools collects more than you'd guess from the interface. This isn't scaremongering — it's the actual data from their privacy policies and terms of service.
Your personal privacy audit
Work through this checklist for your own life. Every item you tick is a concrete step that reduces your exposure. You don't have to do everything — but you should make the decision consciously.
Account settings

✓ Turn off the training opt-in on ChatGPT
  Settings → Data Controls → Improve the model for everyone → Off

✓ Review chat history settings on every AI tool you use
  Decide whether stored history poses a risk for you specifically

✓ Read the privacy policy of your most-used AI tool
  At least the "What we collect" and "How we use it" sections — takes 5 minutes

✓ Use separate accounts for personal and professional AI use
  Keeps your work data and personal data in separate risk buckets
What you share

✓ Never paste passwords, API keys, or credentials into an AI chat
  Even "to check" or "for context" — these go to servers you don't control

✓ Remove identifying details before pasting documents about other people
  Replace names, emails, and addresses with [NAME], [EMAIL], etc. before asking AI to process them

✓ Check your company policy before using AI for client work
  Many organisations have explicit policies — violating them can be a disciplinary issue

✓ Don't upload photos of other people to AI tools without their consent
  Especially for face-based tools — privacy laws in many regions require explicit consent

✓ Be cautious with medical or financial information in AI chats
  These are the highest-sensitivity categories. Anonymise where possible.
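If you're comfortable with a little Python, the "replace before pasting" step can be partly automated. The sketch below is illustrative only: the regex patterns catch common email, phone, and credential shapes, and the placeholder names ([EMAIL], [PHONE], [CREDENTIAL]) are this example's own convention, not a standard.

```python
import re

# Minimal redaction sketch. These patterns are illustrative, not exhaustive;
# they catch common email/phone formats and a few obvious credential shapes.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\+?\d[\d\s()-]{7,}\d"), "[PHONE]"),
    (re.compile(r"(?i)(api[_-]?key|token|password)\s*[:=]\s*\S+"), "[CREDENTIAL]"),
]

def redact(text: str) -> str:
    """Replace emails, phone numbers, and obvious credentials with placeholders."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(redact("Contact jane.doe@example.com, api_key=sk-abc123"))
# → Contact [EMAIL], [CREDENTIAL]
```

Always skim the redacted output yourself before pasting; regexes miss plenty, and personal names in particular cannot be reliably caught by pattern matching alone.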
Browser & apps

✓ Audit browser extensions that have "read all site data" permissions
  Grammarly, Jasper, and other AI writing extensions read everything you type. Know which ones you have.

✓ Remove AI extensions from your browser when doing sensitive work
  Or use a separate browser profile without extensions for confidential tasks

✓ Check what your phone keyboard's AI features are sending to servers
  The iOS predictive keyboard, Gboard, and others all have AI features with data implications

✓ Review microphone and camera permissions for AI apps
  Voice AI tools especially — know what's being recorded and when
Your data rights

✓ Know that you can request deletion of your data from AI companies
  GDPR (EU/UK), CCPA (California), and similar laws give you the right to request deletion

✓ Download your data archive from AI tools you use frequently
  ChatGPT, Claude, and others let you export all your conversation history — useful to see what they have

✓ Delete conversation history you wouldn't be comfortable with others seeing
  Old sensitive conversations don't need to live on AI servers indefinitely

✓ Understand that "free" AI tools are often funded by your data
  The business model matters. Know whether you're the customer or the product.

✓ Set a recurring reminder to review your AI tool privacy settings
  Privacy policies change. A quarterly check takes 10 minutes and keeps you current.
⚡ The practical rule
Don't share anything with an AI that you wouldn't be comfortable sharing with a stranger who works at a tech company. Not because AI companies are malicious — most aren't — but because data you share can be accessed, leaked, or repurposed in ways you didn't intend. That single test covers about 90% of the situations where people overshare.
Key takeaway
Privacy isn't about being paranoid. It's about knowing what you're trading and deciding whether it's worth it — on your terms, not theirs.