TL;DR: When you paste client session notes into ChatGPT, those conversations can become permanent training data. Samsung banned the tool after engineers leaked sensitive information; most coaches don't realize they're doing the same thing. This post covers the 7 questions that separate safe AI tools from privacy nightmares, plus a free tool that gives you clarity about what's safe to use and what isn't.

Watch the Video

Prefer to read? Full post below.

The $3 Billion Warning from Samsung

In early 2023, Samsung engineers made a critical error that every coach needs to understand.

They pasted sensitive data into ChatGPT. Proprietary code. Meeting transcripts. Internal documents.

Here's what they didn't realize: ChatGPT keeps a record of everything you type, and by default those conversations can be used to train future versions. Once information enters the training pipeline, it becomes part of the AI's permanent learning.

Samsung's response was severe. They banned ChatGPT entirely.

A company that builds cutting-edge technology decided consumer AI tools were too dangerous to use. (TechCrunch)

If tech experts at Samsung can accidentally compromise data, what happens when coaches paste client session notes into these same tools?

Your Session Notes Aren't Private Anymore

Here's what happens when you paste that session transcript into ChatGPT.

The moment you hit enter, your client's revelations get logged. Not just processed. Stored.

ChatGPT's privacy policy states that conversations may be used to "improve our services." Translation: your client's deepest fears become training data.

Here's where it gets uncomfortable.

When your client shares something unique - maybe they're the only CMO at a sustainable fashion startup dealing with a specific challenge - those details become part of the AI's knowledge. Six months later, someone asks about "CMO challenges in sustainable fashion," and fragments of your client's story could surface.

Think anonymity protects them?

It doesn't.

When you paste "My client, the CFO of that Boston biotech that just went public," you've created a digital fingerprint. How many newly-public Boston biotech CFOs exist?

The real gut punch?

Once data enters the training pipeline, there's no getting it back. You can delete your account, but the patterns learned from your sessions have already been absorbed. You can't unlearn what an AI has learned.

The 7 Questions That Could Save Your Practice

Before you paste another word into any AI tool, ask these exact questions. Copy them. Use them.

"What happens to the data I input?"

Don't accept vague answers. You need specifics: Where is it stored? For how long? Who has access?

"Do you use my data to train your AI models?"

This is the big one. If they say yes or dodge the question, you're looking at a tool that treats client sessions as free training material.

"Who owns the data that's entered into the tool?"

Some tools claim ownership of everything you input. Others state you retain ownership. This distinction matters when client information is involved.

"Does the tool share data with third parties, and if so, for what purpose?"

Every "trusted partner" is another potential leak point. Get specifics about who sees your data and why.

"Can you delete your data from the tool easily if you stop using it?"

Watch what "delete" actually means. Does it disappear immediately? Or get marked for deletion "within 30 days"? What about backups and already-learned patterns?

"Where is the data stored, and what security measures are in place to protect it?"

Geographic location matters for compliance. Encryption standards matter for security. Vague answers here are red flags.

"Does the tool comply with privacy regulations like GDPR or CCPA?"

Compliance isn't just about following rules—it shows the company takes data protection seriously. Look for specific certifications, not just claims.

Finally, as you review the answers, watch for red flags, vague terms, and missing details.

Trust your instincts. If the privacy policy is confusing or full of loopholes, that's by design.

Here's the truth: changing names doesn't create anonymity.

If you're discussing "a Seattle coach who specializes in tech executives and just published a book," how anonymous is that really?

These questions aren't optional. They're your professional responsibility.

Want Help Evaluating Tools? There's a Shortcut

Reading privacy policies feels like decoding legal documents. Terms of Service blur together.

That's why I built the Coach's AI Safety Guide.

Give it any AI tool's website. It navigates to their official pages, reads their Terms of Service, Privacy Policy, and Security documentation. Then it delivers a plain-language report with:

  • One-sentence verdict: Safe for coaching, Depends on use case, or Not recommended

  • Three key things you need to know

  • Analysis of all seven questions above (with status indicators)

  • Specific risks for coaching practices

No technical jargon. No legal speak. Just clear answers about whether you can trust this tool with client information.

Try it with any tool you're considering. Takes 30 seconds instead of 30 minutes.

Professional tools built for businesses handle these questions differently. Their makers face real liability if data leaks.

There Is a Way To Do This Safely

Samsung learned the hard way that consumer AI tools aren't designed for sensitive data.

We can learn from their mistake without repeating it.

You can leverage AI's power and protect client privacy. You just need to know which tools to use.

Consumer AI products (ChatGPT, Claude's web interface, Gemini) are built for general users, not professionals with confidentiality obligations.

Professional tools exist.

They use AI's analytical power without treating sessions as training data. They're designed for businesses with real liability if data leaks.

The coaches who thrive in the AI era will understand this distinction. They'll ask the right questions. They'll protect their clients while leveraging technology to deepen their practice.

Your clients trust you with their breakthroughs.

Honor that trust in every tool you choose.

This is Part 1 of The Ethical Coach's Guide to AI series. Next: Security measures that actually matter for coaching practices.

What's your biggest concern about AI and client privacy? Let me know below.

If you're a coach committed to continuous improvement and curious about AI in your practice, I'd love to hear from you. Share your questions, challenges, or insights at [email protected]. And if you want to see how reflective practice can deepen your coaching work, visit journeyloop.ai to learn more about JourneyLoop.
