AI Therapy Isn’t Protected by Confidentiality Laws

AI therapy tools may be accessible and judgment-free, but they don’t come with legal confidentiality.

AI therapy tools are fast becoming a lifeline for people in need, especially those shut out by cost, stigma, or long waitlists. They’re convenient, always available, and often free. There’s one huge catch: unlike human therapists, AI chatbots don’t come with legal confidentiality. That could turn your most private thoughts into discoverable evidence in a courtroom.

Should You Trust AI With Your Deepest Secrets?

When you talk to a licensed therapist, your conversation is protected by confidentiality laws. That means your therapist can’t legally disclose what you’ve said unless there’s a serious safety concern. When you talk to an AI, none of those legal protections apply.

Even OpenAI CEO Sam Altman has admitted the AI industry hasn’t figured out how to replicate the privacy standards of human professionals. On a recent podcast, he explained that there’s currently no legal framework shielding user data shared with tools like ChatGPT. In fact, OpenAI is actively fighting a court order demanding access to hundreds of millions of chat logs, an order Altman described as a serious overreach.

Until that fight is resolved or new laws are introduced, anything you say to an AI could, in theory, be pulled into legal proceedings. Worse still, it could be exposed in a data breach and used in ways you never intended.

Why People Still Turn to AI for Support

Despite the risks, AI remains a popular go-to for emotional support:

  • It’s always available: AI tools don’t take weekends off. You can talk to them in the middle of the night, during a crisis, or while waiting for a real therapist to become available.
  • It’s affordable: Most AI tools are free or low-cost, compared to therapy sessions that can run hundreds of dollars without insurance.
  • It feels safer…for now: Many people feel less judged talking to a bot. There’s no eye contact, no awkward silences, and no fear of being misunderstood or stigmatized.
  • It feels anonymous: People often believe that because they’re not face-to-face with someone, their data is inherently safer, which isn’t necessarily true.

AI is filling a very real gap, especially as we face a global shortage of mental health professionals. It makes support more accessible, particularly in underserved or rural areas. But that doesn’t make it a substitute for trained, licensed professionals, especially when it comes to serious or deeply personal matters.

What You Say Could Come Back to Haunt You

The problem isn’t just that AI lacks confidentiality. It’s that users often don’t realize just how exposed they are. Unlike therapists, AI chatbots have no legal or ethical obligation to keep your data private. They have privacy policies that most people never read and data storage rules that could change with a single software update.

There’s also little to no transparency about who has access to your chats or how long they’re kept. And because no legal privilege applies, your conversations can be used in court or shared with third parties if the company is compelled to hand them over.

Even if you’re using a paid version with better encryption, there’s still no true therapist-patient confidentiality. That legal protection only applies when you’re speaking to an actual human therapist bound by licensing rules and ethics codes.

How to Protect Yourself When Using AI for Therapy

If you’re using AI as a stand-in therapist, or even just for occasional venting, it’s important to take some precautions:

  • Don’t overshare: Avoid entering sensitive personal data like real names, addresses, medical history, or anything you wouldn’t want disclosed in court.
  • Know the risks: Remember that AI isn’t your private journal. It’s a digital conversation that can be stored, accessed, or subpoenaed.
  • Check the fine print: Read the privacy policy of any AI app you’re using. Look for red flags like broad data-sharing clauses or unclear storage timelines.
  • Push for better laws: The more people demand AI privacy protections that match human standards, the more pressure there will be on lawmakers to act.
  • Use a human therapist when needed: If you’re dealing with serious mental health concerns or need guaranteed privacy, seek out a licensed therapist.

AI therapy might feel like a modern miracle if you’ve struggled to access traditional care. However, it’s important to understand what you’re trading for that convenience. These tools aren’t bound by the same legal or ethical standards as real therapists, and that makes them fundamentally riskier.

Until we have clear legal safeguards in place, treat AI therapy as a helpful assistant, not a confidant. Use it to bridge the gap, but don’t bare your soul. Because in the digital world, what you say to a chatbot might not stay between the two of you.