
“Your Chat Isn’t Confidential”: ChatGPT CEO Warns of Legal Risks in AI Conversations


Dev Soni

Published 28 July 2025


In an era where AI is becoming our sounding board, therapist, legal advisor, and late-night confidant, a chilling reality has surfaced: your conversations with ChatGPT are not legally protected and could land in a courtroom.

OpenAI CEO Sam Altman has issued a public warning that AI chats, despite their intimate nature for many users, do not carry the confidentiality traditionally granted to professionals like doctors, therapists, or lawyers. The growing reliance on AI for emotional and strategic guidance now comes with rising legal and privacy risks that many users remain unaware of.

No Legal Privilege: AI Isn’t Your Therapist

During a recent interview on Theo Von's podcast This Past Weekend, Altman made it clear: if you tell ChatGPT your secrets, whether about a breakup, mental health, or questionable decisions, those words can be used in court.

“If you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that. I think that’s very screwed up,” said Altman.

Unlike human professionals bound by strict confidentiality laws, ChatGPT operates in a legal grey zone. OpenAI can disclose user data, and when compelled by court order, it must. This is especially concerning for the rising number of users, particularly younger ones, who treat the chatbot like a digital therapist.

Court Orders Are Already in Motion

Altman’s warning isn’t just theoretical. In a high-profile legal battle with The New York Times over copyright infringement, a U.S. federal court ordered OpenAI to retain all ChatGPT user conversations indefinitely, including those that users manually deleted.

U.S. Magistrate Judge Ona T. Wang and District Judge Sidney Stein instructed OpenAI to preserve and segregate output log data, meaning what was once ephemeral and deletable is now potentially permanent.

This retention policy applies to users across ChatGPT’s Free, Plus, Pro, and Team plans, though Enterprise and educational users are currently exempt. In essence, deleted doesn’t mean gone, especially when lawsuits are involved.

Not End-to-End Encrypted, Not Private

Unlike messaging platforms such as WhatsApp or Signal, which use end-to-end encryption, ChatGPT conversations are not end-to-end encrypted. The data may be encrypted in transit and at rest, but OpenAI itself holds the keys, meaning it has full access to everything you type. Under certain conditions, so can third parties like lawyers, judges, or government agencies.

OpenAI normally deletes chats that users delete within 30 days, but ongoing litigation, abuse prevention, or national security triggers can override that policy. So when users open up about their trauma, business ideas, or personal struggles, they may be unknowingly leaving a digital paper trail.

A Call for Legal Reform

Altman has since urged lawmakers to address this privacy gap with urgency.

“We should have the same concept of privacy for AI conversations that we do with a therapist,” he stated.

The reality, however, is that the law has yet to catch up to AI’s human-like role. While ChatGPT may sound like a therapist or lawyer, legally it’s more like talking to an email client: everything is recorded and retrievable unless protected by specific legal frameworks, which don’t yet exist for AI.

User Guide: Don’t Confuse Empathy with Privacy

Until legal safeguards are in place, users are advised to be extremely cautious when using AI for personal, professional, or sensitive matters. Here’s what you need to know:

  1. There is no legal confidentiality in ChatGPT conversations.
  2. Deleted messages may still be stored, especially under court-mandated preservation.
  3. OpenAI can access your conversations and must share them when legally compelled.
  4. AI is not bound by the same ethics or legal obligations as doctors, therapists, or attorneys.

Final Word: Think Before You Prompt

What feels like a private, one-on-one conversation with a friendly AI assistant may one day become part of a legal case file. As the legal system continues to wrestle with the implications of AI in everyday life, users must take responsibility for their digital privacy.

In the meantime, treat your chats with ChatGPT the way you would treat any email or text: recordable, discoverable, and possibly permanent.

If you're about to type something deeply personal or legally sensitive, stop and ask yourself:

“Would I say this in front of a judge?”

Because one day, you might.

