Your ChatGPT Therapy Chats Aren’t Private, Warns OpenAI CEO Sam Altman

OpenAI's CEO, Sam Altman, has highlighted a significant concern regarding user privacy with AI chatbots like ChatGPT. He pointed out that the tech industry hasn't yet established how to maintain confidentiality for sensitive conversations. This issue becomes crucial as many people, including children, use these chatbots for therapy and emotional support.

Altman emphasised that there is no legal framework ensuring confidentiality for interactions with AI. Unlike conversations with therapists or lawyers, which are protected by legal privilege, discussions with ChatGPT lack such protection. "People talk about the most personal sh*t in their lives to ChatGPT," Altman noted. He stressed the urgent need to address this gap in privacy and confidentiality.

Privacy Concerns with AI Conversations

The absence of legal confidentiality means that conversations with ChatGPT about mental health or personal matters aren't truly private: in litigation, they could be disclosed in court. This contrasts sharply with encrypted messaging apps like WhatsApp or Signal, where third parties can't access chats.

OpenAI can access all user interactions with ChatGPT. This data helps improve the AI model and monitor for misuse, but it also raises privacy concerns. Although OpenAI says it deletes free-tier chats after 30 days, some data may be retained for legal reasons. This practice has come under scrutiny amid a lawsuit involving The New York Times.

Legal Implications and Data Usage

Altman mentioned that if a lawsuit arises, OpenAI might be compelled to produce user conversations as evidence. "So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that," he stated. This potential for disclosure underscores the need for clear policies on AI data handling.

A court order in the ongoing lawsuit requires OpenAI to retain data from millions of ChatGPT users, excluding enterprise clients. This situation highlights the tension between using data to improve AI and users' privacy rights. As AI technology evolves, establishing robust legal frameworks will be essential to protect users' sensitive information.

The current lack of confidentiality in AI interactions poses risks for users seeking emotional support through chatbots like ChatGPT. Until comprehensive legal protections are established, users should remain cautious about sharing personal information with AI systems.
