Your ChatGPT Therapy Chats Aren’t Private, Warns OpenAI CEO Sam Altman
OpenAI's CEO, Sam Altman, has highlighted a significant concern regarding user privacy with AI chatbots like ChatGPT. He pointed out that the tech industry hasn't yet established how to maintain confidentiality for sensitive conversations. This issue becomes crucial as many people, including children, use these chatbots for therapy and emotional support.
Altman emphasised that there is no legal framework ensuring confidentiality for interactions with AI. Unlike conversations with therapists or lawyers, which are protected by legal privilege, discussions with ChatGPT lack such protection. "People talk about the most personal sh*t in their lives to ChatGPT," Altman noted. He stressed the urgent need to address this gap in privacy and confidentiality.

Privacy Concerns with AI Conversations
The absence of legal confidentiality means that discussions about mental health or personal advice with ChatGPT aren't private. In legal proceedings, these conversations could be disclosed in court. This contrasts sharply with end-to-end encrypted messaging apps like WhatsApp or Signal, where not even the service provider can access chats.
OpenAI can access all user interactions with ChatGPT. This data helps improve the AI model and monitor misuse but raises privacy concerns. Although OpenAI claims it deletes free-tier chats after 30 days, some data might be retained for legal reasons. This practice has come under scrutiny amid a lawsuit involving The New York Times.
Legal Implications and Data Usage
Altman warned that in the event of a lawsuit, OpenAI could be compelled to produce user conversations as evidence. "So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that," he stated. This potential for disclosure underscores the need for clear policies on AI data handling.
The ongoing lawsuit requires OpenAI to retain user data from millions of ChatGPT users, excluding enterprise clients. This situation highlights the tension between data usage for AI improvement and user privacy rights. As AI technology evolves, establishing robust legal frameworks will be essential to protect users' sensitive information.
The current lack of confidentiality in AI interactions poses risks for users seeking emotional support through chatbots like ChatGPT. Until comprehensive legal protections are established, users should remain cautious about sharing personal information with AI systems.