Sam Altman warns there's no legal confidentiality when you use ChatGPT as a therapist.

ChatGPT users may want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy in these more sensitive conversations, because there's no doctor-patient confidentiality when your doctor is an AI.

The exec made these comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.

In response to a question about how AI works with today's legal system, Altman said one of the problems of not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations. "People talk about the most private things in their lives" to ChatGPT, Altman said. "Young people, in particular, use it as a life coach, a therapist. They have relationship problems and [are] asking, 'What should I do?' Right now, if you talk to a lawyer, a therapist, or a doctor about those problems, there's confidentiality. There's doctor-patient confidentiality, there's legal confidentiality, etc. We haven't yet figured out what happens when you talk to ChatGPT."

This could create a privacy concern for users in the event of a lawsuit, Altman said, because OpenAI would be legally required to produce those conversations today.

"I think that's very screwed up," Altman said. "I think we should have the same concept of privacy for your conversations with AI that we have with a therapist."

Altman acknowledged that privacy concerns like these could hold back broader adoption of AI. AI companies are also being asked to produce data from users' chats in some legal contexts, and OpenAI is already fighting a court order requiring it to retain users' conversations.

In a statement on its website, OpenAI said it's appealing this order, which it called "an overreach." If a court can override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Tech companies are routinely subpoenaed for user data in criminal prosecutions. In recent years, however, concerns about digital data have grown as laws have begun to limit access to previously established freedoms, such as a woman's right to choose.

For example, when the Supreme Court overturned Roe v. Wade, many customers began migrating to more private period-tracking apps, or to Apple Health, which encrypts its records. Altman asked the podcast host about his own ChatGPT use, since Von said he didn't talk to the AI chatbot much because of his privacy concerns. "I think it makes sense to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity," Altman said.

Sarah is a reporter at TechCrunch, where she has worked since August 2011. Before joining the company, she spent more than three years at ReadWriteWeb. Prior to becoming a reporter, Sarah worked as an I.T. professional across a variety of industries, including retail, banking, and software.
