What just happened? Over the years, many organizations have warned ChatGPT users not to share personal information with OpenAI's chatbot. A recent incident involving a now-removed sharing feature revealed that thousands of people may have exposed deeply intimate information, which was then inadvertently made available through Google searches.
OpenAI recently confirmed that it had deactivated a feature that published chat histories on the open web. The functionality was only enabled with explicit user permission, but its description may have been too vague, and users expressed shock when their personal conversations appeared in Google search results.
Users can share links to ChatGPT logs with friends, family, and associates, assuming that only the intended recipients will see them. OpenAI also tested a second option to make shared chats discoverable; the fine print noted that they could appear in search engine results.
For many, the company's wording was too vague. Fast Company found that entering the share-link URL pattern into Google's search bar surfaced almost 4,500 conversations, many containing information that no one would knowingly have published on the web.
While the search results did not reveal users' full identities, many logged conversations included names, locations, and other identifying details. The logs show that many people use ChatGPT to discuss sensitive topics such as anxiety, addiction, and abuse.
We have just removed a feature that allowed users to make conversations discoverable by search engine, such as Google. This was a short experiment to help people find useful conversations. This feature required users to opt-in, first by picking a chat… pic.twitter.com/mGI3lF05Ua
– DANKs (@cryps1s) July 31, 2025
OpenAI quickly removed the discoverability feature, describing it as an experiment to help surface "useful" conversations, and says it is working to deindex all of the shared information. ChatGPT users should also be aware that a US court order requires OpenAI to retain all chat logs indefinitely. The company would normally delete them periodically, but publishers such as The New York Times are currently suing OpenAI and seek to examine whether ChatGPT can reproduce copyrighted material when asked. Legal teams can view information entered by users until the case is resolved.
Previous incidents have shown this can include trade secrets. In 2023, Samsung employees were caught accidentally handing OpenAI confidential company information through ChatGPT. Asking the chatbot to optimize code or draft meeting minutes, for example, requires entering data that could contain trade secrets.
Amid the controversy, Proton launched a rival chatbot. Lumo is part of a company-wide effort to position Proton's product line as a privacy-focused alternative to Google and Microsoft. The Swiss company promises to encrypt user communications, never store personal information, maintain an ad-free business model, and release its source code as open source.
