Daily The Business

Can you trust AI to keep your secrets? Probably not, lawyers say.

August 31, 2025
in AI, Artificial Intelligence, Law, Lawsuits, OpenAI
Microsoft researchers found that jobs related to providing and communicating information are most likely to be affected by AI.

Oscar Wong/Getty

  • There's potential legal risk for people using AI chatbots for sensitive matters.
  • Conversations with chatbots lack legal protections like doctor-patient or attorney-client privilege.
  • People should use caution when conversing with chatbots, legal experts warned.

Artificial intelligence chatbots like OpenAI's ChatGPT are increasingly serving as confidants and stand-in therapists for many users.

But for the average user, sharing your deepest secrets with AI tools can potentially open you up to serious risks. Those conversations are not legally protected in the same way that they would be with, say, a doctor, lawyer, therapist, or even a spouse, attorneys warned.

Two lawyers with expertise in AI-related legal issues told Business Insider that people should exercise caution when conversing with AI chatbots, be familiar with their terms of service and data retention policies, and understand that sensitive chat records, if relevant, could be subpoenaed in a lawsuit or government investigation.

"People are just pouring their hearts out in these chats, and I think they need to be cautious," said Juan Perla, a partner at the global firm Curtis, Mallet-Prevost, Colt & Mosle LLP.

Perla, a leader in the firm's AI practice, said, "Right now, there really isn't anything that would protect them if a court really wanted to get to the chat for some reason related to a litigation."

OpenAI CEO Sam Altman raised this point during a podcast that aired last month, noting that users, especially young people, are frequently turning to ChatGPT as a therapist or life coach. "People," the billionaire said, "talk about the most personal shit in their lives to ChatGPT."

"Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's like legal privilege for it — there's doctor-patient confidentiality, there's legal confidentiality," Altman told podcaster Theo Von. "We haven't figured that out yet for when you talk to ChatGPT."

"So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up," Altman said.

This lack of legal confidentiality when using AI tools, Perla said, should make users think twice about how much they choose to share.

Chatbot messages tied to situations like a workplace dispute, divorce, or custody case could be subject to discovery in related litigation, he said. The same goes for messages related to potential criminal activity.

"If you're putting something into ChatGPT because it's something that you would normally only share with your medical doctor, with your therapist, or with a lawyer, that should already tell you, 'I should not be putting this information in here,'" Perla said.

Even if users attempt to shield their identities or speak hypothetically to the chatbots, that won't fully eliminate the risk.

The "wisest and safest" thing to do is not have those sensitive conversations with AI chatbots at all, Perla said.

"If you're talking about your personal intimate affairs with a chatbot that have nothing to do with the commission of a crime, that have nothing to do with a dispute or a litigation that could emerge, then the likelihood that these chats are going to be public or be turned over to a court or another party in discovery is pretty low," Perla said.

Knowing how AI platforms handle data

James Gatto, a partner at Sheppard Mullin who co-leads the firm's AI industry team of attorneys, told Business Insider that it's crucial for users to understand how different AI tools handle their data.

Some paid versions of certain AI platforms may offer more robust privacy features, such as the automatic deletion of user inputs, while the free, public versions typically do not, he said.

"If I was going to use a tool for anything sensitive, I'd want a tool that deleted the information," Gatto said. "And I would want to make sure the terms of service explicitly call that out."

If users care about confidentiality and want to protect themselves from future legal risk, they must do their own diligence, he said.

"The important takeaway is you need to understand the pros and cons of using these tools, you need to understand the legal and personal risk," Gatto said.

"There may be circumstances where you're taking some risk, but the worst-case scenario is not that bad," Gatto said. "There are other cases where a worst-case scenario is really bad and you wouldn't want to do it."

Perla added that the risk factor should be weighed "any time we're creating a record — text messages, chats, for sure."

"The question should be," Perla said, "am I comfortable with this information ever landing in the hands of somebody else that is not the person that I thought I was having this conversation with, or that is limited to the technology that I was engaged with?"

Read the original article on Business Insider
© 2021 Daily The Business