Imagine this scenario: you're worried you may have committed a crime, so you turn to a trusted advisor — OpenAI's blockbuster ChatGPT, say — to describe what you did and get its advice.

This isn't remotely far-fetched; lots of people are already getting legal assistance from AI, on everything from divorce proceedings to parking violations. And because people are amazingly stupid, it's almost certain that some have already asked the bot for advice on enormously consequential matters like murder or drug charges.

According to OpenAI CEO Sam Altman, anyone who's done so has made a massive error — because unlike a human lawyer, with whom you enjoy sweeping confidentiality protections, ChatGPT conversations can be used against you in court.

During a recent conversation with podcaster Theo Von, Altman admitted that there is no "legal confidentiality" when users talk to ChatGPT, and that OpenAI would be legally required to share those exchanges should they be subpoenaed.

"Right now, if you talk to a therapist or a lawyer or a doctor... there's legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality," the CEO said. "And we haven’t figured that out yet for when you talk to ChatGPT."

In response to that massive acknowledgement, Jessee Bundy of the Creative Counsel Law firm pointed out that lawyers like her had been warning "for over a year" that using ChatGPT for legal purposes could backfire spectacularly.

"If you’re pasting in contracts, asking legal questions, or asking [the chatbot] for strategy, you're not getting legal advice," the lawyer tweeted. "You’re generating discoverable evidence. No attorney-client privilege. No confidentiality. No ethical duty. No one to protect you."

"It might feel private, safe, and convenient," she continued. "But lawyers are bound to protect you. ChatGPT isn’t — and can be used against you."

When an AI defender came out of the woodwork to throw cold water on her PSA, Bundy clapped back.

"I think it is both, no?" needled AI CEO Malte Landwehr. "You get legal advice AND you create discoverable evidence. But one does not negate the other."

"For the love of God — no," the lawyer responded. "ChatGPT can’t give you legal advice."

"Legal advice comes from a licensed professional who understands your specific facts, goals, risks, and jurisdiction. And is accountable for it," she continued. "ChatGPT is a language model. It generates words that sound right based on patterns, but it doesn’t know your situation, and it’s not responsible if it’s wrong."

"That’s not advice," Bundy declared. "That’s playing legal Mad Libs."

Currently, OpenAI is duking it out in court with the New York Times, as the company attempts to bar the newspaper and its co-plaintiffs from dredging up users' chat logs — including deleted ones — as evidence.

Until a judge rules one way or another, those same chats will, per Altman, be discoverable in a court of law — so chat carefully.

More on AI legalese: LeBron James Not Happy With AI Videos Showing Him Pregnant
