The attempt was "misbegotten at the jump."

Asking AI

Despite its well-documented tendency to make up claims — and entire court cases — on the spot, New York-based law firm Cuddy Law used the OpenAI chatbot ChatGPT to help justify a $113,484.62 bill for a recently won trial.

The firm argued that it had asked the AI tool for feedback on how much to charge, a sum the losing side was expected to pay.

But as The Register reports, NYC federal district judge Paul Engelmayer saw right through the ill-advised plan.

"It suffices to say that the Cuddy Law Firm's invocation of ChatGPT as support for its aggressive fee bid is utterly and unusually unpersuasive," he wrote in his order.

Chatbot Lawyering

The incident isn't the first time law practitioners have been caught making use of ChatGPT. For instance, earlier this year, Colorado-based lawyer Zachariah Crabill was accused of drafting a legal document using the tool, an offense that promptly got him fired.

Steven Schwartz of Manhattan's Levidow, Levidow & Oberman law firm was also caught using ChatGPT in court, where it fabricated court cases he cited in defense of one of his clients — yet he got away with a slap on the wrist.

ChatGPT's ability to gauge how much a lawyer should charge is equally dubious, with Engelmayer calling Cuddy's attempt "misbegotten at the jump."

Lawyers for Cuddy Law, however, told The Register that the AI tool didn't have a direct effect on legal proceedings and that its rates "were consistent with the range of rates and typical reasons for such rates that a parent... might find if using ChatGPT while researching what rates to expect."

Engelmayer, however, is wary of setting a precedent.

"Barring a paradigm shift in the reliability of this tool, the Cuddy Law Firm is well advised to excise references to ChatGPT from future fee applications," he wrote in his order.

In the end, Cuddy was awarded only $53,050.13, less than half of what it originally billed.

More on ChatGPT and lawyering: Politician Admits He Used ChatGPT to Generate New Law
