Opinion

Attorney Faces Sanctions for Citing Fake Cases from ChatGPT

An old attorney adage goes: “If you have the facts on your side, pound the facts; if you have the law on your side, pound the law; if you have neither the facts nor the law, pound the table.” But one New York personal injury attorney is charting a fourth course: when nothing seems to be going right, enlist ChatGPT.

You may already be familiar with ChatGPT, the artificial intelligence tool that mimics human conversation when generating answers to questions. Steven Schwartz has now discovered that it can also generate imaginary cases. Serving as the attorney for the plaintiff, he filed a federal court brief that cited six fake cases, and he now faces the possibility of sanctions.

ChatGPT Generates Six Fake Cases

Schwartz’s problems began when the defendant moved to dismiss his client’s case as untimely. In response, Schwartz cited the “case” of Varghese v. China Southern Airlines Co., Ltd. According to Schwartz’s response, this case was issued by the United States Court of Appeals for the Eleventh Circuit in 2019. But Schwartz didn’t just reference the case—he quoted language from the opinion. And adding fuel to the fire, the attorney cited several other cases that the court in Varghese had used to reach its opinion.

The only problem: none of this was real. The Eleventh Circuit never issued Varghese. The quotations were completely fabricated. And the cases cited within Varghese? Yes, those were fake too. The judge succinctly described the situation as “bogus judicial decisions with bogus quotes and bogus internal citations.”

The issue first came to light when the defendant filed a reply stating that it could not locate these fake cases using a legal research tool. At this point, the court ordered Schwartz and co-counsel to file an affidavit attaching the cases. They did so, attaching what appeared to be full-length case opinions.

After the defendant still was unable to locate these cases, the court issued an order requiring Schwartz, his co-counsel, and his firm to show cause, or explain, why they should not be sanctioned. Schwartz explained that he had used ChatGPT for the first time in a professional setting to assist with legal research and find relevant cases. After being asked a series of questions, ChatGPT generated a lengthy analysis of whether Schwartz’s client’s case was untimely, along with a complete set of citations in support. And when the court required the full text of the opinions, Schwartz asked ChatGPT to provide the complete cases, which it did by fabricating them. Importantly, at no point did Schwartz independently verify that the cases were real.

Under the various sanction rules available to the judge, the offending parties may be required to pay a penalty into court or to cover the defendant’s attorney’s fees and costs. They could even lose their right to practice before the United States District Court for the Southern District of New York entirely. A hearing is scheduled for June 8 to address this unprecedented situation.

While Schwartz may be the first to face the possibility of sanctions for relying on AI-generated cases, his is not the only legal trouble ChatGPT has produced.

While Schwartz’s matter was pending, Mark Walters, a radio broadcaster, filed a lawsuit against OpenAI, L.L.C., the company that created ChatGPT. Walters alleges that a journalist interacted with ChatGPT and asked for a summary of a lawsuit pending in federal court in the Western District of Washington. During that interaction, ChatGPT identified Walters as one of the defendants, stating that he was accused of defrauding and embezzling funds from the Second Amendment Foundation.

Again, the same problem: this allegation, along with a host of other scandalous claims, was completely fabricated, according to Walters. He was never accused of defrauding or embezzling funds from the foundation. In fact, he was never even named as a party to the lawsuit.

Whether Walters’ case proves to be another example of ChatGPT fabrication remains to be determined. But both his story and Schwartz’s should serve as cautionary tales to all who use ChatGPT to do their homework—whether in college or in court.



Editorial Staff

The American Legal Journal provides the latest legal news from across the country to our readership of attorneys and other legal professionals. Our mission is to keep our legal professionals up to date and well informed so they can operate at their highest levels.

