Attorneys involved in a product-liability lawsuit apologized to the judge for submitting documents citing non-existent legal cases.
A complaint [PDF] was filed in June 2023 against Walmart and Jetson Electric Bikes over a fire allegedly caused by a hoverboard. According to reports, the fire destroyed the plaintiffs’ home and seriously injured family members.
Last Thursday, Wyoming District Court Judge Kelly Rankin issued an order to show cause [PDF] directing the plaintiffs’ lawyers to explain why they shouldn’t be sanctioned for citing eight cases that don’t exist in a filing dated January 22, 2025. The citations were part of a legal argument the lawyers hoped would keep certain evidence from being presented to the jury. The argument was made in a motion in limine [PDF], a type of motion that asks the court to exclude certain evidence from trial, argued outside the presence of the jury.
The motion cites nine cases to support its arguments. Among them is Meyer v. City of Cheyenne, 2017 WL 3461055 (D. Wyo. 2017) [PDF]. According to a later filing [PDF], that case was hallucinated.
ChatGPT hallucinating the case of 2017 WL 3461055

Judge Rankin noted that eight of the nine citations included in the January motion either came from nowhere or led to cases with a different name. The judge’s order cites past instances of AI chatbots hallucinating during legal proceedings in recent years, including Mata v. Avianca, Inc. and United States v. Hayes, and asks the attorneys who signed the document to explain why they shouldn’t be punished.
Taly Goody and T. Michael Morgan, two of the attorneys involved in the case, filed a joint response [PDF] on Monday acknowledging the mistake. To prevent it from happening again, the filing says, Morgan & Morgan, the law firm where T. Michael Morgan is an attorney, has “added a click box to our AI platform that requires acknowledgement of the limitations of artificial intelligence and the obligations of the attorneys when using our artificial intelligence platform.”
Rudwin Ayala, the third attorney involved, took the blame for the faulty document in a Thursday response [PDF], clearing his co-counsel. The attorney explained:
“Part of my preparation of said motions in limine included use of an internal AI tool for purposes of providing additional case support for the arguments I set forth in the Motions,” Ayala wrote. “After uploading the draft of the Motion, I asked the AI tool to ‘add to this Motion in limine Federal Case Law from Wyoming setting out requirements for Motions in Limine’. I also asked it to ‘add more cases regarding Motions in Limine’.”
Another query was: “Add a paragraph to the motion in limine to prevent evidence or commentary about an improperly discarded tobacco causing the fire because there is no evidence for this. Include Wyoming federal court case law to support the exclusion of such evidence.”
“There were a few other inquiries made requesting the addition of case law to support exclusion of evidence, all similar in nature. This was the first time in my career that I ever used AI for queries of this nature.”
It may also be the last. After thoroughly explaining the SNAFU, the attorney threw himself on the mercy of the court.
“With a repentant heart, I sincerely apologize to this court, to my firm, and colleagues representing defendants for this mistake and any embarrassment I may have caused. The last week has been very humbling for me professionally, and personally, one that I can guarantee shall not ever repeat itself.” ®