AIFoPa-2024-0003 — Attorney Files Court Brief Citing Six Non-Existent Cases Generated by ChatGPT; Judge Requires Explanation
An attorney submitted a court brief citing six cases in support of legal arguments. Opposing counsel, upon attempting to locate the cases, could not find them. The cases did not exist. They had been generated by ChatGPT, which had produced case names, court designations, docket numbers, and judicial holdings with complete fluency and total inaccuracy.
The attorney, when required to explain the brief to the court, submitted a declaration stating that the citations "appear to be hallucinations from a generative AI platform." The Bureau notes that this is a precise and accurate description of what had occurred, and that submitting it to a federal court is a situation most attorneys would prefer to avoid.
This incident was neither the first nor the last of its kind. In 2023, the attorneys in Mata v. Avianca had submitted a brief with ChatGPT-generated citations and faced sanctions. By 2024, multiple bar associations had issued guidance, courts had implemented disclosure requirements, and the phrase "AI hallucination" had entered the legal profession's working vocabulary. None of this prevented the incident documented here.
The Bureau observes that guidance, requirements, and vocabulary are not, in themselves, verification. Verification is verification. The cases did not exist. No amount of subsequent documentation alters this.