A lawyer for Anthropic admitted in court that the company accidentally used a fake citation generated by Claude in its ongoing legal battle with music publishers, including Universal Music Group.
This mistake was revealed in a legal document filed in a Northern California court.
According to the filing, the chatbot fabricated a citation with an inaccurate title and incorrect authors.
Anthropic's lawyers explained that they had checked the citation manually but missed the error, along with several other mistakes also introduced by Claude.
Anthropic apologized, calling it an honest mistake, and denied intentionally trying to mislead the court.
The issue came to light after lawyers for the music publishers accused one of Anthropic’s employees, Olivia Chen, of using AI-generated citations in her expert testimony.
The federal judge, Susan van Keulen, then ordered Anthropic to officially respond to these claims.
This lawsuit is part of a broader conflict between copyright holders and tech companies over how AI models are trained, often using material like music, books, and articles without permission.
Anthropic’s error is not the only recent example of AI misuse in legal settings. Just this week, another California judge criticized two law firms for using AI-generated legal research that turned out to be false.
Similarly, in January, a lawyer in Australia used ChatGPT to write court documents, but the tool provided incorrect references.
Despite these embarrassing incidents, AI tools for legal work are booming.
For example, a startup called Harvey, which builds AI to help lawyers, is reportedly trying to raise over $250 million at a $5 billion valuation.
In short, while AI is making its way into courtrooms and legal work, it is also proving unreliable at times, producing made-up information.
That is raising concerns about how far the legal system can trust these tools and what role they should play in the future of law.
What do you think about Claude making up a citation? Would you trust AI to handle citations? Tell us below in the comments.