Lawyers representing Anthropic recently got busted for using a false attribution generated by Claude in expert testimony.
But that's just one of more than 20 court cases containing AI hallucinations in the past month alone, according to a new database created by French lawyer and data scientist Damien Charlotin. And those were just the ones that were caught. In 2024, the first full year of tracking cases, Charlotin found 36 instances. That jumped to 48 in 2025, and the year is only halfway over. The database, which was created in early May, has 120 entries so far, going back to June 2023.
SEE ALSO: More concise chatbot responses tied to increase in hallucinations, study finds

A database of AI hallucinations in court cases shows the increasing prevalence of lawyers using AI to automate the grunt work of building a case. The second oldest entry in the database is Mata v. Avianca, which made headlines in May 2023 when the law firm Levidow, Levidow & Oberman got caught citing fake cases generated by ChatGPT.
The database tracks instances where an AI chatbot hallucinated text, "typically fake citations, but also other types of arguments," according to the site. That means fake references to previous cases, usually as a way of establishing legal precedent. It doesn't account for the use of generative AI in other aspects of legal documents. "The universe of cases with hallucinated content is therefore necessarily wider (and I think much wider)," said Charlotin in an email to Mashable, emphasis original.
"In general, I think it's simply that the legal field is a perfect breeding ground for AI-generated hallucinations: this is a field based on load of text and arguments, where generative AI stands to take a strong position; citations follow patterns, and LLMs love that," said Charlotin.
The widespread availability of generative AI has made it drastically easier to produce text, automating research and writing that could take hours or even days. But in a way, Charlotin said, erroneous or misinterpreted citations as the basis of a legal argument are nothing new. "Copying and pasting citations from past cases, up until the time a citation bears little relation to the original case, has long been a staple of the profession," he said.
The difference, Charlotin noted, is that those copied and pasted citations at least referred to real court decisions. The hallucinations introduced by generative AI refer to court cases that never existed.
Judges and opposing lawyers are always supposed to check citations as part of their respective responsibilities in a case, but that now includes looking for AI hallucinations. The increase in hallucinations discovered in cases could be due to the growing availability of LLMs, but also to "increased awareness of the issue on the part of everyone involved," said Charlotin.
Ultimately, leaning on ChatGPT, Claude, or other chatbots to cite past legal precedents is proving consequential. The penalties for those caught filing documents with AI hallucinations include financial sanctions, formal warnings, and even dismissal of cases.
That said, Charlotin noted that the penalties have been "mild" so far and that the courts have put "the onus on the parties to behave," since the responsibility for checking citations remains the same. "I feel like there is a bit of embarrassment from anyone involved."
Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.