News

AI 'hallucinations' are causing lawyers professional embarrassment, sanctions from judges, and lost cases, yet attorneys keep reaching for the tools. Fabricated material from AI in court documents is infuriating judges, and experts predict the problem is only going to get worse. Lawyers have blamed AI tools, including ChatGPT, for errors such as quoting passages from other cases that do not exist.
One recent example involves Anthropic, which has responded to allegations that it used an AI-fabricated source in its legal battle against music publishers. The flawed citations appeared in an April 30, 2025 declaration [PDF] from an Anthropic data scientist; Claude hallucinated the citation with "an inaccurate title and inaccurate authors," Anthropic says in its filing.
On Thursday, Anthropic admitted that the faulty reference in the court paper was the work of its own AI assistant, Claude, which had been used to help draft a citation in an expert report for the company's copyright lawsuit. Beyond the fabricated title and authors, the chatbot also introduced wording errors into the citation; Anthropic characterized the episode as an "honest citation mistake."
The federal judge, Susan van Keulen, had ordered Anthropic to formally respond to the claims after the publishers raised them. The lawsuit is part of a broader conflict between copyright holders and tech companies over how AI models are trained on copyrighted material. Yet such errors aren't stopping startups from raising enormous funding rounds to automate legal work.