AI Hallucinations May Be With Us for a While


If you’ve ever caught a friend inventing a creative excuse for being late, you’ve already encountered the human version of an “AI hallucination.” Lies! Pure lies! The difference? Your friend may eventually confess. Generative AI, on the other hand, will double down with the confidence of a Harvard grad student, complete with footnotes … fake footnotes!
A recent Axios article (June 4, 2025) reminds us that, despite all the hype, large language models (LLMs) are still prone to hallucinations: moments when AI tools confidently serve up false or fabricated information, citations, or even entire legal precedents that name real courts, judges, and lawyers … but are completely made up.
Let’s face it: the legal profession is built on facts, precedent, and trust, not on “alternative facts.” When AI tools hallucinate, the risks aren’t just embarrassing; they’re potentially career-altering.
“AI makers could do more to limit chatbots’ penchant for ‘hallucinating,’ or making stuff up, but they’re prioritizing speed and scale instead.” – Axios, June 4, 2025
Damien Charlotin tracks court decisions in which lawyers relied on AI-hallucinated content. His database logged more than 30 such instances in May 2025 alone.
What’s a Lawyer (or Anyone Using AI) to Do?
Trust, but verify: Treat every AI-generated citation like a dubious witness, and cross-examine it thoroughly before putting it on the record.
Set expectations with clients: Explain that while AI is a powerful research tool, it’s not infallible. Think of it as a very enthusiastic first-year associate who sometimes embellishes, to put it nicely.
The Bottom Line
AI is changing the practice of law, but it’s not a substitute for human judgment. As Axios puts it, “the industry continues to remind users that they can’t trust every fact a chatbot asserts.” Let’s embrace the future, but let’s not let AI write our closing arguments … at least, not without a thorough fact-check.