Wednesday, December 31, 2025

Avoid the “false claims” of AI hallucinations when litigating FCA matters - www.hoganlovells.com

Artificial Intelligence (AI) is touted as a revolutionary tool that has become increasingly popular in workplaces across a variety of sectors, including the legal field. A March 2025 survey by Law360 found that more than half of attorneys at law firms in the United States use AI for at least some purpose, and an American Bar Association survey found that 30.2 percent of attorneys' offices offered AI tools as of late 2024. Indeed, several law firms have bet on AI by investing significant amounts to develop proprietary AI tools for their legal workforces.

While commentators continue to predict that AI will fundamentally alter the legal profession, and in many cases greatly reduce or eliminate the need for lawyers, real-world examples continue to demonstrate that AI is far from perfect. AI outputs can be prone to “hallucinations”: assertions that sound confidently true but are actually false, fictitious, or misleading. In the legal sphere, such hallucinations may manifest as made-up caselaw, fake statutes, or incorrect legal assertions.

Documented instances of AI hallucinations in court submissions are on the rise. As of September 19, 2025, one researcher had identified 244 court opinions noting fabricated or incorrect material submitted as a result of AI use, and one study found 22 cases between June 30 and August 1, 2025, alone in which fictitious case citations appeared in court filings.

False Claims Act (FCA) litigation – cases in which either the government or an individual...
