Prosecutor Used Flawed A.I. to Keep a Man in Jail, His Lawyers Say
Source: NYT
When Kyle Kjoller, a 57-year-old welder, was ordered held without bail in Nevada County, Calif., in April, he protested. The charges against him, multiple counts of illegal gun possession, were not grave enough under California law to warrant keeping him in jail for months awaiting his trial, he argued.
Prosecutors disagreed, and offered 11 pages' worth of reasons. But the brief they filed, Mr. Kjoller's lawyers contend, was rife with errors that bear the hallmarks of generative artificial intelligence.
The lawyers soon turned up briefs in four separate cases, including Mr. Kjoller's, that were filled with mistakes, all of them from the office of the same prosecutor, District Attorney Jesse Wilson. The mistakes included wholesale misinterpretations of the law, as well as quotations that do not actually appear in the cited texts.
-snip-
The Kjoller case, though, is one of the first in which prosecutors, whose words carry great sway with judges and juries, have been accused of using A.I. without proper safeguards.
-snip-
Read more: https://www.nytimes.com/2025/11/25/us/prosecutor-artificial-intelligence-errors-lawyers-california.html
struggle4progress
(125,206 posts)
By Sharon Bernstein
November 7, 2025 5:00 AM
Northern California prosecutors used artificial intelligence to write a criminal court filing that contained references to nonexistent legal cases and precedents, Nevada County District Attorney Jesse Wilson said in a statement. The motion included false information known in artificial intelligence circles as hallucinations, meaning that it was invented by the AI software asked to write the material, Wilson said. It was filed in connection with the case of Kalen Turner, who was accused of five felony and two misdemeanor drug counts, he said ...
The situation is the latest example of the potential pitfalls connected with the growing use of AI. In fields such as law, errors in AI-generated briefs could affect the freedom of a person accused of a crime. In health care, AI analysis of medical necessity has resulted in the denial of some types of care. In April, a 16-year-old Rancho Santa Margarita boy killed himself after discussing suicidal thoughts with an AI chatbot, prompting a new California law aimed at protecting vulnerable users ... All three of the Nevada County cases are the subject of a petition filed late last month with the California Supreme Court by a defendant in one of them.
https://www.sacbee.com/news/local/article312815223.html#storylink=cpy
Eugene
(66,620 posts)
but an AI hallucination-tainted criminal case is new to me. The stakes are obviously much higher.