On Monday, July 28, 2025, Mississippi Today reported that a federal judge in Mississippi issued a temporary restraining order containing multiple factual inaccuracies, prompting speculation about the use of artificial intelligence in its drafting.
U.S. District Judge Henry T. Wingate, based in the Southern District of Mississippi, released the error-laden order on July 20, temporarily halting a state law banning diversity, equity, and inclusion programs in public schools and universities.
The original order included incorrect details, such as naming plaintiffs like the Mississippi Library Association and Delta Sigma Theta Sorority Inc., which were not parties to the lawsuit and had no cases pending in the Southern District. It also presented phrases as direct quotes from the initial lawsuit and the state's DEI legislation that appeared in neither the complaint nor the legislation. It further cited a 1974 case, Cousins v. School Board of City of Norfolk, from the U.S. 4th Circuit Court of Appeals, that could not be located, suggesting an incorrect citation or a nonexistent case.
Attorneys from the Mississippi Attorney General’s Office requested clarification of the order, a motion unopposed by the plaintiffs’ lawyers. Judge Wingate issued a corrected version, backdated to July 20, though filed three days later. The original order was removed from the court docket, making it inaccessible to the public. The revised order still included the questionable 1974 case citation.
Legal experts expressed confusion over the errors. A Mississippi Attorney General’s Office official, speaking anonymously due to ongoing litigation, noted that their attorneys had never encountered such issues.
Christina Frohock, a University of Miami law professor specializing in AI’s impact on legal integrity, highlighted that AI “hallucinations” can produce fabricated case citations or quotes, though she refrained from directly attributing the errors to AI.
Wingate, 78, was appointed by President Ronald Reagan and confirmed by the Senate in 1985, and served as chief judge of the Southern District from 2003 to 2010. He did not respond to inquiries about the order or whether AI was used in its preparation.
Attorneys face ethical obligations and potential sanctions for AI-related errors; in a recent Colorado case, two lawyers were fined for a mistake-filled AI-generated filing. Judges, by contrast, face less direct accountability.
“If an attorney does this, a judge can demand explanations, but it’s not true in the other direction,” Frohock said. “We will probably never know what happened, unless an appellate court demands it.”
Source: Mississippi Today