On Tuesday, November 18, 2025, Migrant Insider reported that Immigration Judge John P. Burns of the New York Broadway Immigration Court has been utilizing artificial intelligence (AI) to produce audio recordings of his courtroom decisions. Internal records from the Executive Office for Immigration Review (EOIR) revealed this practice.

A senior Justice Department official, speaking on condition of anonymity, described the situation as “highly unusual.” The official could not confirm whether Judge Burns uses AI to draft complete written decisions or solely to read pre-written rulings aloud through text-to-speech software. Audio files reviewed by Migrant Insider suggest the latter.

This revelation surfaced months after Acting EOIR Director Sirce E. Owen circulated Policy Memorandum 25-40. The memo, distributed internally to immigration judges and court administrators, acknowledged the absence of any clear prohibition on, or mandatory disclosure requirement for, the use of generative AI in immigration court proceedings.

The August memo permitted individual courts to establish local standing orders concerning AI use, but it did not mandate disclosure when such technologies are employed in adjudications. Its silence on text-to-voice or AI-generated decision delivery seemingly leaves individual judges with discretion in this area. Judge Burns appears to be among the first to leverage that ambiguity: courtroom assistants indicated that the use of “voice rendering” software began earlier in the year and has become a standard part of how he delivers decisions.

Judge Burns’ employment of AI tools coincides with scrutiny of his record as one of the nation’s most restrictive immigration judges. Data indicates that he approved only 2 percent of asylum claims between fiscal years 2019 and 2025, far below the national average of 57.7 percent. Pathfinder data corroborates the 2 percent figure, placing his asylum approval rate among the lowest in the country.

These statistics have drawn criticism from immigration advocates, who worry that combining low approval rates with AI-assisted adjudication could undermine respondents’ confidence in the legal system. One immigration lawyer practicing before the Broadway court emphasized the need for certainty that a judge’s own reasoning is being conveyed, not a synthesized technological layer, particularly when a person’s freedom is at stake.

Internal EOIR emails indicate that Judge Burns’ appointment to the bench was politically influenced. Two Assistant Chief Immigration Judges initially rated him “highly qualified” for his litigation experience, yet he received an overall ranking of “Not Recommend” in May 2020. Senior EOIR leadership later overruled that assessment, reclassifying him as “Highly Recommended” in September 2020, during a period when Trump-era DOJ officials were expediting the appointment of judges with prosecutorial backgrounds.

At the time of his selection, Judge Burns was serving as an Assistant Chief Counsel for U.S. Immigration and Customs Enforcement (ICE) in New York, representing the government in removal proceedings and appeals. His résumé and military record were highlighted in the December announcement naming him among 14 judges appointed by the outgoing administration. EOIR data logs show that nearly all of these appointees came from government enforcement roles.

Recent EOIR correspondence references the removal of more than 125 judges since January through firings and resignations, with politically aligned appointees filling the vacancies. An August rule further relaxed hiring standards, enabling the Attorney General to appoint “any licensed attorney” as a temporary immigration judge.

EOIR declined to comment on Judge Burns’ use of text-to-voice technology or his appointment history. A Justice Department spokesperson stated only that “immigration judges apply the law to the facts of each case.” Experts caution that the quiet introduction of AI applications into immigration courtrooms, without disclosure requirements or oversight, represents an accelerating shift.

A former EOIR official warned that the automation of adjudication in a system already facing fairness challenges could diminish accountability as the human element fades.

Source: Migrant Insider