Tuesday, April 21, 2026

Elite law firm Sullivan & Cromwell admits to AI ‘hallucinations’


Sullivan & Cromwell told a US federal bankruptcy court that a major filing it made in a high-profile case contained multiple “hallucinations” made by AI software.

Andrew Dietderich, the head of S&C’s restructuring practice, apologised in a letter to New York federal judge Martin Glenn on Saturday for mistakes that included misquoting the US bankruptcy code and citing cases incorrectly in a court filing made on April 9.

“We deeply regret that this has occurred,” he said in the letter.

Dietderich said the firm’s policies on the use of AI had not been followed when the document was prepared, and it was considering whether it needed to make “further enhancements” to its internal training and review processes.

The letter did not say which lawyers prepared the documents or whether they were still at the firm. S&C declined to comment.

The errors are the latest example of a professional services firm grappling with the use of cutting-edge technology to speed up laborious research and cut down on staffing while also trying to maintain quality standards.

The case in question revolves around S&C’s representation of liquidators appointed by legal authorities in the British Virgin Islands who are pursuing actions against Prince Group and its owner Chen Zhi.

US federal prosecutors last year charged Zhi with wire fraud and money laundering, accusing him of “directing Prince Group’s operation of forced-labour scam compounds across Cambodia . . . that stole billions of dollars from victims in the United States and around the world”.

In a separate action, US prosecutors also filed a civil forfeiture complaint seeking to seize nearly $9bn worth of bitcoin that the US authorities said represented the proceeds of the Prince Group crimes. Zhi was arrested earlier this year in Cambodia and extradited to China after a request from Beijing.

Prince Group is incorporated in the British Virgin Islands and the Chapter 15 proceeding in the US court system is designed to get the US government to formally recognise the powers of the BVI liquidators to represent creditors and victims in the US legal proceedings, liquidators told the court.

In multiple instances, S&C in the April 9 filing erroneously summarised the conclusions made in other cases, according to a list of strike-through corrections the firm submitted to the judge.

S&C has an enterprise licence for ChatGPT, according to multiple people familiar with the firm’s operations. According to S&C’s website, at least five senior partners have been assigned to the Prince Group bankruptcy case.

The firm’s partners typically charge more than $2,000 per hour in bankruptcy cases. The firm earned several hundred million dollars in fees representing crypto exchange FTX in its bankruptcy liquidation.

Boies Schiller Flexner, the law firm representing Prince and Zhi, spotted the errors in S&C’s filing. In a document filed last week, BSF said words that S&C had quoted in its motion “do not appear in chapter 15 of the US Bankruptcy Code” and pointed to “multiple cited decisions” that were “misquoted or misidentified”.

It said a case cited by S&C in the motion “is not a case” and the reference was to “a different decision in a different circuit”.

S&C told the court that the firm maintained “rigorous” standards when using AI tools and that it “instructs lawyers to ‘trust nothing and verify everything’”. Failure to verify AI-generated output “constitutes a violation of firm policy”, it said.

It is the latest in a series of errors by law firms using AI tools. Last year, Latham & Watkins admitted that one of its lawyers had used Anthropic’s Claude model to help draft a filing which contained an apocryphal title and author for a journal article.

In another instance, a federal appeals court in New Orleans ordered a $2,500 sanction against a lawyer who had submitted a brief with 21 errors or fabrications that had been inserted by AI.

Separately, in September John Kucera, then a partner at BSF, said in a case against Amazon that a document for which he was responsible, prepared using AI tools, contained “material citation errors” due to his “failure to verify” details. “I am embarrassed by and very much regret these errors”, he said in the filing. BSF did not respond to a request for comment.

S&C told the judge overseeing the Prince Group case that its document review also showed “non-substantive and/or clerical errors in other filings in this matter”. The firm said those errors were made by humans, not AI.
