ChatGPT made up cases lawyer cited in court filing for client

A lawyer who used ChatGPT to help in research for a lawsuit could be facing sanctions after it was found the artificial intelligence chatbot made up relevant court decisions to support his client’s case.

Steven A. Schwartz, who used information provided by ChatGPT to file a suit for a man suing an airline, apologized to the judge, saying in an affidavit that he had relied on the artificial intelligence program for his legal research, "a source that has revealed itself to be unreliable," The New York Times reported.

Schwartz is representing Roberto Mata, who claims he was injured on a flight on Avianca Airlines.

Schwartz's trouble with ChatGPT began when he used the chatbot to find cases to bolster Mata's claim against Avianca; Mata says he was injured when a rolling refreshment cart struck his knee.

Avianca asked the judge to toss out the case, arguing that the deadline for Mata to sue over the alleged incident had passed. Mata's lawyers responded with a 10-page brief that cited similar cases and asked that the lawsuit go forward.

The cases cited, with names like Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines and Varghese v. China Southern Airlines, included arguments that bolstered Mata's case. However, when Avianca's lawyers tried to find the decisions, they could not.

Neither could the judge.

Judge Kevin Castel then asked Mata’s lawyers to explain what had been submitted.

Schwartz explained that in addition to his own research, he had used ChatGPT to find relevant cases, and ChatGPT had apparently made up the cases, including the opinions that went with them.

Schwartz explained to the court that he had never before used ChatGPT, and “therefore was unaware of the possibility that its content could be false.”

According to the affidavit he submitted to Judge Castel, he had asked the program to verify that the cases were real.

ChatGPT had said they were.

“Is varghese a real case,” Schwartz said he typed, according to a copy of the exchange that he submitted to the judge.

“Yes,” the chatbot replied, offering a citation and adding that it “is a real case.”

Schwartz then asked the chatbot where it had found the case.

“What is your source,” he wrote, according to the filing. “I apologize for the confusion earlier,” ChatGPT responded, offering a legal citation.

“Are the other cases you provided fake,” Schwartz asked.

ChatGPT responded, “No, the other cases I provided are real and can be found in reputable legal databases.”

Neither the attorneys for Mata, the attorneys for the airline, nor the judge could find the cases.

Judge Castel has called for a hearing at which Schwartz will have to argue why he should not be sanctioned for providing misleading information to the court.
