A lawyer in the US relied on information from ChatGPT in a lawsuit, and that information turned out to be false, The New York Times reports.
Avianca is being sued by Roberto Mata, who claims he was injured on August 27, 2019, when a metal food cart hit his knee on a flight from El Salvador to New York.
When Avianca asked a federal judge in Manhattan to dismiss the case because the statute of limitations had expired, the passenger's lawyer, Steven A. Schwartz of Levidow, Levidow & Oberman, objected.
He asked for the case to continue and filed a brief citing half a dozen court decisions, including ones involving Delta Air Lines, Korean Air Lines, and China Southern Airlines. However, neither Avianca's lawyers nor Judge P. Kevin Castel could find the rulings or the quotations the lawyer had cited.
Steven A. Schwartz eventually admitted in a sworn statement that he had used an artificial intelligence program for his legal research, calling it "a source that has revealed itself to be unreliable." He said he had no intention of deceiving the court or the airline, had never used ChatGPT before, and was therefore unaware that its content could be false.
He also emphasized that he had asked the program to confirm that the cases it provided were real, and ChatGPT answered in the affirmative. Schwartz said he deeply regrets relying on ChatGPT and will not do so in the future without fully verifying its reliability.
The judge noted in his order that he had been presented with an unprecedented situation: a legal filing full of fake court decisions, fake quotations, and fake internal citations. He scheduled a hearing on possible sanctions for June 8.
It was previously reported that about six in ten American adults (58%) have heard of ChatGPT, but few have tried it themselves, according to a survey the Pew Research Center conducted in March. Per the survey, 18% of US adults have heard a lot about ChatGPT, 39% have heard a little, and 42% have heard nothing at all.