
Because of ChatGPT, an American lawyer cites judgments... that never existed


Originally, it was just a routine lawsuit between an individual and an airline he accused of being responsible for injuries he says he suffered. But as the New York Times revealed, the airline's lawyers were taken aback by the brief filed by the plaintiff's lawyers: among the cases cited as precedent to support their claim were several that simply never existed.

The New York judge in charge of the case, P. Kevin Castel, then wrote to the plaintiff's lawyers to ask for an explanation: "six of the cited judgments refer to non-existent court decisions and contain fabricated quotations," he noted.

Implicated: the law firm Levidow, Levidow & Oberman

Schwartz, who expressed "tremendous regret" to the court once he realized his mistake, explained that he had never used ChatGPT before and did not know that some of the answers the chatbot provides can be made up, and therefore false. Yet ChatGPT itself warns users that it may sometimes "provide incorrect information."

The lawyer provided the court with screenshots of his exchanges with ChatGPT, showing that the chatbot had assured him that one of the fictitious rulings did indeed exist. When he asked it for its sources, the AI cited LexisNexis and Westlaw, two databases that index court decisions. Yet searching for "Varghese v. China Southern Airlines Co Ltd" (one of the rulings cited in the brief) in LexisNexis returns no results.

The two lawyers, Steven A. Schwartz and Peter LoDuca, have been summoned to a hearing on June 8, where they face possible disciplinary proceedings. Schwartz has promised the court that he will never again rely on ChatGPT for legal research without personally verifying the authenticity of the judgments it suggests.
