NY Lawyer Faces Possible Sanctions for Citing Phony ChatGPT Case – Bloomberg Law
By Sam Skolnik
An attorney in suburban New York City faces possible sanctions for using ChatGPT to generate a nonexistent state court decision she cited in a legal filing.
The conduct of Uniondale, New York, lawyer Jae Lee of the JSL Law Offices fell “well below the basic obligations of counsel,” the US Court of Appeals for the 2nd Circuit ruled Tuesday. The court referred Lee to a grievance panel, which considers possible discipline such as fines and suspensions.
“I am committed to adhering to the highest professional standards and to addressing this matter with the seriousness it deserves,” Lee said in an emailed response to a question. She said she’s unable to answer additional questions “given the confidential nature of the disciplinary proceedings.”
Lee joins other attorneys who have been flagged by courts for inappropriately using ChatGPT and similar tools that use artificial intelligence to produce essays or other information after a user’s prompt.
US District Judge P. Kevin Castel fined two Manhattan lawyers, Steven Schwartz and Peter LoDuca, $5,000 last June for filing a ChatGPT-generated court brief containing bogus quotes from nonexistent cases. Donald Trump’s former lawyer Michael Cohen unwittingly included phony cases generated by Google’s AI tool in a brief arguing for his release from post-prison supervision, according to court papers made public last month.
In the 2nd Circuit matter, Lee was appealing a district court’s dismissal of her client’s medical malpractice suit. In the appeal, she cited two cases, including one, Bourguignon v. Coordinated Behavioral Health Services, that didn’t exist, the 2nd Circuit judges said.
Lee told the court she had “encountered difficulties in locating a relevant case” before she turned to ChatGPT, which suggested the phony Bourguignon case. The judges found that Lee “did not read or otherwise confirm the validity of the (non-existent) decision she cited.”
The case is Park v. Kim, 2nd Circuit, 22-02057, 1/30/24.