(StraightNews.org) — An attorney faces possible sanctions for using artificial intelligence (AI) to conduct research. Steven Schwartz, a lawyer with Levidow, Levidow & Oberman in New York, admitted using the ChatGPT application to research case law in a personal injury matter. His briefing contained references to fictional cases.
Judge Kevin Castel of the Southern District of New York said the attorney’s research documents contained six examples that appeared to be “bogus.”
The case involved a lawsuit against Avianca Airlines by a passenger who alleged a drinks trolley had injured him. ChatGPT presented the lawyer with six seemingly similar cases, assured him they were legitimate, and even claimed they could be found in the widely used legal research databases LexisNexis and Westlaw. However, Schwartz admitted he did not verify this before accepting and using the information.
Screenshots presented to the court reveal a “conversation” between Schwartz and ChatGPT. In one exchange, Schwartz asks the app whether the case Varghese v. China Southern Airlines Co Ltd is genuine; the app replies that it is and that it can be found in legitimate legal research sources. The case was not real, however, and Schwartz said he had no idea the information supplied by the AI tool could be false.
He must explain why he should not face sanctions at a special hearing in June.
ChatGPT is a software application that answers questions in a conversational exchange with users. Developed by the company OpenAI and launched in November 2022, it can answer questions, generate stories, and create music and art. Prominent figures in technology, including Elon Musk, have warned about its dangers: it has the potential to replace human beings in thousands of jobs, and some people worry it will diminish the use of the human mind, fostering a lazy, technology-dependent world with limited privacy and a high risk of error.
Copyright 2023, StraightNews.org