In a recent hearing at the Utah Court of Appeals, attorneys faced scrutiny over a petition that included erroneous citations, some of which were fabricated. The case, Garner v. Kadince, raised significant concerns about the use of artificial intelligence in legal filings, particularly regarding the responsibilities of attorneys to ensure the accuracy of their submissions.
The hearing began with an apology from the attorneys involved, who expressed embarrassment and disappointment over the situation. They explained that the petition, filed on February 18, 2025, bore the name of Doug D'Urbano, a seasoned attorney who was recovering from a car accident at the time. It was Richard Bednar, another attorney, who was responsible for preparing and filing the petition. Bednar admitted that a law clerk had used ChatGPT to draft parts of the document without his knowledge, leading to the inclusion of non-existent case citations.
The court's concern was palpable as the judges emphasized the importance of attorney accountability. They pointed out that the integrity of the judicial system relies on attorneys presenting accurate information under their signatures. The judges noted that while AI can be a useful tool, it cannot replace the attorney's duty to verify the accuracy of legal documents.
As the discussion unfolded, it became clear that the firm had not previously established a formal policy regarding the use of AI in legal work. However, following this incident, they have implemented a new policy to prevent similar mistakes in the future. The attorneys acknowledged their responsibility and expressed a commitment to uphold the honor system that governs legal practice.
The court also considered the implications of the errors, discussing potential sanctions for the attorneys involved. While the attorneys were prepared to cover the costs incurred by the opposing side due to the erroneous filing, the judges indicated that further monetary sanctions could be imposed to reinforce the seriousness of the matter.
The hearing served as a pointed reminder of how technology is reshaping legal practice. As AI tools become more prevalent, the responsibility of attorneys to ensure the accuracy and integrity of their work remains paramount. The outcome of this case may set an important precedent for how the legal community navigates the intersection of technology and professional ethics.