
More on artificial intelligence: Potential impact on legal malpractice

Mar 18, 2024 | Artificial Intelligence, Duty of Care, Legal Malpractice, Trial Errors

People in many professions are working with artificial intelligence (AI) to see how it can enhance their work. Equally important, however, is a careful analysis of the ways AI can undermine the integrity of work product, even when the enhancements seem significant.

We previously wrote about an attorney who asked ChatGPT to draft a brief for filing in New York federal court. A brief presents a party’s legal arguments in support of its position in a case, along with citations to legal authorities such as cases and statutes.

In that case, the law firm filed the AI-written brief without verifying that the cited cases were good law or that the quotations were accurate. Cite-checking is a basic, expected part of legal research and writing.

The AI program had “hallucinated” fictional cases and quotations and included them in the brief. For filing a brief that cited false authority, the judge sanctioned and fined the lawyer, his partner and their law firm. He also required them to notify every judge falsely named as the author of one of the fictional cases.

According to CNBC, the sanctioning judge wrote that “[t]echnological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance … But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”

Déjà vu

As we wrote in our previous post, an attorney commits legal malpractice when they breach their duty of care to a client and the client is harmed as a result. A court filing built on false legal sources not only violates ethical standards and court rules, but also likely constitutes legal malpractice if the filing harmed the client.

NPR reported that another federal court filing containing “hallucinated” content recently surfaced in New York. Disbarred attorney Michael Cohen gave AI-generated research to his own lawyer for use on his behalf. That brief also contained false case authority and was filed without cite-checking by either Cohen or his attorney; both claimed not to have known that the cases were fictional. According to Bloomberg Law, the court fined two of Cohen’s lawyers.

Could AI use breach a lawyer’s duty of client confidentiality?

Broadly, AI chatbots draw on vast stores of uploaded information, and many also retain what users type in to inform future outputs. Those inputs may include the prompts and questions an attorney uses to generate written text on behalf of a client. Bloomberg Law reports concern that this could endanger attorney-client privilege if a prompt contains details that identify the client or describe their legal or personal problem.

With a public AI program, Bloomberg reports, output generated for another user could contain confidential client information drawn from the original lawyer’s prompts. That would be an ethical breach, and potentially grounds for legal malpractice as well. For example, if a legal question is obscure, the opposing party’s attorney in a lawsuit might enter key words that surface AI-written content the program learned from the other side’s prompts. If that content reveals confidential information or legal strategy, the original client’s legal interests could be harmed.

Even a chatbot that learns only from a private law firm’s own information may not adequately protect privileged material if that information is shared across matters for multiple clients, the Bloomberg Law article explains.

AI in the practice of law may be a minefield

As courts, technologists and the legal profession work through these complex ethical and malpractice issues surrounding AI, more clarity will emerge. In the meantime, anyone who suspects their lawyer’s use of AI in providing legal services may have harmed them should speak with a legal malpractice attorney as soon as possible.