An Attorney’s Gatekeeping Responsibilities Are Not Mitigated by the Growing Presence of Artificial Intelligence

By David Alexander

July 26, 2023

ChatGPT is still in its infancy and lawyers need to understand that its overzealous use could cause permanent harm to their reputation and character.

Anne Louise LaBarbera, of the Anne LaBarbera Professional Corporation and a past chair of the Young Lawyers Section, moderated Monday’s lunch-hour CLE program, which focused on the ethical issues surrounding the use of ChatGPT, a form of artificial intelligence that can compose anything from letters to legal briefs based on information available on the internet.

Justice Melissa A. Crane, New York State Supreme Court, New York County, Commercial Division, and Brandon Lee Wolff, immediate past chair of the Young Lawyers Section, were the program’s panelists.

They discussed a lawyer’s responsibility as gatekeeper for any materials submitted to the court, the duty to ensure accuracy and the importance of research as a cornerstone of the legal process.

“Your credibility is all you have, really, it’s so important. So, if somebody misrepresents a case to me.…I’m going to remember it not for just the case you’re on, but forever probably and it’s hard to repair that damage. Credibility is so important before the tribunal. People should guard their reputations by making sure that the case says what it says, so even if you’re not going to get sanctioned, that’s an important aspect to your career,” said Justice Crane.

This issue arose recently in Robert Mata v. Avianca, Inc., where a pair of attorneys submitted opposition papers drafted with ChatGPT that cited non-existent cases. The two attorneys, and their firm, were financially sanctioned by the judge.

The case underscored how attorneys have an obligation to understand any technology they use.

“You have to know the technology because ChatGPT, as I understand it, mines the data from your requests and you have an ethical obligation to know the technology, but to also not breach attorney-client privilege and that could be happening by using some of these tools,” said Justice Crane.

Wolff followed with his opinion on what constitutes acceptable use of AI.

“It’s really important that anything is not put into AI that contains any sort of attorney-client privilege even if it’s anonymized data.…Also, for new associates and new attorneys you should probably steer away from AI, but for more experienced attorneys, there are things you can do such as an initial draft of a letter, but if you are using it and are an experienced lawyer, you have to thoroughly review and double check any case citations to make sure nothing is misrepresented.”

LaBarbera added that savvy lawyers will attack attorney-client privilege because of AI tools’ susceptibility to data breaches. In fact, the technology’s potential for the unauthorized practice of law was on display in California this winter.

The AI-powered bot Robot Lawyer, designed by the technology startup DoNotPay, touted its ability to coach defendants in real time, through a Bluetooth-connected earpiece, on how to respond to the judge while contesting a traffic ticket. However, it was discontinued in late January, before ever being used, after the company’s founder and CEO Joshua Browder faced backlash from lawyers and bar associations.

Beyond AI’s shortcomings, an article in Above the Law by attorney and author David Lat explained that understanding the technology also entails a duty of supervision. Just as there are tasks that cannot be delegated to a paralegal, there are those that should not be delegated to an AI tool, and an attorney needs to know how to distinguish between the two.

Ultimately, the responsibility falls to attorneys to verify the accuracy of anything produced by an AI tool.

“All of us, we have to be gatekeepers when it comes to evidence and we have to keep it in the forefront of our mind especially with deep fakes including fake news and fake evidence that looks so real,” said Justice Crane.

Please go here to register for the on-demand program.
