Ethical Implications of Using Artificial Intelligence to Represent Criminal Defendants

In the August Edition of Voice for the Defense Online, Chuck Lanehart and Megan Gower cover a recent case that earned a lot of media attention over an attorney’s use of ChatGPT in the courtroom. While the attorney has yet to be sanctioned, he was lambasted by the judge and media for referencing non-existent case law in a motion.

View the full article below, shared with members of the Texas Criminal Defense Lawyers Association. Learn how AI such as ChatGPT is starting to power robot lawyer programs (e.g., DoNotPay) and the legal implications of defendants relying on software and AI for legal counsel.

“Can we agree that’s legal gibberish?” These were the angry words of Judge P. Kevin Castel of the United States District Court for the Southern District of New York. Judge Castel was conducting a show cause hearing, and his “legal gibberish” question referenced a ChatGPT-created response brief that cited six nonexistent cases. At the show cause hearing, Steven Schwartz, the lawyer responsible for the ChatGPT research, received questions from Judge Castel such as, “Do you cite cases without reading them?” and “Have you heard of the Federal Reporter?” Trying to save himself from sanctions, Schwartz said he thought the program was a search engine, and his lawyer—speaking on Schwartz’s behalf—indicated that the public needs a stronger warning about the dangers of ChatGPT. Judge Castel has yet to decide whether to impose sanctions on Schwartz. Matthew Russell Lee, Lawyer Suing Avianca Used ChatGPT Which Invented 6 Cases Now Sanctions Hearing Here, Inner City Press (last visited June 16, 2023).

Programs run by artificial intelligence (AI) have become a hot-button topic in the legal world. In the realm of criminal defense, AI carries two major implications: first, the ethical obligations for attorneys using AI; and second, the danger of clients foregoing hiring a licensed attorney in favor of seeking legal advice from AI.

Use of AI by Criminal Defense Attorneys

ChatGPT is a chatbot powered by artificial intelligence. See Blair Chavis, Does ChatGPT produce fishy briefs?, ABA J. (Feb. 21, 2023, 1:58 PM), does-chatgpt-produce-fishy-briefs. The program—released by the company OpenAI in November of 2022—can draft legal documents such as appellate briefs and contracts. See id. A lawyer need only type a request—whether for contractual language or a brief—into the ChatGPT website, and the program will draft a response in accordance with the request. See Kate Rattray, Will ChatGPT Replace Lawyers?, Clio, blog/chat-gpt-lawyers/ (March 14, 2023). While ChatGPT may be appealing to lawyers looking to save time when drafting legal documents, like all technology, it is prone to error.

Even proponents of ChatGPT recognize the technology is not perfect and “[r]esults obtained from ChatGPT are often riddled with errors and, in some cases, outright falsehoods.” Nicole Black, The Case for ChatGPT: Why lawyers should embrace AI, ABA J. (Feb. 21, 2023, 1:11 PM), the-case-for-chatgpt-why-lawyers-should-embrace-ai. For example, when ChatGPT was asked to draft a LinkedIn post for an article in which a lawyer was quoted, the program cited a quote that did not exist in the publication. See id. When ChatGPT was used in connection with other legal software, the document drafted referred to a non-existent state ethics provision. See id. Other tests of ChatGPT have found similar concerns about the program citing authorities or sources that do not exist. See id. Using ChatGPT is not like using a pre-prepared form. While forms carry the risk of citing outdated law, ChatGPT carries the risk of citing non-existent law. When using programs such as ChatGPT, attorneys should be mindful of their duties of diligence and candor toward the court, which require that they thoroughly review any document drafted or authority cited by AI-powered programs. See Tex. Disciplinary Rules Prof’l Conduct R. 1.01 cmt. 6; Tex. Disciplinary Rules Prof’l Conduct R. 3.03 cmt. 3.

Arguably, if a lawyer charges their normal fee for work that was completed using AI, the lawyer could be subject to sanctions under Texas Disciplinary Rules of Professional Conduct (TDRPC) Rule 15.07. Among the factors the TDRPC enumerates as relevant to whether an attorney has charged a reasonable fee are “the time and labor required, the novelty and difficulty of the question involved, and the skill requisite to perform the legal service properly.” Tex. Disciplinary Rules Prof’l Conduct R. 1.04(b)(1). While the TDRPC do not currently account for AI, a lawyer billing for work that was completed using AI—e.g., charging for the time it normally would have taken to draft a document that was actually drafted by AI—is comparable to billing for recycled work product. See ABA Comm. on Ethics and Pro. Resp., Formal Op. 379 (1993) (discussing that a lawyer violates their ethical duties by billing for recycled work product). Under TDRPC Rule 15.07, a lawyer who violates the duty to charge a reasonable fee may be subject to sanctions.

Finally, attorneys must be cognizant of the duty of confidentiality when using AI programs. See Tex. Disciplinary Rules Prof’l Conduct R. 1.05(b). In April of this year, ChatGPT was banned in Italy until OpenAI, the company behind it, made changes to become more transparent about its data-collection process. See Adi Robertson, ChatGPT returns to Italy after ban, Verge (April 28, 2023, 2:17 PM). These changes appear to be limited to Italy, though other countries such as Spain and Canada are considering similar bans. Id. ChatGPT is not a search engine that learns from browsing the internet—it learns how to improve its responses, in part, from the prompts and feedback it receives from users. See Catherine Thorbecke, Don’t tell anything to a chatbot you want to keep private, CNN (April 6, 2023, 10:46 AM), https://www.cnn.com/2023/04/06/tech/chatgpt-ai-privacy-concerns/index.html. Given how this language-processing model works and the lack of transparency about data privacy when using ChatGPT in the United States, lawyers should refrain from including confidential information in any prompt given to the program.

Use of AI to Replace Criminal Defense Attorneys

The use of AI is not confined to lawyers, because laypersons have equal access to these programs. In January of 2023, Joshua Browder, CEO of the company DoNotPay, tweeted that his company was working on a “robot lawyer,” which would tell pro se defendants what to say in court via a Bluetooth device. Niamh Rowe, An AI Lawyer is About to Defend a Human in a U.S. Courtroom, Daily Beast (Jan. 14, 2023, 2:53 AM). Browder stated that his goal was to ensure “that the ordinary, average consumer never has to pay for a lawyer again.” Id. In addition to a robot lawyer that tells defendants what to say in court, DoNotPay also offered drafting assistance in a wide array of legal areas, including immigration services and contracts. See Debra Cassens Weiss, DoNotPay doesn’t live up to its billing as a ‘robot lawyer,’ offers ‘substandard’ legal docs, suit claims, ABA J. (March 10, 2023), news/article/suit-claims-donotpay-doesnt-live-up-to-its-billing-as-a-robot-lawyer-offers-substandard-legal-docs. Notably, Browder—just like the creators of other AI programs—is not a lawyer. Id.

On March 3, 2023, a lawsuit was filed alleging that DoNotPay violated California law by holding itself out as a lawyer and providing legal services without a law license. See id. The lawsuit alleged that DoNotPay is not supervised by a licensed attorney and that many documents created by the program were unusable because they were “so poorly or inaccurately drafted.” See id. The lawsuit is anticipated to become a class action. See id. Following the threat of criminal charges in multiple states for the unlicensed practice of law, Browder said on Twitter that the company would pivot to consumer rights issues, such as helping dispute medical bills and credit reports. Debra Cassens Weiss, Traffic court defendants lose their ‘robot lawyer,’ ABA J. (Jan. 26, 2023), article/traffic-court-defendants-lose-their-robot-lawyer.

It may be easy to brush off the DoNotPay fiasco as another fraudulent, failed Silicon Valley business venture, but the larger ramifications of the situation cannot be ignored. Laypersons, specifically those from low socioeconomic backgrounds, are particularly vulnerable to companies like DoNotPay. These companies present themselves as the solution to access-to-justice issues. However, their lack of expertise makes it more likely that a client will end up with an unnecessary conviction than with a financially favorable deal on legal services. If a lawyer messes up a client’s case, the client may have an ineffective assistance of counsel claim. If a robot messes up a client’s case, the client will likely, at best, only receive damages to put in their commissary account as they serve out the jail sentence they got courtesy of their robot lawyer. Had Browder not widely publicized his program in a way that got the attention of criminal defense attorneys on Twitter, AI likely would have “represented” pro se criminal defendants for a significant period of time before state bar associations became aware of the situation.

Despite Browder’s unlicensed practice of law, he did make a good point: “The truth is, most people can’t afford a lawyer.” Bobby Allyn, A robot was scheduled to argue in court, then came the jail threats, NPR (Jan. 25, 2023, 6:05 PM), a-robot-was-scheduled-to-argue-in-court-then-came-the-jail-threats. While AI programs should not be used by laypersons to draft legal documents, the use of AI by licensed professionals could increase access to legal representation. Attorneys may be willing to take on more pro bono work, or work at a lower cost, if they can save time with drafting assistance from AI. While a world in which criminal defense attorneys are replaced by robots sounds like the plot of an Orwellian novel, the use of AI to aid diligent defense attorneys in increasing access to legal representation is a reality that should be welcomed.

Chuck Lanehart is a shareholder in the Lubbock firm of Chappell, Lanehart & Stangl, P.C., where he has practiced law since 1977, and he is a 1977 graduate of Texas Tech University School of Law. He is board certified in the field of Criminal Law by the Texas Board of Legal Specialization. He serves on TCDLA’s ethics committee and strike force, and he is statewide co-coordinator of the annual TCDLA Declaration readings. He previously served as a director of the State Bar of Texas and of TCDLA. He is the author of several books, including “Evolution of the Texas Plains,” published by The History Press on July 10. He is co-author of “Fatal Exam: Solving Lubbock’s Greatest Murder Mystery,” to be published by Texas Tech Press in November 2023. In 2018, the Lubbock Area Bar Association presented Chuck the James G. Denton Distinguished Lawyer Award, the Bar’s highest honor. In 2008, Chuck was named among the “200 Most Influential People in the History of Lubbock” by the Lubbock Avalanche-Journal. He can be reached at chuck@ or 806-765-7370.

Megan Gower is a rising 3L at Texas Tech University School of Law. She is originally from Alamogordo, New Mexico and attended Texas Tech University for undergrad, where she graduated magna cum laude with a Bachelor of Arts degree in History and Political Science. She is a member of Phi Delta Phi and is currently the President of Tech Law’s Criminal Law Association. Her TTUSL team was champion of the 2023 regional ABA National Appellate Advocacy Competition in Los Angeles, where Megan was named best advocate. After completing law school, she plans to practice capital defense in Texas. Megan has clerked for Chappell, Lanehart & Stangl in Lubbock since 2020.
