July 2023:
The MLO Minute: “Don’t Let AI Be Your Estate Planning Lawyer!” —
By Dennis McAndrews, Esq. and Kelly Hayes, Esq. —
For several decades, lawyers have used electronic research tools and document-drafting software to serve clients, while strict ethical standards have required them to exercise sound professional judgment and avoid overreliance on technology. Although such programs have long been staples of the legal profession, and the best attorneys use them to save time and keep costs down, efforts to use newer tools such as ChatGPT to develop legal documents have proven dangerous and, in some cases, costly for practitioners and clients.
As the rapid development of Artificial Intelligence (“AI”) has exploded onto the scene, many questions have arisen about its proper uses in the legal profession. AI holds promise in some fields and is finding a niche in certain professions where it can save time, especially where the issues are neither complex nor dependent on personal or professional judgment. But far too often, legal experience and sound judgment are essential to the proper representation of clients in their legal affairs.
One area of legal practice that particularly requires sound professional judgment and personal insight is estate planning. Even well-known web-based programs have proven ineffective at producing fully appropriate estate planning documents. Clients (or their beneficiaries after the testators have died) can discover too late that internet sources for these critical instruments offered inadequate or inaccurate information on such issues as the proration of taxes among beneficiaries, the identification of assets that do not pass under a will (and instead require separate beneficiary designations to be effective), the naming of appropriate guardians for minor children, and the effects of divorce or incapacity on principals or beneficiaries, to name just a few.
A law firm in New York recently discovered how inaccurate and harmful ChatGPT can be when used to conduct research and answer legal questions. ChatGPT supplied the firm with court cases and legal opinions that did not exist, which the attorneys then cited in a brief to the court. When opposing counsel and the court discovered that these cases and opinions did not exist, the court was furious and fined the attorneys and their firm $5,000 for the various harms caused by this reliance on artificial intelligence, including the waste of judicial resources, the time opposing counsel spent trying to locate nonexistent documents, harm to the reputation of counsel and clients, and diminished respect for the legal system. Fortune magazine offered a full report of this matter. See https://fortune.com/2023/06/23/lawyers-fined-filing-chatgpt-hallucinations-in-court/amp/.
As noted above, our office uses a variety of respected electronic research and drafting tools designed to make document preparation more efficient, saving clients costs while providing the best possible legal service to the families we represent. But we will never cut corners by relying on untested, unproven, and unreliable artificial intelligence systems; we always strive to maintain the highest level of professionalism and to provide the peace of mind and legal services that our clients deserve.