We live in a moment where artificial intelligence is woven into nearly every corner of daily life. People are using AI to plan vacations, write wedding vows, edit novels, brainstorm business ideas, and even redesign their homes. These tools are fast, friendly, and capable. It’s no surprise that many people wonder: If AI can do all of that, why not ask it to create my will or trust?
As an estate planning attorney, I understand the appeal. Typing questions into ChatGPT or Claude feels easier than scheduling time with a lawyer. And there is no shortage of online platforms promising quick, inexpensive wills and trusts “drafted in minutes.” But the truth is that estate planning, and especially Medicaid and long-term care planning, is not something AI can safely do for you.
Every month, our office now meets with families trying to unwind the fallout from AI-generated or online-generated estate plans. What was meant to be a shortcut ended up creating delays, disputes, extra expenses, and in some cases the loss of benefits or property. These issues often appear only after a person has passed away or suffered a medical crisis, when it’s too late to fix the problem.
Before relying on AI for something as important as your estate plan, here is what we wish every client understood.
Even AI Says It Shouldn’t Create Your Estate Plan
Out of curiosity, we asked ChatGPT whether it should be used to draft wills or trusts. The response: no. AI tools can help you learn, outline your thoughts, or understand basic concepts, but they cannot replace an attorney.
AI Cannot Apply State-Specific Law With 100% Precision
Estate planning is governed by state statute, case law, and strict execution requirements. A small mistake can have big consequences. For example, in New York, a will that isn’t witnessed properly, a Power of Attorney that is missing the statutory gifts rider, or a trust using outdated language can lead to:
- Documents being declared invalid
- Delays in probate or estate administration
- Assets passing to the wrong individuals
- Avoidable taxes or penalties
AI tools are trained on vast amounts of general information—but not on your state’s specific, ever-changing legal requirements. Nor can they ensure your documents comply with the New York Estates, Powers & Trusts Law (EPTL) or the Surrogate’s Court Procedure Act.
AI Cannot Ensure Proper Execution of Documents
A perfectly drafted will is useless if it is signed incorrectly. New York requires specific witnessing procedures; certain documents must be notarized; and others need statutory warnings read aloud. AI cannot walk you through these requirements in real time or confirm they were followed.
Incorrect execution is one of the most common reasons we see documents fail.
AI Cannot Identify Hidden Issues or Address Sensitive Matters the Way a Human Can
A significant part of estate planning is uncovering issues clients don’t realize matter. AI only knows what you type into it—and clients rarely know what to disclose. For example:
- Blended families or estranged relatives
- Disabled beneficiaries and the need for special needs planning
- Medicaid look-back issues and asset transfer penalties
- Tax exposure or titling problems
- Business succession complications
- Creditor protection concerns
An attorney is trained to ask the questions you don’t know to ask. AI is trained to respond to the questions asked.
Medicaid and Long-Term Care Planning Is Too Complex for AI
Medicaid eligibility involves a five-year look-back period, transfer penalties, exemption rules, trust requirements, spousal allowances, and frequent regulatory changes. There is no algorithm that can analyze your assets, family circumstances, health risks, and legal options with the nuance required.
Mistakes in Medicaid planning are often irreversible—and costly.
AI Cannot Give Legally Binding Advice
AI is not licensed, cannot practice law, cannot assume professional responsibility, and cannot be held accountable. Estate planning is not simply document drafting; it is legal advice, strategic planning, risk analysis, and fiduciary responsibility. Those duties cannot be delegated to software.
Should You Use AI to “Check” Your Estate Plan?
Another common question we hear is whether it’s safe to upload an existing estate plan to AI to “analyze” it. The answer is no—primarily because of privacy.
We are in the early stages of learning how AI systems store and process information. Just as social media felt harmless until people realized their personal data was being tracked, shared, or sold, we will likely see a similar learning curve with AI.
When you upload sensitive information to an AI system, it enters a database stored on massive servers you cannot control. It may be retained indefinitely, and you cannot be certain who may have access to it, now or in the future. Estate planning documents contain financial information, family details, medical concerns, and business ownership structures, none of which should be set loose in the world.
Our firm uses multiple security systems to keep client data private. AI tools do not offer that level of protection, and their privacy policies make that clear.
AI Lacks Human Judgment—and That Matters
Even with impressive computing ability, AI does not understand family dynamics, personal values, or the emotional nuances that shape estate planning. It cannot anticipate the “what-ifs” that attorneys deal with daily.
Consider just a few common scenarios:
- A will leaves a home to a nephew, but the home is sold before death. Is the nephew entitled to something else?
- A child develops substance-use issues after a trust is created. Should the trust include protection or restrictions?
- Parents want to treat children equally, but one child receives significant lifetime gifts or caregiving support. Should the estate reflect that?
These are human questions, not software questions.
AI Also Makes Mistakes
A phenomenon often called an “AI hallucination” occurs when an AI system provides answers that are simply wrong. In the legal world, this has already caused real harm: several attorneys across the country have been sanctioned after filing briefs containing cases, quotes, and citations entirely fabricated by AI.
Estate plans created by AI can contain similar errors—incorrect statutory language, references to nonexistent laws, or clauses that contradict one another. Unlike a wrong restaurant suggestion, these mistakes can have lasting legal and financial consequences for your family.
Human Estate Planning Needs Human Experience
Early GPS systems sometimes sent drivers into ponds. AI is no different: its output is only as good as its training, and it often fails in unexpected ways. Estate planning requires judgment, precision, and a deep understanding of human circumstances. It is as much about protecting relationships as it is about distributing property.
AI is a powerful tool, and it has its place—education, brainstorming, drafting outlines, gathering general information. But it cannot replace the experience, responsibility, and foresight of a qualified estate planning attorney.