
ChatGPT for Legal Work: The Privilege, Data, and Sanctions Risks Lawyers Should Understand

Irys Market Intelligence · 7 min read

The Reality of ChatGPT in Law Firms

ChatGPT is already the most widely used AI tool in legal practice. This is not a prediction. It is a fact.

According to recent surveys, 69% of solo practitioners use general-purpose AI tools like ChatGPT, Claude, or Gemini for legal work, and 28% use AI daily. Yet only 4% of small firms have adopted AI at the firm level with any formal policy or governance.

The gap between individual adoption and institutional readiness is enormous. Most lawyers using ChatGPT are doing so without firm approval, without data handling policies, and without understanding the specific risks that apply to legal work.

What the Data Says

The adoption numbers tell a clear story: lawyers are not waiting for permission. They are using the tools that are available, affordable, and fast.

ChatGPT costs $0-20 per month on consumer plans. It is accessible on any device. It produces coherent-sounding legal analysis in seconds. For a solo practitioner billing at $250 per hour and spending 6 hours a week on research, the appeal is obvious.
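The arithmetic behind that appeal is easy to make explicit. A minimal sketch, using the hourly rate and research hours from the example above; the assumed time savings is purely hypothetical:

```python
# Value-of-time sketch for the solo practitioner example above.
hourly_rate = 250        # billing rate from the example ($/hour)
research_hours = 6       # weekly hours spent on research
monthly_fee = 20         # top consumer ChatGPT plan ($/month)

weekly_time_value = hourly_rate * research_hours
print(weekly_time_value)   # 1500: billable value of a week's research time

# Hypothetical assumption: the tool trims even 10% off research time.
assumed_savings = 0.10
monthly_savings = weekly_time_value * assumed_savings * 4  # ~4 weeks/month
print(monthly_savings)     # 600.0, against a $20 subscription
```

On these (illustrative) numbers, the subscription pays for itself many times over each month, which is exactly why individual adoption has outrun firm policy.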

But consumer AI tools were not built for legal work. They were built for general question-answering. The difference matters more than most lawyers realize.

The Privilege and Sanctions Risk

There are three distinct risks that lawyers face when using ChatGPT for client matters.

1. Privilege waiver through data exposure

ChatGPT stores chat history by default for approximately two years. When you paste a client's confidential information into ChatGPT, that data sits on OpenAI's servers.

This is not theoretical. OpenAI has been successfully subpoenaed, with full chat logs, uploaded PDFs, and documents produced in response to legal process. If opposing counsel learns you used ChatGPT for case strategy and subpoenas OpenAI, your client's privileged communications may be discoverable.

ABA Model Rule 1.6 requires lawyers to make reasonable efforts to prevent unauthorized disclosure of client information. Using a consumer tool that retains data on third-party servers and has already been subject to compelled disclosure raises serious questions about whether that standard is met.

2. Hallucinated citations and sanctions

ChatGPT does not verify that the cases it cites actually exist. It generates text that looks like legal citations, but it has no mechanism to confirm those cases are real, correctly cited, or still good law.

Courts have responded to this problem with increasing severity. Sanctions exceeding $100,000 have been imposed on lawyers who submitted AI-generated briefs containing fabricated case citations. Multiple federal and state courts now require disclosure of AI use in filings.

The issue is not that lawyers used AI. It is that they submitted AI output without verification, which courts treat as a failure of competence and candor.

3. No matter continuity

ChatGPT has no memory across sessions. When you close the tab, the context is gone. Every conversation starts from zero.

For a one-off question, this is fine. For ongoing matters that span weeks or months, it means re-explaining your case every time you return. You lose the accumulated context that makes AI assistance actually useful over time.

What "Good Enough" Actually Costs

The hidden cost of using ChatGPT for legal work is not the subscription fee. It is the risk exposure.

Consider what a single sanctions motion costs to defend. Consider what a privilege waiver means for your client's case. Consider the time you spend re-entering case context every session because the tool has no memory of your matter.

A $20/month tool that creates a $100,000 sanctions risk or a privilege waiver is not cheap. It is the most expensive option available.

Only 41% of solo practitioners budget specifically for technology. Most technology purchases come from personal operating cash. This creates pressure to choose the cheapest option. But cheap and low-risk are not the same thing.
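The "cheap versus low-risk" point can be framed as a simple expected-value comparison. The sanction figure is the one cited above; the incident probability is a purely illustrative assumption, not a measured rate:

```python
# Expected-cost comparison: subscription fee vs. risk exposure.
annual_subscription = 20 * 12     # $20/month consumer plan
sanction_cost = 100_000           # sanctions figure cited above
incident_probability = 0.01       # hypothetical: 1% chance per year

expected_risk_cost = sanction_cost * incident_probability
print(annual_subscription)        # 240
print(expected_risk_cost)         # 1000.0: several times the subscription
```

Even at a 1% annual incident rate, the expected cost of a single sanctions event exceeds four years of subscription fees, before counting the cost of a privilege waiver.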

How Purpose-Built Legal AI Addresses This

Irys was built specifically for legal work, and its architecture directly addresses each of the risks described above.

Data handling: Irys processes approximately 80% of queries through its own in-house infrastructure. The platform operates under a zero data retention policy and maintains SOC 2 Type II certification. Client data is not stored after processing. There is nothing to subpoena.

Citation verification: Irys grounds legal citations through CourtListener, a database of over 50 million cases. When the platform cites a case, it verifies the citation exists and links to the source. This is not a guarantee of perfect accuracy, but it is a structural safeguard that consumer tools do not have.

Matter continuity: Irys maintains context across sessions within a matter. When you return to a case next week, the AI remembers your prior research, your arguments, and your case strategy. You pick up where you left off instead of starting over.

Pricing: Irys offers a free plan, a Starter plan at $69/month, and a Professional plan at $149/month. For a solo practitioner spending $200/month on average for technology, this competes directly on price with the tools lawyers are already paying for, while eliminating the privilege and sanctions risks.

What to Consider

ChatGPT is a capable general-purpose AI. It is good at many things. But legal work has specific requirements around privilege, data security, citation accuracy, and matter continuity that consumer tools were not designed to meet.

Before using any AI tool for client work, lawyers should ask:

  • Where does my data go? Is it stored on third-party servers? For how long? Has the provider been subject to compelled disclosure?
  • Are citations verified? Does the tool check that cited cases exist and are current, or does it generate plausible-looking text without verification?
  • Does the tool maintain context? Can I return to a matter next week without re-explaining everything?
  • What is my obligation under Rule 1.6? Am I making reasonable efforts to protect client information when I use this tool?
  • What is the real cost? Not just the subscription fee, but the risk exposure if something goes wrong.

The 69% of solo lawyers already using AI are not going to stop. The question is whether they will continue using tools designed for consumers, or move to tools designed for the specific demands of legal practice.

Tags: chatgpt, privilege, data security, sanctions, solo lawyers, legal ai, aba, ethics, hallucination

See how Irys compares