
A Federal Judge Just Ruled That Using Consumer AI Can Destroy Attorney-Client Privilege

The platform's privacy policy killed the protection. Thirty-one documents were handed to prosecutors.

[Image: white half-face Phantom mask resting on legal documents under a desk lamp, gavel in the background]

On February 10, 2026, a federal judge in New York made a ruling that should matter to every lawyer in America. And every person who's ever hired one.

The case is United States v. Heppner. The defendant, Bradley Heppner, had used the consumer version of Claude to create 31 documents tied to his legal matter. His lawyers argued those documents were shielded by attorney-client privilege and the work-product doctrine.

Judge Jed S. Rakoff disagreed. All 31 documents were handed to prosecutors.

The ruling wasn't about what the documents said. It was about where they were created. The moment Heppner typed into a consumer AI tool, the court found, he gave up any real expectation of privacy. Anthropic's own privacy policy allows data use for model training and sharing with third parties, including the government.

The platform's terms of service killed the privilege.

What privilege actually protects

If you've ever told a lawyer something you wouldn't say in public, privilege is the reason you felt safe doing it.

Attorney-client privilege is the legal rule that says your talks with your lawyer can't be used as evidence against you. It's why a CEO can tell their attorney "I think we have a compliance problem" without that sentence showing up in a lawsuit. It's why someone going through a divorce can hand their lawyer a full list of bank accounts without worrying the other side will get a copy.

Without it, being honest with your lawyer becomes a risk. The whole system rests on one thing: the exchange stayed private. Courts have always asked the same question. Did both sides have a good reason to believe the conversation would stay between them?

Consumer AI tools answer that question with a "no," in writing, in their terms of service.

The same document nobody reads before clicking "Accept."

ChatGPT, Gemini, Claude, Copilot: all of them, in their free versions, reserve the right to use chat data for training, safety work and product updates. Many say outright that they may share data with partners, vendors or in response to legal demands.

Every time a lawyer pastes a client memo into one of these tools for a quick summary or a draft, they're disclosing that communication to a third party. Under Heppner, that disclosure breaks the chain.

Same day, opposite outcome

Hours after Heppner, a second ruling came from a federal court in Michigan. In Warner v. Gilbarco, AI-assisted work was protected. But only because a supervising attorney had directed the AI use as part of a controlled, privileged workflow. The difference was oversight, records and intent.

Two cases, same day, opposite results. The variable wasn't the AI tool. It was whether the data stayed under the lawyer's control.

It's not just documents anymore

While Heppner is about typed input, a parallel problem is growing around AI meeting tools.

In January 2026, the New York City Bar's Ethics Committee issued Formal Opinion 2025-6 on this exact topic. The findings were blunt: AI assistants like Otter.ai and Fireflies record attorney-client calls and send them to outside servers for processing. The lawyer can't promise those talks stay private. The transcript may be fair game in court. The privilege may be gone.

The Committee flagged something surprising: when a client uses their own AI recording tool on a call, the attorney loses control over privilege too. The client's app runs on different terms than the firm's.

Privilege is now a two-device problem.

What this changes

The Heppner ruling is being dissected by major law firms across the country, and a consensus is forming fast: consumer AI tools and privileged legal work can't mix without strict controls.

Enterprise AI agreements, with their data-processing terms and no-training clauses, offer some cover. But they require procurement, vetting and ongoing oversight that most solo lawyers and small firms don't have.

The simpler answer is what courts are now demanding between the lines: privileged data should not leave the lawyer's hands. Don't paste a client's deal terms into ChatGPT. Don't run a litigation memo through Gemini. Don't feed discovery materials into Claude's consumer interface.

The tools aren't going away. But Heppner has made the tradeoff clear, and it's the client who bears the cost.


BeatMask catches privileged and sensitive content before it reaches AI tools. On your device, before anything is sent. Nothing leaves. Nothing is logged.

Legal · Attorney-Client Privilege · AI Policy · Data Privacy