The Surveillance State: Judge Wang’s AI Order Is a Carpenter Trap

Moral Machine Setlist

The Police’s “Every Breath You Take” is the perfect anthem for the theme here. Surveillance that feels routine is still surveillance. When courts turn temporary chats into permanent archives, the state does not just watch. It records. The Fourth Amendment was written for moments like this. (Thanks Madison!)

Judges Turning Civil Discovery Into Government Pre-Surveillance

At a time when courts should be defending privacy, some are busy manufacturing surveillance. And if Carpenter v. United States taught us anything, it is that “detailed, encyclopedic, [and] effortlessly compiled” digital dossiers deserve Fourth Amendment protection. Translation: cops should need a warrant for this stuff.

The Order That Started This Mess

On May 13, 2025, Judge Wang ordered OpenAI to “preserve and segregate all output log data that would otherwise be deleted… until further order of the Court.” She doubled down when OpenAI moved for reconsideration. OpenAI publicly objected, filed appeals, and warned about privacy violations. The court was unmoved about creating a mass-surveillance archive, explaining (in footnote 2) that despite appearing like a mass-surveillance program, it was not, and citing Articles I–III of the Constitution for the proposition that the Judiciary is not a law enforcement agency.

So those late-night anxiety spirals you typed into ChatGPT. The strategy sessions about your divorce, or normally privileged legal conversations with an attorney. The thing that’s growing on your thing, or otherwise HIPAA-protected conversations with a doctor. (Altman is trying) That experimental fiction you are too embarrassed to show anyone but that scratches that itch. Congratulations: it is evidence now, and possibly forever. You are welcome.

Why Carpenter Should Have Everyone’s Attention

Remember when the Supreme Court finally figured out that tracking your phone’s location history was not the same as reading your phone bill? Carpenter said no to that idea. The Court called it “near perfect surveillance,” and said the government needs a warrant for that kind of omniscience.

Chief Justice Roberts called location data a “detailed, encyclopedic, and effortlessly compiled” chronicle. Sound familiar? That is exactly what AI chat logs are, except worse. Your location says where you went. Your ChatGPT history says what you were thinking when you got there.

ChatGPT logs are not “business records.” They are diaries the government can try to subpoena. When a Federal Court forces a company to keep logs longer than anyone reasonably expected, it manufactures the exact surveillance apparatus Carpenter warned about. The Fourth Amendment should be having a seizure about this seizure.

Three Ways This Violates the Constitution

Violation 1: It Is a Seizure, Full Stop.

Forcing OpenAI to keep data it would normally delete at the request of a user is a seizure. Not metaphorically. Legally. The government, acting through a court, just grabbed possessory control over hundreds of millions of people’s conversations. A Pew Research Center poll found that nearly a third of all Americans have used ChatGPT, including almost two-thirds of Americans under the age of 30. A seizure requires particularity and reasonableness. “Keep everything from everyone forever” fails both tests, in a huge way.

Violation 2: The State Cannot Manufacture Its Own Loophole

Carpenter said certain records are too revealing for the third-party doctrine. Here is the trick some will try. The State creates the archive through civil discovery, then later shows up with a basic subpoena claiming “these are just business records.” That is not clever. It is an unconstitutional end run. If the government wants to read the surveillance dossier it forced into existence, it should need a warrant. Period.

Violation 3: If Emails Need Warrants, So Do These Logs

The Sixth Circuit already ruled in Warshak that email content held by providers requires a warrant. ChatGPT logs are emails on steroids. They are more personal, more revealing, and more dangerous. If Gmail needs a warrant, ChatGPT needs a warrant with specificity too.

“But Wait,” Say the Government Lawyers

“It is just civil discovery.” Courts are still the government. The Constitution does not take sick days for civil cases (or for creating a mass-surveillance apparatus). When the State creates a surveillance archive today, against all reasonable expectations of privacy (think Katz), and raids it tomorrow, that is a Fourth Amendment problem with a cherry on top.

“These are ordinary business records.” Not after this order. The court transformed mayfly data into permanent dossiers. That is the very surveillance capacity Carpenter identified as different.

“Users agreed to the Terms of Service.” No one believes that clicking “I agree” means “please keep my deepest thoughts and secrets for future government fishing expeditions.” Carpenter already rejected this reasoning for digital dossiers.

What Courts Should Do

  1. Recognize that forcing platforms to preserve everyone’s chat logs is a seizure. Demand actual particularity. “Keep everything” is not particular.
  2. Require warrants with specificity for any government access to these preserved logs. No shortcuts with subpoenas or Section 2703(d) orders.
  3. Stop applying the third-party doctrine when the government created the archive. You do not get to manufacture evidence and then claim it was always there. Especially where there is a clear expectation of privacy created by “no training” and “delete” options.
  4. If preservation must happen, require real protections: cryptographic segregation, access logging, automatic deletion after litigation, and explicit warrant requirements for any government request.

Meanwhile, Back in Your Law Office on Monday Morning

While we wait for courts to remember the Constitution exists, you still have Rule 1.6 to worry about. That is what I covered in The ChatGPT Panopticon: Black Mirror Meets Rule 1.6. Quick recap:

  1. Assume Everything Is Permanent. Even with training off and a “delete” button available, retention may be on for a very long time. If you would not want it on a billboard, do not type it into ChatGPT.
  2. Cover Your A** Contractually. Get data processing agreements with real teeth: no training, minimal retention, geographic limits, and maybe warrant-only access. If your vendor will not sign it, find one who will.
  3. Actually Supervise This Stuff. Partners, this means you. Set policies, train people, and audit usage. “I did not know my associate was feeding discovery to ChatGPT” is not a defense.
  4. Tell Your Clients. They deserve to know you are using AI and how you are protecting them. One paragraph in your engagement letter beats one paragraph in your malpractice or grievance complaint.

Ethics Trifecta in Play

Feed client secrets to ChatGPT and you have just won the professional responsibility trifecta:

  • RPC 1.6 (Confidentiality): Failing to make reasonable efforts to prevent disclosure.
  • RPC 1.1 (Competence): Using technology you do not understand. “I did not know ChatGPT kept logs” is not a defense.
  • RPC 5.1 and 5.3 (Supervision): No firm policy, no vendor controls, and no adult supervision.

Why This Matters

AI is incredible. It is like having a brilliant intern who never sleeps, never complains, and occasionally hallucinates entire federal cases starring Rick and Morty for fun. But when civil discovery morphs into surveillance infrastructure, we have crossed a line.

Judge Wang’s order does not just preserve evidence for a data-training and copyright lawsuit; that would have made at least some sense. Instead, the order created a permanent record of hundreds of millions of people’s unguarded thoughts. The government will use this. Prosecutors will use this. Opposing counsel is already drafting discovery requests.

Chief Justice Roberts warned about “near perfect surveillance.” We are building it ourselves, one preserved chat at a time. You do not need cell towers when you have human consciousness uploaded to a provider’s servers.

Big Brother, But For Real

Civil discovery should freeze evidence, not forge Big Brother in a way the government (go ahead, tell me a Federal Court isn’t a government actor) should never be able to do. When courts turn temporary chats into permanent archives, they are building the surveillance machine that Carpenter said requires a warrant.

The fix is simple:

  • Forcing preservation means seizure. Seizures require particularity.
  • Accessing the archive means search. Searches require a warrant.
  • Using ChatGPT for client work is an ethics minefield. It requires better practices under the RPC.

Or, maybe, I don’t know, DON’T order OpenAI to maintain logs of the chats of over a hundred million citizens that have nothing to do with a copyright lawsuit with the New York Times. I know, crazy talk.

Until courts remember the Fourth Amendment, treat every ChatGPT session like it is being recorded. Thanks to Judge Wang, it effectively is. It is preserved, segregated, and ready for subpoena, despite your expectations of privacy and (alleged) control of your own data. Good luck, Citizen.

For solutions and more detail, read The ChatGPT Panopticon: Black Mirror Meets Rule 1.6.

ABOUT AUTHOR
Chris D. Warren

Member, Scarinci Hollenbeck, LLC. Partnership and Business Litigation Attorney with Passion for the Nexus between Technology and Ethics in the Legal Profession.