Why Customers Must Set the Terms: The Sobering Reality of Surveillance-Driven CX
You handed a company you've never heard of your passport, your face, and the geometry of your skull — for a LinkedIn badge worth $50 in liability. Really?
A few weeks ago, I wrote about why the agentic web needs contracts, not consent. I want to revisit that idea, because a recent blog post exposes the real risk of what happens if we don't fundamentally rethink how individuals maintain agency.
Someone spent a weekend reading 34 pages of legal documents so you don't have to. Their post, I Verified My LinkedIn Identity. Here's What I Actually Handed Over, is one of the most clarifying pieces of digital surveillance writing I've come across recently. The short version: clicking verify on LinkedIn's personal verification process redirected them to Persona Identities, a company most people have never heard of. What followed was a three-minute process that captured their passport, selfie, facial geometry, NFC chip data, national ID number, IP address, device fingerprint, and behavioral biometrics, including hesitation detection. Then Persona cross-referenced all of it against government databases, credit agencies, and mobile network providers.
All for a blue checkmark.
Persona's subprocessor list, the companies that actually touch your data, includes 17 vendors, all of them based in North America (outside of EU jurisdiction, which mattered for the blog's author). The CLOUD Act means that even if your passport scan sits on a server in Frankfurt, a US court can compel Persona to hand it over without telling you. Oh, and if things go wrong? Their liability cap is $50 USD. Fifty dollars. For your face, your passport, and the mathematical geometry of your skull.
This is what consent looks like in 2026. A three-minute tap, a 34-page document nobody reads, and a surveillance chain that runs straight through AI infrastructure: Anthropic, OpenAI, and Groqcloud are all listed as subprocessors performing "Data Extraction and Analysis" on your identity documents.
Pretty sobering, if you ask me.
So, who exactly is “us”?
When I wrote The Case for Us, I was talking about people who want genuine agency over their digital lives, not just aspirationally but operationally. The LinkedIn example shows why urgency matters. It's not an edge case. LinkedIn is a mainstream platform where you may feel compelled to get verified, without realizing you are handing your data to a surveillance chain most users never see. Now imagine that multiplied across every AI-powered interaction in the next decade, happening at machine speed, with no human in the loop to read anything before the terms are already set.
That's the uncomfortable truth about where we are today. The surveillance model isn't slowing down; it's accelerating, and the agentic web will give it more surface area than ever before. But I believe the very scale of the problem is what makes this moment different. When the cost of the surveillance bargain becomes visible (your passport running through 17 companies, your face worth $50 in liability, three AI platforms doing "data extraction" on your government ID), people start paying attention in ways they didn't before. The case for individual agency isn't a niche manifesto anymore.
It’s Time to Architect a Better Way
For nearly two decades, the Vendor Relationship Management community argued that customers should manage their own relationships with businesses, not the other way around. The concept remained aspirational; the enterprise software world largely ignored it, not least because there was no standard to adopt.
That's changing. IEEE 7012, the MyTerms standard, is now officially available. It gives personal AI agents a machine-readable language for asserting your privacy terms before an interaction begins. As I explored in Beyond the Surveillance Bargain, this isn't just a privacy tool; it's the first mechanism for genuinely bilateral customer experience. Your agent arrives at every interaction carrying your context and your conditions. Services either meet them or don't get access. No more manufactured consent.
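To make the terms-first idea concrete, here is a minimal sketch of that handshake logic in Python. This is not the actual IEEE 7012 wire format or vocabulary; the class names, fields, and agreement identifier are all illustrative assumptions. The point is the direction of the check: the individual's agent states its conditions first, and the interaction proceeds only if the service's offer satisfies them.

```python
# Hypothetical sketch of a MyTerms-style handshake. Not the real
# IEEE 7012 schema; every name and field here is an assumption
# chosen to illustrate "agent asserts terms, service meets them or
# doesn't get access".

from dataclasses import dataclass


@dataclass(frozen=True)
class Terms:
    """Conditions the individual's agent asserts before any data flows."""
    agreement_id: str          # e.g. a standard agreement from a public roster
    max_retention_days: int    # longest the service may keep submitted data
    allow_subprocessors: bool  # may the service pass data to third parties?


@dataclass(frozen=True)
class ServiceOffer:
    """What a service discloses about its own data handling."""
    retention_days: int
    uses_subprocessors: bool


def terms_met(my_terms: Terms, offer: ServiceOffer) -> bool:
    """The agent proceeds only if the service's offer satisfies its terms."""
    if offer.retention_days > my_terms.max_retention_days:
        return False
    if offer.uses_subprocessors and not my_terms.allow_subprocessors:
        return False
    return True


my_terms = Terms("example-agreement-v1", max_retention_days=30,
                 allow_subprocessors=False)

# A service that keeps data for 90 days and fans it out to subprocessors
# fails the check, so the interaction never begins.
surveillance_service = ServiceOffer(retention_days=90, uses_subprocessors=True)
respectful_service = ServiceOffer(retention_days=7, uses_subprocessors=False)

print(terms_met(my_terms, surveillance_service))  # False
print(terms_met(my_terms, respectful_service))    # True
```

The design choice worth noticing is that the default outcome is refusal: the agent never sends data first and negotiates later, which is exactly the inversion of today's click-through consent.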
And as I wrote in Customer AI Agents are the New API, this flips the entire CX model. You stop being a callable function, a data object to be processed, and become a platform with your own terms. Companies that adapt to that reality will have an opportunity to thrive in a balanced world; there's far more room for value co-creation when both parties operate on agreeable terms. Companies that resist are running a countdown clock on a model that was always more fragile than it looked.
The Road Ahead
MyTerms itself is in its infancy. Most people have never heard of any of this, and the surveillance chain is live and scaling right now. But I believe we finally have a foundational element for rebalancing the relationship between individuals and the entities they interact with. MyTerms is an endorsed standard under active development, and the economic case for respecting customer agency becomes harder to deny as the costs of surveillance pile up.
The case for us isn’t that we’ve won. It’s that we finally have a foundation. That’s worth a lot.