Technically Correct, Emotionally Inert: The Hidden Cost of Frictionless Customer Experience
In the chase to optimize everything with AI, are we losing the inefficiencies that matter most?
Let’s talk about river systems, and what they reveal (stay with me…).
A river channeled into a straight, concrete-lined aqueduct moves water more efficiently than a meandering natural stream. The metrics are unambiguous, whether it’s volume per minute, loss to evaporation, or predictability of flow. The aqueduct wins on every engineering measure, yet the river loses something in its straightening. The perceived inefficiencies of a naturally flowing river are precisely where ecosystems happen: fish find spawning grounds, aquifers recharge, and branches, leaves, and other debris pile up. These snag piles, what a civil engineer would call obstructions, are where biodiversity concentrates. You can optimize the river right out of being an ecosystem.
I think this analogy is important when we frame usage of AI in customer experience. Are we hoping to build very good aqueducts, aimed at measures that miss bigger opportunities? Are we sure we know what we’re losing?
Efficiency as a First Principle
Modern industrialized society runs on the idea that efficiency is the master value, the lens through which all systems should be evaluated and improved. Over a century ago, Frederick Taylor turned it into a science. The twentieth century turned it into a religion. And the twenty-first century, so far, is busy handing it to a GPU cluster and telling it to go faster.
The results, in purely technical terms, are astonishing. We can now handle millions of customer service interactions with no human involvement, at a fraction of the cost, with accuracy rates that would have seemed like science fiction a decade ago.
But efficiency is a value, not a fact. It is always efficiency toward something, but the question of toward what is not a technical question. It is a philosophical one. When we build AI systems that are primarily optimized for customer service efficiency, we have already made a profound choice about what customer experience is for, and I believe many have made that choice almost entirely without examining the real impact for both customers and companies.
The Flow Conversation as a Human Act
There’s a concept in psychology called flow (read more from the excellent book of the same name), that describes a state of optimal engagement where challenge and capability are in rough balance, and consciousness temporarily forgets itself in the work. It’s most associated with creative and athletic performance, but it applies equally to customer experiences. [I owe a special shout out to my brilliant friend Dr. Michael Wu, who introduced me to the concept of flow in the context of CX.]
The best customer conversations I’ve observed over the years have this quality of flow. A customer comes in with a problem that is, on the surface, a billing dispute or a technical question. What eventually unfolds, when both parties are present and the pacing is right, is something more like collaborative meaning-making. The customer discovers what they actually need. The representative discovers insights that extend beyond the transactional engagement, and a relationship, in the deepest sense, is renewed.
This is not a sentimental description. It has material consequences for both sides. For the company, this engagement impacts retention, expansion, referral, and, often, forgiveness when things go wrong later in the relationship. For the customer, the added value improves the usefulness of the product or service itself. These are the most economically valuable outcomes in the customer lifecycle, and they emerge reliably from conversations that would be flagged as “too long” in any efficiency-optimized operations dashboard.
The flow conversation is the inefficiency that generates the value. And the current frenzy around AI and AI agents for customer experience is, largely, systematically engineering it out of existence.
What AI Is Actually Optimizing For
Let me put it in a more provocative way. AI in its current dominant deployment model is not aimed at optimizing for customer experience. It is optimizing for the appearance of customer experience at the lowest possible cost.
This isn’t a critique of the technology; in fact, the technology is extraordinary. It is a critique of the problem formulation, and of the choices, made upstream of any engineering decision, that define what success means.
When we define success as containment rate, handle time, and CSAT scores collected thirty seconds after a transaction closes, we have defined a problem that AI can absolutely solve. And it will. Beautifully, cheaply, at scale. The numbers will be good. The dashboard will be green.
What will not appear on the dashboard is the slow erosion of the thing those metrics were originally designed to protect: a customer’s sense that they are known by the companies they do business with. That they are, in some meaningful sense, in a relationship rather than a transaction sequence.
This matters beyond sentiment. An economy built on transactional efficiency, where every human interaction is a cost to be minimized, is a very different civilization than one built on relational depth. We are, right now, making architectural choices that will determine which of those we inhabit.
The Intention Economy, Revisited
Doc Searls wrote The Intention Economy more than a decade ago, and the book remains prophetic in ways that have not been fully absorbed by customer experience experts. His central argument is that the future of commerce isn’t companies getting better at predicting and capturing customer attention. The future is a combination of customers getting better at expressing and acting on their own intentions, with true agency, and companies building the infrastructure to receive and respond to those signals with trust.
What he described is a fundamentally inefficient model by the standards of current CX thinking. It requires listening. It requires friction, not the bad kind, but the kind that slows things down enough for intention to fully form and be expressed. It requires the company to relinquish some control over the interaction in order to actually hear what the customer is saying.
AI can be deployed to support that model, and there are active efforts to evolve CX toward this more natural, yet digital, state. One of the intentions of MyTerms is to enable the foundational, contractual ability for customer-side AI agents to navigate a richer engagement model than one built purely on efficiency. The dominant market pull right now is in the opposite direction, toward AI that replaces the messy, expensive, slow work of understanding with the cheap, fast, scalable work of processing. These are not the same thing. The gap between them is the gap between a river and an aqueduct.
What We Should Be Building
I don’t think the answer is less AI. I think it’s AI designed around a different first principle, not efficiency, but fidelity to the customer’s context.
That means AI deployments that treat clarification as a feature, and that treat extended conversation as a signal of complexity worth honoring, not a failure of containment. It also means capturing the full texture of a customer interaction, including the subtle subtext, and treating that as data worth having, not noise to be suppressed. It means building the right amount of friction back into systems that have been too thoroughly smoothed. Not because friction is good in itself, but because some things of value only emerge in slowing down.
I’d argue that the most important question in CX AI right now is not “how fast can we resolve this?” It is “what are we missing by resolving it so fast?”
The companies that learn to ask, and answer, that second question are going to build something that looks less like a better IVR and more like a genuine relationship at scale. That is the actual frontier. And it is, by design, not the most efficient path.
Which, I would argue, is exactly the point.