The Friction Files | 03
The first false agreement arrived in the requirements session.
Everyone in the room said AI.
Almost no one meant the same thing by it.
This was not a chatbot.
It was a decision-support layer meant to flag, rank, route, and recommend across a public-private chain of decisions.
A private operator heard speed.
The sponsor heard coherence.
Risk heard traceability.
Somewhere in the middle, a few people heard cover.
The AI lead heard something else:
a problem clean enough to model.
By the time the room started saying prioritization, consistency, and smarter recommendations, the conversation sounded aligned.
Then someone from one of the private operators asked:
“When you say recommendation, do you mean advice — or something the queue will actually reorder around?”
The AI lead said the system was configurable.
Risk said governance would define that later.
Operations said it would depend on the workflow.
The sponsor said the direction was clear.
No one answered the question.
Owen wrote another line in the margin.
Before AI gets built, it gets imagined.
The room had consensus on the promise.
The system was already being assigned different jobs.
In that room, each one sounded a little like rescue.