Recently, I observed a conversation among designers. Experienced people. Similar field, similar tools, similar background. They were discussing AI workflows, prototypes, what works in production and what doesn't.

And then, mid-conversation, someone wrote: "Quick clarification... I think we're talking past each other."

Not a big deal. Barely noticeable. Just a remark. But I couldn't stop thinking about it.

Because the real story in design is this:

Tools like Figma didn't just give us software. They created a language.

Not intentionally. Not as a product feature. But over the years, while an entire industry converged on a single tool, we developed a shared vocabulary. Component. Variant. Frame. Flow. Handoff.

When a designer and a developer discuss an interface, they're pointing at the same things. Not because they always agree. But because they know what they're arguing about.

That was never about efficiency. That was coordination.

SAP makes the same argument, just louder.

Anyone who has ever set up an intercompany project in SAP knows the pain. The logic is not intuitive. The learning curve is steep. The differences between versions, implementations, clients... that's genuine complexity.

But once you've understood it, everyone has the same guardrails. The same accounting structure. The same document logic. The auditor knows where to look. The accountant knows what she can't get wrong.

The tool didn't create the complexity. The complexity was already there: in legal requirements, corporate structures, compliance obligations. SAP made it visible, and in doing so, collectively understandable.

This isn't only a problem in internal processes. It shows up in everyday life too, and sometimes you only notice it the third or fourth time around.

At a conference recently: a job board built as a chatbot. No CV, no cover letter, no formal application. Just a free conversation, profile and fit emerging from the exchange. The first reaction: finally. No forms, no boilerplate, no ritual.

But then the matches don't come. And you start looking: for reasons, for structure, for anything that provides clarity. After four attempts, you start building your own framework inside the conversation: your own categories, your own language, your own hypotheses about fit. The conversation has structure again. But the lightness that was promised at the start is gone.

A CV and cover letter were never just bureaucracy. They were the shared language between candidate and employer. Both sides knew what the format meant, both knew what to look for. When that disappears, there is no structure left to analyze failure. And no map for how to do better next time.

Now comes AI. And the feeling that the barriers are finally falling.

Each department can build its own process. Finance can optimize its own solution. Every designer generates their own prototype with their own stack, their own dialect.

Faster. More flexible. Individual.

And for many tasks, that's true. Booking your own travel doesn't require a shared language; the result doesn't affect anyone else. Goodbye, terrible travel software.

But intercompany processes still need an auditor. Design decisions still need to be discussed and decided across the organization. Teams still need to coordinate, even when everyone has their own tool.

The barriers are falling. The problems aren't.

Organizations are asking right now: Which AI tool should we adopt? How do we train our people? How fast can we scale?

The rarer question is: What do our people actually talk about when they discuss their AI deliverables? What do they point to when explaining to a colleague what they asked the AI to build? What is the shared reference once the familiar structure or tool is gone?

That's not an abstract question. It's the question that determines whether AI leads to genuine coordination in an organization or to a group of individuals working past each other with remarkable efficiency.

The remark from that design conversation stayed with me not because of the situation itself. But because of what it suggests: that shared language is not a natural resource. It is a byproduct of shared pain, shared tools, shared failure. And that it disappears: quietly, without announcement, while everyone is busy getting faster.

What are you actually talking about when you discuss your next AI step?
