The following article originally appeared on Mike Amundsen’s Substack, Signs from Our Future Past. It is republished here with the author’s permission.
There’s an old hotel on a windy corner of Chicago where the front doors shine like brass mirrors. Every morning, before the guests even reach the steps, a tall man in a gray coat opens one of those doors with quiet precision. He greets them by name, points the way to the elevator, and somehow makes every traveler feel like a regular. To a cost consultant, he is a line item. To guests, he is part of the building’s atmosphere.
When management installed automatic doors a few years ago, the entrance became quieter and cheaper, but not better. Guests no longer lingered to chat, taxis stopped less often, and the lobby felt colder. Automation improved the hotel’s bottom line but not its character.
This story exemplifies what British advertising executive Rory Sutherland calls the “doorman fallacy”: the habit of confusing a job’s visible tasks with the entire role. In a short video explanation, Sutherland points out that a doorman does more than open doors. He represents safety, care, and ceremony. His presence changes how people feel about the place. Remove him, and you save money but lose meaning.
The lesson behind the metaphor
Sutherland expanded on the idea in his 2019 book Alchemy, arguing that logic alone can lead organizations astray. We often underestimate the value of the intangible parts of human labor because they don’t fit neatly into a spreadsheet. A doorman only seems redundant if you assume his job is merely mechanical. In fact, he performs a social and symbolic function: he welcomes guests, conveys prestige, and creates a feeling of security.
Of course, this lesson extends beyond hotels. In business after business, human presence is treated as inefficiency. The result is weaker experiences, shallower relationships, and systems that look efficient on paper but ring hollow in practice.
The doorman in the age of artificial intelligence
In a recent article in The Conversation, Gediminas Lipnikas of the University of South Australia argues that many companies are repeating the same mistake with artificial intelligence. He warns against the tendency to replace people because technology can imitate their simplest tasks while ignoring the judgment, empathy, and adaptability that define the job.
Lipnikas offers two examples.
Commonwealth Bank of Australia laid off 45 customer service agents after rolling out a voice bot, then reversed the decision when it realized the staff were not redundant. They were interpreters of context, not just telephone operators.
Taco Bell introduced AI-based voice ordering at its drive-throughs to speed up service, but customers complained about errors, confusion, and surreal exchanges with artificial voices. The company paused the rollout and acknowledged that human improvisation works best, especially during busy periods.
Both cases reveal the same pattern: automation succeeds technically but fails experientially. It’s the digital version of installing an automatic door and wondering why the lobby feels so empty.
Measuring the wrong thing
The doorman fallacy persists because organizations keep measuring only what is visible. Performance dashboards reward raw counts (calls answered, tickets closed, customer contacts avoided) because they’re easy to track. But they miss the essence of the job: problem solving, reassurance, and calm support.
When we optimize for visible productivity rather than invisible value, we teach everyone to pursue efficiency at the expense of meaning. A skilled agent doesn’t just resolve a complaint; they read the tone and calm the frustration. A nurse doesn’t just record vitals; they notice a pattern no sensor could pick up. A line cook doesn’t just fill orders; they keep the rhythm of the kitchen.
The answer is not to stop measuring; it is to measure better. Key results should focus on engagement, problem solving, and support, not just volume and speed. Otherwise, we risk automating away the parts of a business that make it valuable.
Efficiency versus empathy
Sutherland’s insight and Lipnikas’ warning meet at the same point: when efficiency ignores empathy, systems break down. Automation works well for narrow, rule-based tasks such as data entry, image processing, or predictive maintenance. But once judgment, empathy, and creative problem-solving come into the picture, humans remain indispensable.
What looks like inefficiency on paper is often flexibility in practice. A doorman pausing to chat with a regular guest may seem unproductive, but that moment reinforces loyalty and reputation in ways no metric can show.
Coaching, not replacement
That’s why my work focuses on using AI as a coach or mentor, not as a worker. A well-designed AI coach can prompt thinking, provide structure, and accelerate learning, but it still relies on human curiosity to drive the process. A machine can surface possibilities, but only a person can decide what matters.
When I design an AI coach, I think of it as a thought partner, closer to Douglas Engelbart’s idea of a human-computer partnership than to a replacement employee. The coach asks questions, provides scaffolding, and amplifies creativity. It does not replace the messy interpretive work that defines human intelligence.
A more human kind of intelligence
The deeper lesson of the doorman fallacy is that intelligence is not a property of isolated systems but rather a property of relationships. The value of a doorman emerges in the interaction between person and place, gesture and response. The same applies to artificial intelligence. Detached from human context, it becomes thin and mechanical. Guided by human purpose, it becomes powerful and humane.
Every generation of innovation faces this tension. The Industrial Revolution promised to free us from labor, but it often stripped us of craftsmanship. The digital revolution promised connection but often delivered distraction. Now the AI revolution promises efficiency, but unless we are careful, it may erode the very qualities that make work worth doing.
As we rush to install the next generation of technological “automatic doors,” let’s remember the person who once stood beside them. Not out of nostalgia, but because the future belongs to those who still know how to welcome others.
You can learn how Mike uses AI as an assistant by joining him on February 11 on the O’Reilly Learning Platform for his live course, AI-Based API Design. It will take you through incorporating AI-assisted automation into human-driven API design and leveraging AI tools like ChatGPT to improve the design, documentation, and testing of web APIs. It’s free for O’Reilly members; register here.
Not a member? Sign up for a free 10-day trial before the event to attend, and explore all the other resources on O’Reilly.







