In 1903, Samuel Langley made the first attempt to launch a powered aircraft. He had everything going for him: Smithsonian funding, the best team, the latest technology. His aircraft crashed into the Potomac twice. Nine days after the second crash, the Wright brothers flew. With a budget of less than a thousand dollars. In a bicycle shop.
The difference: Langley built to prove that flight was possible. The Wrights built to learn how flight worked. Langley wanted to be right. The Wrights wanted information.
That distinction determines whether an AI proof of concept delivers something or just costs money.
The pilot graveyard
Every organisation of any size has a few. AI pilots that were launched with enthusiasm, produced an impressive demo, and are now running in a corner somewhere. Three people use it. The steering group asks every quarter "how's the AI going?" and gets a vaguely positive answer.
Too alive to bury, too weak to scale. We call them Zombie Pilots.
It almost always starts the same way. Someone has a good idea. A pilot gets launched. The pilot proves the technology works. And then it stops. Because "the technology works" is the wrong proof.
The question a pilot must answer is: does this deliver value when the boss isn't watching? Do real people use this during their normal working day? Does it genuinely make their work better? Those are the questions that determine whether something scales, and those are precisely the questions most pilots never answer.
What a proof of concept should actually deliver
A good proof of concept delivers decision data. Information on which you make one of four choices: scale, adjust, change direction, or stop. All four are valid outcomes. Stopping after two weeks with clear reasons is more valuable than six months of drifting along with a pilot that is "still under review."
Three things that make the difference.
Real people, real work. Test with the sceptics, not the enthusiasts. Let them use the system during their normal working day, with their own data, their own processes. A demo environment produces demo results.
Measure in pairs. Speed alone tells you nothing. If a process goes twice as fast but quality drops, you have gained nothing. Always measure two things at once: if it gets faster, is it also better? If it gets cheaper, do people still trust the result? One-sided metrics lead to one-sided conclusions.
The litmus test. At the end of two weeks, threaten to switch the tool off. If people protest, you have found value. If nobody notices, you have your answer too. Both outcomes are valuable. The only bad outcome is not finding out.
What it looks like in practice
Two weeks. A defined process. A build team that gets something working quickly. And real users getting their hands on it from day three.
At the end of those two weeks, there is a working solution. Not a report, not a presentation, something that runs. Plus the data to decide: do we continue, do we adjust, or do we stop? Only once that evidence exists is scaling justified.
The investment for such an engagement is around €15,000. That sounds like a lot for two weeks. But compare it with the alternative: six months of piloting that yields nothing, followed by the conclusion that you need to start again. The most expensive choice in AI is making no choice at all.
And there is a second layer. A good proof of concept tests the technology and the organisation at the same time. How does the team respond? Where is the resistance? Which assumptions about the process held, and which did not? Those insights are often more valuable than the prototype itself. And remember that even a successful system needs maintenance: without monitoring, AI drift sets in, a silent degradation that nobody notices until it is too late.
Langley built his aircraft out of prestige. The Wrights built out of curiosity. They crashed too, dozens of times. But every crash delivered information. And with that information they built the next model.
An AI proof of concept works the same way. You build to learn. The crash is part of the plan. As long as you take something away from it.