Everyone's talking about how AI writes code faster. How it builds MVPs in a weekend. How a single developer with the right tools can replace a whole team.
They're missing the point.
The hardest part of building software was never typing the code. It was understanding the problem. And in a world where AI-generated projects pop up like churros at a street fair, that understanding is the only thing that separates a product that matters from another repo collecting dust.
Let me show you what I mean.
The Phone Call That Started Everything
A recycling conglomerate in Germany called me. They process thousands of tons of non-ferrous metals — copper, aluminum, brass — buying scrap from suppliers and delivering processed materials to industrial clients.
On paper, their business is simple: buy metal, weigh it, process it, sell it.
In reality, it's chaos.
Listening Before Building
My first instinct wasn't to open a code editor. It was to listen. For weeks, I talked to everyone involved in the operation:
The accountant was drowning in credit notes. Suppliers sent PDFs by email — sometimes handwritten, sometimes machine-generated, always in German, always different formats. She spent hours manually entering figures into their existing ERP system. One decimal error could mean thousands of euros miscalculated.
The transport coordinator managed pickups via WhatsApp. Dozens of messages per day. Which truck goes where. What was picked up. How much it weighs. All living in chat threads that disappeared into the scroll of time. No traceability. No record. No way to verify what was actually loaded versus what was invoiced.
The logistics lead was the human bridge between everyone else. He kept the operation running through sheer force of memory and phone calls. When he was out sick, the operation slowed to a crawl. That's not a process — that's a single point of failure wearing a hard hat.
The Pain Behind the Pain
Here's what a less experienced developer would have heard: "We need a better ERP." And they would have built one. A shiny new system with dashboards and forms and all the modern trappings.
But that's the surface. The real pain was deeper.
The metal disappears. Between the moment a supplier loads scrap onto a truck and the moment it's weighed at the processing facility, material gets lost. Not stolen — just untracked. A truck picks up 2,300 kg of copper. The facility weighs 2,180 kg. Where did the 120 kg go? Moisture loss? Measurement error? The wrong material category? Nobody knew because nobody measured at every step.
Prices change by the hour. Copper was €8,200 per ton on Monday morning and €8,050 by Tuesday afternoon. When you're processing hundreds of tons per month, the difference between buying at Monday's price and selling at Tuesday's price is the margin. The existing system had no way to lock prices at the moment of transaction.
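The fix for this gap is conceptually simple: snapshot the market price at the moment each transaction is recorded, so later market moves cannot silently change what a deal is worth. A minimal sketch of the idea in Python — the price feed, names, and figures here are illustrative, not the project's actual code:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical in-memory price feed; a real system would query a
# market data source. All names and figures are illustrative.
MARKET_PRICES_EUR_PER_TON = {"copper": 8200.0, "aluminum": 2450.0}

@dataclass(frozen=True)
class PricedTransaction:
    """A transaction that freezes the price at the moment it is created."""
    material: str
    tons: float
    locked_price: float  # EUR/ton, snapshotted at creation time
    locked_at: datetime

def record_transaction(material: str, tons: float) -> PricedTransaction:
    # Snapshot the current market price so Tuesday's quote can never
    # retroactively reprice Monday's purchase.
    return PricedTransaction(
        material=material,
        tons=tons,
        locked_price=MARKET_PRICES_EUR_PER_TON[material],
        locked_at=datetime.now(timezone.utc),
    )

buy = record_transaction("copper", 10.0)   # bought Monday at 8,200
MARKET_PRICES_EUR_PER_TON["copper"] = 8050.0  # market moves by Tuesday
print(buy.locked_price * buy.tons)  # 82000.0 -- still the locked price
```

The `frozen=True` dataclass makes the lock literal: once recorded, the price on a transaction cannot be mutated, which is exactly the property the old system lacked.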
Knowledge lives in people, not systems. When I asked "How do you know the quality grade of incoming material?", the answer was always "he knows" or "the driver calls the coordinator." No classification system. No standardized criteria. Just experienced eyes and phone calls.
Seven Weights That Tell a Story
Through dozens of conversations, site visits, and process mapping sessions, I identified something that no one in the company had formalized: the material passes through seven distinct measurement points.
- W1: Supplier's declared weight (what they say they're sending)
- W2: Weight at pickup (what the truck actually loads)
- W3: Weight at facility gate (what arrives)
- W4: Weight after initial processing (what's left after sorting)
- W5: Weight after quality classification (what's actually usable)
- W6: Weight at delivery loading (what ships to the client)
- W7: Client's confirmed weight (what they accept and pay for)
Each delta between these weights tells a story. W1 vs W2 reveals supplier accuracy. W2 vs W3 quantifies transit loss, the missing 120 kg from the copper truck. W3 vs W4 shows processing loss. W6 vs W7 exposes delivery discrepancies. Together, they form a complete traceability chain that answers every "where did the metal go?" question.
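Once the seven weights exist as data, the chain is just arithmetic. A hedged sketch of the core idea — the field names and the 1% alert threshold are my illustration, not the system's real configuration:

```python
# Sketch: compute the deltas along the seven-weight chain and flag
# suspicious losses. Names and the 1% threshold are illustrative.
WEIGHT_POINTS = ["W1", "W2", "W3", "W4", "W5", "W6", "W7"]

def weight_deltas(weights: dict[str, float]) -> dict[str, float]:
    """Kilograms lost (positive) or gained between consecutive points."""
    return {
        f"{a}->{b}": weights[a] - weights[b]
        for a, b in zip(WEIGHT_POINTS, WEIGHT_POINTS[1:])
    }

def flag_losses(weights: dict[str, float], threshold_pct: float = 1.0) -> list[str]:
    """Flag any step whose loss exceeds threshold_pct of the upstream weight."""
    flags = []
    for step, delta in weight_deltas(weights).items():
        upstream = weights[step.split("->")[0]]
        if upstream > 0 and delta / upstream * 100 > threshold_pct:
            flags.append(f"{step}: lost {delta:.0f} kg ({delta / upstream:.1%})")
    return flags

# The copper truck from earlier: 2,300 kg at pickup, 2,180 kg at the gate.
shipment = {"W1": 2300, "W2": 2300, "W3": 2180, "W4": 2150,
            "W5": 2150, "W6": 2140, "W7": 2140}
print(flag_losses(shipment))
# ['W2->W3: lost 120 kg (5.2%)', 'W3->W4: lost 30 kg (1.4%)']
```

The output is the answer to "where did the 120 kg go?": it didn't vanish somewhere between supplier and client, it left at a specific, named step.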
No one at the company had ever mapped this out. They knew, intuitively, that material was lost. They could feel it in their margins. But they couldn't see it, because the data didn't exist.
That's what requirements engineering looks like in 2026. Not filling out templates. Not writing user stories in a backlog. It's sitting in a dusty facility watching trucks unload and asking "why?" until the real problem surfaces.
What We Actually Built (And Why It Matters Less Than You Think)
Yes, we built the system. Odoo 18 as the backbone. Custom modules for the seven-weight traceability chain. AI-powered PDF processing for credit notes. Mobile apps for drivers and warehouse workers. Dashboards that show exactly where every kilogram of metal is at any moment.
But here's the thing: any competent developer with AI tools could have built those features. Claude Code can generate Odoo modules. Gemini can scaffold a mobile app. GPT can design a database schema.
What AI cannot do is sit across from the accountant and notice that she hesitates before entering certain credit notes — because some suppliers use handwritten corrections that she can't always read but is too embarrassed to call and verify. That hesitation costs the company money. That insight shaped the AI processing pipeline we built.
What AI cannot do is ride along on a Tuesday morning pickup and realize that the driver doesn't actually weigh the material at the supplier's location — he estimates — because the portable scale is broken and nobody has requested a replacement since it "wasn't urgent." That broken scale is a data gap. That data gap is lost revenue.
The Human Advantage
We're entering an era where building software is increasingly commoditized. AI writes code, generates tests, creates documentation, and deploys to production. The barrier to creating something has never been lower.
But the barrier to creating something meaningful hasn't changed at all.
The companies that win won't be the ones with the best AI tools. They'll be the ones with people who can:
- Extract the real problem from the noise of what people say they want
- Map invisible processes that no one has ever documented because "that's just how we do things"
- Quantify intuitive knowledge — turning "we lose some material" into "we lose 5.2% between W2 and W3, primarily in copper alloys during summer months"
- Design for the human who will use the system at 6 AM with gloves on, standing next to a truck, in the rain
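The third item on that list, quantifying intuitive knowledge, stops being hard once the weight chain is recorded: it becomes a plain aggregation. A sketch, with hypothetical record fields and made-up figures:

```python
from collections import defaultdict

# Sketch: turn "we lose some material" into a loss rate per material
# and month. Record shape and all figures are illustrative.
shipments = [
    {"material": "copper", "month": "2025-07", "w2": 2300.0, "w3": 2180.0},
    {"material": "copper", "month": "2025-07", "w2": 1900.0, "w3": 1810.0},
    {"material": "brass",  "month": "2025-07", "w2": 1500.0, "w3": 1495.0},
]

def loss_rate_by_group(records):
    """Percent lost between W2 (pickup) and W3 (facility gate),
    grouped by (material, month)."""
    totals = defaultdict(lambda: [0.0, 0.0])  # key -> [sum_w2, sum_w3]
    for r in records:
        key = (r["material"], r["month"])
        totals[key][0] += r["w2"]
        totals[key][1] += r["w3"]
    return {key: round((w2 - w3) / w2 * 100, 1)
            for key, (w2, w3) in totals.items()}

print(loss_rate_by_group(shipments))
# {('copper', '2025-07'): 5.0, ('brass', '2025-07'): 0.3}
```

A handful of lines, but only *after* someone has done the unglamorous work of making W2 and W3 exist as measured data rather than phone calls.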
This is requirements engineering. Not the academic kind with use case diagrams and formal specifications. The field kind. The kind where you get dust on your shoes and grease on your notebook.
The Uncomfortable Truth
Here's what I want every developer reading this to internalize:
Your code is not your competitive advantage anymore. AI can match your output. In many cases, it can exceed it.
Your advantage is your ability to understand a domain so deeply that you can design the right system — not just a working one. To ask questions that reveal hidden costs. To notice a Post-it note on a monitor and understand that it represents a €180,000 annual risk.
The requirements document for this project wasn't a Google Doc with bullet points. It was weeks of observation, conversation, and analysis. It was understanding that a recycling company's real problem isn't "we need software" — it's "we can't see where our money goes between the supplier's truck and the client's warehouse."
That insight? No AI generated it. A human did. By showing up, asking questions, and refusing to accept the first answer.
This is the first article in a series about building traceability systems for industrial operations. Next: how we turned these insights into architecture decisions — and why the technology choices were the easiest part.
— Omar