
Go to Gemba: Why AI Should Learn from the Floor

March 30, 2026 · Mark Freedman & William VanBuskirk

In lean manufacturing, "gemba" means "the real place" — where the work happens. When something goes wrong, you don't read a report. You go to the floor. You ask the machinist, not the manual.

It doesn't matter if you have a PhD. The person who runs the machine probably knows more about it than anyone.

The gap between the floor and the tools

Most enterprise software is built from the top down. Someone in an office designs a workflow, an analyst builds a dashboard, and the people doing the actual work get a system that sort of fits but mostly doesn't.

AI is following the same pattern. The tools are built for the people who buy software, not the people who use it. A VP gets a dashboard with AI insights. A sales manager gets automated outbound. But the quality engineer running root cause analysis on third shift? The supply chain planner juggling five ERP systems? They get a generic chatbot that doesn't know their process.

What "go to gemba" means for AI

We built myai on the gemba principle. Instead of building AI that replaces expertise, we build AI that learns from it.

That means the quality engineer who has spent 20 years developing intuition about failure modes isn't replaced by a model trained on generic data. Their judgment is captured, structured, and made available — to the next shift, the next hire, the next plant.

It means the supply chain planner who holds the real picture of what's happening (the one that doesn't match any single system of record) can externalize that understanding. Their mental model becomes something the organization can use even when they're not in the room.

Why this matters now

There's a wave of AI tools that are very good at standardized problems. Outbound emails. Meeting summaries. Code generation. These are important, but they're also the problems where human expertise matters least — because the pattern is already well-defined.

The valuable problems are the ones where expertise matters most. Where the difference between a good decision and a bad one is 20 years of pattern recognition. Where the "right" answer depends on context that no single system captures.

Those are the problems we care about. And solving them starts the same way it always has: go to where the work happens, talk to the person closest to the problem, and build from there.

The practical implication

When we work with a company, we don't start with their data. We start with their people. Who knows the most? What do they know that isn't written down anywhere? What decisions are they making that nobody else can make?

Then we build AI that mirrors that expertise — so it scales without the expert being in every room, on every call, in every decision.

That's what "go to gemba" means in the age of AI. The principle hasn't changed. The tools have.