Around 2016-18, I was spending a lot of time on planes - regular trips between the UK and Canada, plus a weekend trip to Perth. I was working with tech teams, and people kept talking about agile - sprints, standups, user stories, velocity.
The jargon was everywhere.
Worse, I'd see people use "we're doing agile" as an excuse to avoid work they didn't want to do or skip proper planning entirely.
I needed to understand what people were actually talking about. Not the ceremonies, not the jargon - the underlying concepts.
I found this video on a flight and it changed how I thought about both physical and software projects:
Watch: What is Agile? Agile Explained... with a Train Set?!?
Most agile explanations drown you in terminology. This one uses toy trains to show the fundamental difference between planning everything upfront (waterfall) and building in small iterations (agile).
No jargon. Just concepts.
After watching it, I downloaded most of the videos from the Development That Pays channel and watched them on subsequent flights. That channel became one of my go-to references for explaining software development concepts without the usual buzzword overload.
Understanding the concepts was one thing. Applying them was something else entirely.
A few years later, we were implementing a new courier dispatch system to replace our existing setup. We needed to integrate multiple couriers, handle label printing, manage box packing workflows - the whole operation.
The instinct was to scope everything properly. Map out all the couriers we'd eventually need. Design the complete box packing workflow. Get it all right before shipping anything.
Instead, we focused on getting one courier working - our most popular one, and the easiest to integrate. We built just enough functionality to get labels printing and parcels shipping.
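To make "just enough" concrete, here's a rough sketch of the shape it took - not the real system, and every name in it (Parcel, Courier, PrimaryCourier) is invented for illustration: a single narrow interface, one courier behind it, and nothing that doesn't directly get a label printed.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Parcel:
    order_ref: str
    weight_kg: float
    postcode: str


class Courier(Protocol):
    """Just enough surface area to get a label printed and a parcel shipped."""

    def create_label(self, parcel: Parcel) -> bytes: ...


class PrimaryCourier:
    """A stand-in for the one courier we integrated first."""

    def create_label(self, parcel: Parcel) -> bytes:
        # The real version would call the courier's API and return printable
        # label data; faked here so the sketch runs on its own.
        return f"LABEL {parcel.order_ref} -> {parcel.postcode}".encode()


def dispatch(courier: Courier, parcel: Parcel) -> None:
    label = courier.create_label(parcel)
    print(label.decode())  # in production this would go to the label printer


dispatch(PrimaryCourier(), Parcel("ORD-1042", 1.2, "BS1 4DJ"))
```

The point of the shape is that a second or third courier is just another class behind the same small interface - nothing forces you to design for all of them before shipping the first one.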
Then we took it to the shop floor.
This is where theory met reality. We'd tested it as thoroughly as we could in development. But putting it in front of the actual dispatch team, watching them use it in their real workflow, taught us things we'd never have learned from requirements documents.
Label reprint was significantly more important than we'd realised. We'd been planning to prioritise various box packing features. The team told us they needed reliable label reprint more than any of that.
We also got UI feedback we wouldn't have anticipated. Small things about button placement, how information was displayed, which actions needed to be faster.
We learned all this in weeks. If we'd tried to "get it right" before shipping, we'd have spent months building features nobody needed while missing the ones that mattered.
The same pattern showed up when I was working in prepress. We were looking at automation software for preprogramming guillotines, and the software required a unique filename for each job.
We used timestamps. It seemed logical - guaranteed uniqueness, built into every system.
But for operators, typing timestamps was cumbersome. They'd mistype them, lose time, get frustrated.
We had months of discussions trying to scope this correctly. What's the right naming convention? How do we enforce it? What happens when people get it wrong? Should we train people differently? Should we change the software requirements?
Eventually, someone suggested putting a barcode in the trim waste area of each job. Scan it, get the filename, done.
We tried it. It worked. The whole thing took a couple of weeks to implement once we stopped discussing and started experimenting.
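The mechanics are simple enough to sketch. This isn't the actual implementation - the filename, output path, and library choice (the python-barcode package) are all assumptions - but it shows the shape of the fix: encode the job's timestamp filename as a barcode, drop it in the trim waste, and let the scanner hand the filename to the guillotine software.

```python
# A minimal sketch, assuming the python-barcode package
# (pip install python-barcode); the filenames here are invented.
import barcode

# The timestamp-style filename the guillotine software expects.
job_filename = "20190612-143207"

# Encode it as Code 128. The default writer emits SVG, which prepress can
# place in the trim waste area of the imposition.
code = barcode.get("code128", job_filename)
code.save("trim-waste-barcode")  # writes trim-waste-barcode.svg

# On the shop floor, scanning the barcode hands the exact filename to the
# guillotine software - no manual timestamp typing, no typos.
```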
The lesson wasn't that planning is bad. It's that sometimes you need to try something to learn what actually works. Months of trying to "scope it correctly" couldn't compete with weeks of real-world feedback.
Fast forward to today. I'm working on an enterprise system implementation - warehouse apps, tracking tools, business process automation, the works.
And I still catch myself defaulting to waterfall thinking.
"We need complete requirements before we build this."
"Let's map out all the edge cases first."
"We should wait until we fully understand the process."
Sometimes that's correct. Sometimes you genuinely need to plan properly because the dependencies are real and the risks are high.
But sometimes I'm just uncomfortable with uncertainty. I'm using "proper planning" as an excuse to avoid showing incomplete work and getting feedback on it.
The real difference between agile and waterfall thinking isn't the ceremonies or the methodology. It's recognising when you'll learn more by building something rough and testing it, versus when you need to plan thoroughly because the cost of getting it wrong is too high.
I'm still learning to tell the difference. A decade into this.
I understand the concepts. I've seen them work - the dispatch system, the prepress barcodes, dozens of other examples.
But applying them consistently? That's harder.
I still over-plan things that would benefit from quick experiments. I still push for complete requirements when a rough prototype would teach us more. I still feel uncomfortable shipping something "incomplete" even when that incompleteness is the whole point.
The agile concepts from that train set video are simple. The execution is not.
I'm getting better at catching myself. Asking "what am I actually trying to learn here?" instead of "how do I make this perfect before anyone sees it?"
Sometimes I get it right. Sometimes I spend three months in planning discussions when two weeks of experimentation would have solved it.
That's the learning journey. Still ongoing.
Watch the video (it's 10 minutes): Agile Explained... with a Train Set?!? - YouTube
Want to go deeper? Check out the whole Development That Pays channel: https://www.youtube.com/@DevelopmentThatPays