After reading Corey Ladas's Scrumban a couple of times, I'm finally convinced that iterations are no longer a necessity. As he brilliantly points out in his essay, we started with 100 features in 1 iteration (the waterfall model), then got to 10 features per iteration (Agile). We are now starting to realize that we can go to 1 feature per iteration (Scrumban).
As we converge on this new mindset of modeling our software development system, we can adopt Lean practices: seeing flow and managing our pipeline. By 1 feature per iteration, I'm not saying that the whole team should focus on only 1 feature while all the others wait outside the pipeline. It means we can deliver each time a feature is done, i.e. comes out of the pipeline. We don't need to say we're going to wait 6 months for X features to be done before shipping a new version. While 1 feature per iteration might be mind-boggling for some, Corey really shows how to set up your pipeline to see flow and bottlenecks.
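To make "seeing flow and bottlenecks" concrete, here's a minimal sketch of that idea in Python. The column names, WIP limits, and feature IDs are all hypothetical, not from Corey's book: a column holding more cards than its work-in-progress limit is where flow is stalling.

```python
# Hypothetical WIP limits per pipeline column (assumed values for illustration).
WIP_LIMITS = {"Analysis": 3, "Code": 4, "Test": 3, "Deploy": 2}

def find_bottlenecks(board):
    """Return the columns whose card count exceeds their WIP limit."""
    return [col for col, cards in board.items()
            if len(cards) > WIP_LIMITS.get(col, float("inf"))]

# A snapshot of a hypothetical board: cards are piling up in "Code".
board = {
    "Analysis": ["F7", "F8"],
    "Code": ["F1", "F2", "F3", "F4", "F5"],
    "Test": ["F6"],
    "Deploy": [],
}

print(find_bottlenecks(board))  # → ['Code']
```

On a physical board the same signal is just a crowded column of cards; the point is that the bottleneck is visible at a glance, without waiting for an end-of-iteration report.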
Indeed, Scrumban can show you when things start to slow down because of churn. Another key point about Scrumban is how it ties in with another Lean idea. In his books, Lean thinker John Seddon shows how to find a good measurement for your system, and he gives 3 rules for determining one. Rule #1 says that the metric should be derived from the purpose of the system. What is the purpose of a software development system? My answer would be: to deliver features of good quality as fast as possible to its customers. So the metric for our system is speed, or time. What's the link with Scrumban? Corey Ladas says that velocity (user story points) is not the real metric of an iteration; it's a consequence of your system. If your system slows down (features not moving in the pipeline), your velocity goes down. The real metric of your pipeline is how fast you can transform a feature from an idea into an implementation. Hence, time is the real metric of your system.
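That "idea to implementation" time is usually called lead time, and it's trivial to compute once you note when each card enters and leaves the pipeline. A small sketch, with made-up feature names and dates:

```python
from datetime import date

# Hypothetical log: feature -> (date it entered the pipeline, date it shipped).
features = {
    "login":  (date(2009, 3, 2),  date(2009, 3, 9)),
    "search": (date(2009, 3, 4),  date(2009, 3, 20)),
    "export": (date(2009, 3, 10), date(2009, 3, 13)),
}

def lead_time_days(started, finished):
    """Calendar days from entering the pipeline to shipping."""
    return (finished - started).days

times = {name: lead_time_days(s, f) for name, (s, f) in features.items()}
average = sum(times.values()) / len(times)

print(times)    # lead time per feature, in days
print(average)  # the metric derived from the system's purpose: time
```

Tracking this number per feature, rather than story points per iteration, is what makes the measurement line up with the purpose of the system.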
I applied Scrumban on a fixed-price contract. I was curious to see if it would give us the same metric (number of days per feature) as the estimates we put in when we bid for the project. It didn't work well, because not everybody on the team, especially the Project Manager, understood its purpose. An analyst saw it as a great tool to keep track of his features, but the PM was so overwhelmed with other issues (enough to fill another post) that he lost track of the Scrumban board. Still, the board made the churn in our process obvious: the long line of yellow cards sat in the "Code" pile, and that pile didn't move for 2 weeks. It was right there, in front of our faces, and we never noticed it...