Tuesday, April 7, 2009

Is Agile "Fragile"?

While I'm not intending to be unduly controversial (well, maybe a little), I have noticed more and more commentary recently expressing various concerns about a current "hot topic" - Agile methods. One example is a recent article by James Shore, "The Decline and Fall of Agile".

In that article he remarks "It's odd to talk about the decline and fall of the agile movement, especially now that it's so popular, but I actually think the agile movement has been in decline for several years now. I've seen a shift in my business over the last few years. In the beginning, people would call me to help them introduce Agile, and I would sell them a complete package that included agile planning, cross-functional teams, and agile engineering practices. Now many people who call me already have Agile in place (they say), but they're struggling. They're having trouble meeting their iteration commitments, they're experiencing a lot of technical debt, and testing takes too long."

Personally I don't doubt there are many potential benefits of Agile methods, provided they are actually used as intended and are appropriate to the context in which they are applied. Sadly, like many other good ideas, Agile is often more "talk the talk" than "walk the walk". Some of Agile's more rabid advocates seem to think it's a "universal solvent", a notion even alchemists and sorcerers gave up on long ago - nothing turns lead into gold.

On the other hand, I do have some fundamental concerns about the evident lack of hard facts and data - there seems to be a lot of heat, but not much light. Are Agile methods actually more productive, in aggregate across a series of iterations, than alternative methods? As Shore points out, "technical debt" can easily become a major problem. Short iterations necessarily carry some risk to architectural soundness. Of course Agilists advocate "refactoring" to remedy that risk, but how often is refactoring actually done? What does it actually cost? After 10 or so iterations, is Agile really more productive in total than the alternatives?

And what about test-driven design/development? What does it cost compared to Fagan-style inspections? What are the actual defect containment rates? Capers Jones' data and other sources clearly show that Fagan inspections find 60-80% of the defects present, while testing finds 30-50% (per test type). The facts we do have call into question some of the claims made for TDD.
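To make the arithmetic concrete, here is a rough back-of-the-envelope sketch of how containment rates compound across successive removal steps. The figures are purely illustrative - mid-range values from the percentages cited above, not measured project data:

    # Hypothetical illustration only - not measured data.
    def remaining_defects(initial, removal_efficiencies):
        """Defects left after applying each removal step in sequence."""
        defects = initial
        for eff in removal_efficiencies:
            defects *= (1.0 - eff)  # each step removes a fraction of what's left
        return defects

    # 100 seeded defects: a Fagan inspection at ~70%, then one test pass at ~40%
    print(remaining_defects(100, [0.70, 0.40]))  # -> 18.0 remaining
    # versus two test passes at ~40% each, with no inspection
    print(remaining_defects(100, [0.40, 0.40]))  # -> 36.0 remaining

Even with made-up numbers like these, the gap is large enough that it deserves to be settled with real data rather than advocacy.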

In fairness, my comments about the lack of facts and data are by no means restricted to Agile - they apply to a great many fads du jour. Let's hope one day soon we'll begin to do rigorous, data-based assessments. Actually, a few of us are working behind the scenes to bring that to fruition - more about that as it develops!

If you have any facts and data (vs. anecdotes) that may shed light on this topic, please share!


