
Agile Velocity: The Numbers All Go To 11

Jim Highsmith recently posited that "velocity is killing agility!" which is kind of a fun hypothesis.  Jim observes that company leaders he talks with around the world these days are a little too quick to measure the effectiveness of their agile software development teams by keeping track of the teams' velocity (the average amount of estimated software effort the team delivers per software delivery iteration).
[Image: Nigel Tufnel, "the numbers all go to eleven," via http://startofanadultlife.tumblr.com/post/6847194092/nigel-tufnel-the-numbers-all-go-to-eleven-look]
This is quite ironic, of course, since one of the rudimentary things you learn when you first study agile software development is that "velocity" is measured in abstract terms like "points" or "ideal hours."  The numbers are relative to each other; mapping points to time is a fluid process, and valid only within a single team.  The idea of tracking velocity as an absolute figure is so absurd, in fact, that there is an entire web site devoted to the concept of estimation using fruit (cherry for very small amounts of effort; watermelon for large amounts).  If each team chooses a different theme (fruit, mammals, buildings in Chicago, planets), one can see even more clearly that trying to compare one team to another is a recipe for disaster.
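To make the relativity concrete, here is a minimal sketch (the team names and numbers are invented for illustration): velocity is just a team-local average of relative estimates, so the same figure means different things for different teams.

```python
# Hypothetical sketch: "velocity" is a team-local average of relative
# estimates, so cross-team comparisons are meaningless by construction.

def velocity(points_per_iteration):
    """Average points delivered per iteration (a relative, team-local figure)."""
    return sum(points_per_iteration) / len(points_per_iteration)

# Two imaginary teams estimating comparable work on different scales:
team_a = [5, 6, 4, 5]      # small, "cherry"-sized units
team_b = [50, 60, 40, 50]  # same work, a scale ten times coarser

print(velocity(team_a))  # 5.0
print(velocity(team_b))  # 50.0 -- not ten times "faster," just a different unit
```

Team B's 50 points per iteration is not ten times Team A's productivity; the unit itself differs, which is exactly why the executives' comparison falls apart.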

But of course these executives aren't always being quite so blunder-headed with their metrics as to compare one team to another.  Instead, as Jim describes it, they:
  • Try to get teams to deliver faster and faster--if you delivered 5 points in this iteration, try for 6 next time.  Or if your average team velocity was 5 in this project, keep the team together and try to get up to an average of 8 in the next.
  • Evaluate effort solely in terms of the software's value at first release to the market--if you measure your effectiveness by "how quickly you get to feature-complete," you quickly lose track of important things like "how quickly can you change" with the market, "how much does it cost" to maintain the thing, and even "how delighted are customers to use the application."
  • Lose sight altogether of the actual business bottom line.  In real life, software developers are being paid to deliver value to their home businesses, whether measured in increased revenue, decreased cost, increased customer satisfaction, decreased risk of government regulation non-compliance, increased greenness, or anything else in the host organization's balanced scorecard.
These leaders are falling into the classic "it goes to 11" trap made famous by Christopher Guest's character, Nigel Tufnel, in the immortal movie, Spın̈al Tap.  Tap aficionados will remember that Nigel, lead guitarist for the band, is very proud of his amplifier, specially produced with control knobs that go to "11," not just "10."  Nigel doesn't even understand the question "but why wouldn't you just get a louder amp?"--so pleased is he that his goes to 11.

But what is a leader to do, if she wants to measure the productivity of her IT staff?  You need to figure out who to promote and who to put in the basement without a stapler, after all.  I would recommend the following, taking Jim's points in reverse order:
  • Measure the value generated by software investments.  This is not new--it's Jim's point in his velocity blog post, and in it, he also cites Martin Fowler's 2003 bliki post, "CannotMeasureProductivity," on this exact point.  At the end of the day, a business is in business to create value, not to ensure its developers are working at an abstract maximum capacity.  If Team A's software generated $10 million in value over two years, and Team B's software generated $1 million, and both efforts cost the same, then it would be worthwhile asking some questions about why one effort was so much more valuable to the business than the other.  You may get a range of interesting answers, some having to do with how well the team is working, and some having to do with the overall value of the concept it was delivering.
  • Evaluate your return on investment over the life of the product, not just one quarter.  In IT just as in other investments, leaders often think too much in the immediate term.  Certainly, it's a good idea to rush to market with software when you can get a first-mover advantage.  Even in this case, however, your IT investment should be made with an eye to the long term.  What will you do after you have jumped into the market with the first offering of this kind?  Have you positioned yourself to maintain your lead, because you have what Jim calls a "supple delivery engine"?  Or have you worn yourself out, and put an idea on the market which can easily be copied by others?  Will a competitor with a robust software delivery methodology quickly leapfrog you and pull ahead?  Unless you're on the brink of insolvency, you need to look at the expected return on investment for the life of the product in the marketplace, not just your profit and loss over this budget year.
  • Balance each software-development-specific metric with a complementary one.  You may have good reasons to measure the amount of software a team is developing.  Perhaps you have concerns about specific individuals whom you feel aren't carrying their weight.  Never say never, I say.  But if you are going to measure velocity, then make sure you measure other things as well, including quality (fit of the software to its use, and actual failures of the software to work as intended) and complexity (is the software impossible to change, because it's so poorly written?).  These three metrics balance each other out to some degree, and teams should hit minimum standards in all three dimensions, not just one.  If you ask only for speed, that's all you're going to get, and you won't like it.
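The "minimum standards in all three dimensions" idea can be sketched as a simple threshold check.  Everything here is illustrative: the metric names, the floors, and the sample team are my own invented stand-ins, not anything prescribed by Jim's post.

```python
# Hypothetical sketch of balancing velocity against complementary metrics:
# a team must clear a minimum bar on ALL dimensions, not just speed.
# The metric names and thresholds below are invented for illustration.

MINIMUMS = {
    "velocity": 4.0,         # avg points/iteration, on the team's own scale
    "quality": 0.95,         # fraction of stories accepted without defects
    "maintainability": 60,   # some maintainability-index-style score
}

def below_minimums(metrics):
    """Return the dimensions where the team falls below its minimum standard."""
    return [name for name, floor in MINIMUMS.items()
            if metrics.get(name, 0) < floor]

# A fast team that is quietly shipping defects:
team = {"velocity": 6.5, "quality": 0.91, "maintainability": 72}
print(below_minimums(team))  # ['quality']
```

The point of the sketch is that a team can look great on velocity alone while failing a complementary dimension; only the combined check surfaces the trade-off.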
Jim makes the proposal that agilists have been too quick to give all of the decision-making power to the Product Owner from the business side, and he suggests remedying this problem by instead creating a team to make decisions with one person from the business and one from IT.  I'm not sure I agree with this solution, since it's extremely powerful to have one person (the person who will live with the results of the decision) calling the shots.  However, I do think that if business people begin to embrace the notion that software quality has actual P&L ramifications for the business, they will naturally want to consult the tech lead about what will create the best business results.
Please read Jim's post--he suggests other really good things, like using value points to measure software value as delivered, rather than focusing on the effort it took to get there.  As always, there is a lot there.
