
Quantifying Manual Test Technical Debt

Forced by circumstances (and an especially pragmatic client), I've recently been asking peers and "the blogosphere" the apparently naive question, "is it important to do automated testing and clear up what Mike Cohn calls 'manual test technical debt?'"

Theoretically, if you have no Big Upfront Design and you also have no automated test suite, you're pretty much just building a big unplanned mess and you'll never be able to change anything for fear of breaking something else.  I understand that, and so does my pragmatic client.  But can this issue be quantified?  Aside from feeling somewhat inadequate when reading agile theory, I mean.

The reason it's important to ask this question is that setting up and maintaining automated testing is expensive.  Unlike many agile practices, which can be kicked off with some sharpies and a clean wall, automated testing requires a large-scale capital expenditure to bring one or more testing packages into an organization, followed by significant strategy development, staging, and training to put those new packages into use and to keep them running.

You need a compelling financial case to show the likely return on an investment which may run to half a million dollars for software alone in a medium-sized corporation.  So how much is it worth to clean up your manual test technical debt?

There are some related questions out there with quantified answers.  Gartner and CAST have recently published estimates of how much technical debt is out there globally and on a per-application basis ($500B and $1M, respectively).  "Technical debt" is defined by these studies as "the cost of dealing with delayed and deferred maintenance of the application portfolio."  As most people who worry about technical debt note, this "cost to fix" is not the same as what it costs the business to have a software emergency of some kind, or to be slow in rolling out enhancements to the software.  And of course this "cost to fix" is not exactly the same as the cost accrued by an organization due to the lack of automated testing.  But it's a handy statistic to be able to throw around.

Within your own organization, you may want to look at the following, in terms of making your business case:
  • Regression testing costs:  this may be your easiest way to quantify a return on investment.  What does it cost you to manually regression test your most expensive features?  How much would it cost to set up automated testing on those features?  You will likely find some parts of your application where the one-time cost of setting up an automated regression test pays for itself within the year, by freeing your manual testers to do other important testing of new features as they are written.  (A rough sketch of this arithmetic, and of the next two items, follows this list.)
  • Software development costs:  if you're in a position to do so, quantify the number of points your project teams deliver per iteration and what each iteration costs you, to get a "cost per point."  If your teams are routinely interrupted to fix production bugs and put in patches, you can then express the cost of these unplanned interruptions in terms of undelivered feature points.  The costs will likely add up to tens of thousands of dollars quite easily.
  • Insurance analogy: the cost of unexpected down time.  Calculate what down time for your application costs your company.  Even if down time isn't catastrophic, because manual work-arounds are available, you should be able to quantify the value of the staff time diverted to those work-arounds while the software is down.  If automated tests give you better quality software and less down time, a large-scale investment in automated testing may be justified.
  • Business losses due to lack of speed in adding new features:  this may be harder to quantify, but as your software gets harder to change, you lose the agility you set out to gain when your company first took on agile software development techniques.  You may want to put together one or two real or hypothetical cases of changes that were much more expensive to deliver, because of manual regression testing, than they would have been had automated testing been in place.  Depending on what your software does, you should be able to assign a business value to software flexibility, in terms of business lost to delayed time to market.
  • Total software replacement cost:  at some point your software will become partially or completely impossible to repair, because regression testing the whole of it costs too much.  What would it cost to start all over?  On the other hand, if your company is already planning to replace the software entirely within a short time horizon, then maybe your manual testing is fine.
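
To make the first three items concrete, here is a minimal back-of-the-envelope sketch in Python.  Every figure in it is an invented placeholder rather than data from any real organization; the point is only to show the shape of the arithmetic you would walk a CFO through.

# Back-of-the-envelope ROI sketch.  Every number below is a made-up placeholder;
# replace them with figures from your own organization before showing a CFO.

# Regression testing: payback on automating your most expensive suites
manual_regression_cost_per_release = 20_000   # testers' time per release, in dollars
releases_per_year = 6
automation_setup_cost = 80_000                # one-time tooling and scripting
automation_maintenance_per_year = 15_000      # keeping the scripts current

annual_manual_cost = manual_regression_cost_per_release * releases_per_year
annual_savings = annual_manual_cost - automation_maintenance_per_year
payback_years = automation_setup_cost / annual_savings
print(f"Regression automation pays for itself in about {payback_years:.1f} years")

# Development costs: unplanned interruptions expressed in undelivered points
team_cost_per_iteration = 60_000              # fully loaded team cost per iteration
points_delivered_per_iteration = 30
cost_per_point = team_cost_per_iteration / points_delivered_per_iteration
points_lost_to_production_bugs_per_year = 50
interruption_cost = points_lost_to_production_bugs_per_year * cost_per_point
print(f"Unplanned interruptions cost roughly ${interruption_cost:,.0f} per year")

# Insurance analogy: staff time spent on manual work-arounds during down time
down_time_hours_per_year = 40
staff_on_workarounds = 25
loaded_hourly_rate = 75
down_time_cost = down_time_hours_per_year * staff_on_workarounds * loaded_hourly_rate
print(f"Down time work-arounds cost roughly ${down_time_cost:,.0f} per year")

With those invented numbers, regression automation pays for itself in well under a year, while interruptions and down time work-arounds quietly burn more than $150,000 a year: exactly the kind of statement a finance person can act on.
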
It's all very well and good for us zealots to say things like "without automated testing, you Just Aren't Agile."  But if you're looking to get your CFO to release significant funds, you'll want a case relevant to the company bottom line, not just an alignment with some invisible Agile Correctness measure.

Comments

  1. There are fewer and fewer situations (user interface technologies) where automated testing needs to be expensive.

