
Monday, November 29, 2010

Collocating for Project Inception: It Costs Too Much NOT To Do It


One reality of modern software development is that teams are not collocated.  Not around the same table, not in the same building, not in the same city, and often not in the same time zone.
  • In some cases, companies take advantage of exchange rates or pay rates, locating teams where they can buy more hours for each unit of their currency.  
  • In other cases, companies cultivate a better work-life balance for their employees, and stop enforcing policies around grueling commutes and time clock punching.
Communications tend to deteriorate even more once you let Briana work from home in Wisconsin and Tad work from his mom's place in Hawaii, and everyone has to get on a conference call with Australia at 4:30 p.m. CST, because a third of the team isn't even in a time zone offset from yours by a whole number of hours.  Especially if Tad has small barking dogs and a cockatoo.

So here's one incredibly good thing you can do, in the face of this potentially toxic communications cocktail:  start every project with a requirements workshop where in-person attendance is mandatory (and funded) for the product owner and leads from the analysis, development, QA, and operations teams.  Sponsors and subject matter experts should be readily available in person as well.

This get-together will require an initial outlay of funds which will hugely repay itself over time.  In her 2004 book, Requirements By Collaboration, Ellen Gottesdiener quotes these helpful statistics from Capers Jones about return on investment for such gatherings.  According to Jones's empirical measurements of large-scale software projects, a facilitated workshop of this type will:
  • Reduce the risk of scope creep from 80 percent to 10 percent, and to 5 percent when combined with prototyping (Patterns of Software Systems Failure and Success, 1996)
  • Cut requirements creep in half (Software Assessments, Benchmarks and Best Practices, 2000)
  • Provide a 5 percent to 15 percent overall savings in time and effort for the project as a whole (Patterns)
  • Reduce defects delivered in software by 20 percent (Patterns)
  • Reduce project failure and cancellation rates by about 50 percent (Patterns)
  • Provide a 10-to-1 return on investment ($10 for every $1 invested; Patterns)
If your CFO is still skeptical, you might suggest running an internal experiment on two comparable projects (only one of which gets an in-person requirements workshop) and measuring the results for yourself!
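
If it helps to make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python.  The percentage and the 10-to-1 multiplier are the Jones figures quoted above; the project budget and workshop cost are hypothetical placeholders, not numbers from any real engagement.

```python
# Back-of-the-envelope ROI sketch for a facilitated requirements workshop.
# The savings percentage and 10:1 multiplier are the Capers Jones figures
# quoted above; the dollar amounts are hypothetical placeholders.

project_budget = 2_000_000      # hypothetical total project cost
workshop_cost = 40_000          # hypothetical travel, facilities, and facilitation

time_and_effort_savings = 0.10  # Jones: 5-15% overall savings; midpoint used here
roi_multiplier = 10             # Jones: $10 returned for every $1 invested

estimated_savings = project_budget * time_and_effort_savings

print(f"Estimated time/effort savings: ${estimated_savings:,.0f}")
print(f"Workshop cost:                 ${workshop_cost:,.0f}")
print(f"Savings-to-cost ratio:         {estimated_savings / workshop_cost:.1f}:1")
print(f"Jones rule-of-thumb return:    ${workshop_cost * roi_multiplier:,.0f}")
```

Even with these placeholder numbers, the workshop pays for itself several times over on the time-and-effort savings alone, which is the same story Jones's 10-to-1 figure tells.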

Saturday, November 20, 2010

Comprehensive Documentation Strikes Back

IT community members catechized on the Agile Manifesto will recall that the original signers placed a higher value on "working software" than on "comprehensive documentation." But is working software in isolation our ultimate goal?  I think that as the Agile community has moved in the direction of delivering value, rather than just delivering working software, we are now newly realizing we need more than just rich face-to-face conversations at white boards.  We need accessible archives populated by well-structured documents which have been written in full sentences.  It's time to even the balance.

[Image: An early Tech Writer]
Scott Ambler has done the Agile community a huge service with frequent and comprehensive surveys to determine what people are actually doing in the field, and how well it's working (see this Ambysoft page for a full list of his surveys and results).  As Ambler reviewed his results in 2008, he was surprised to find that Agile projects are actually more likely, and not less likely, than their traditional counterparts to produce "comprehensive documentation" such as "user manual, operations documentations, support documentation, system overview documentation."  Ambler explains this result by saying:
So, I was thinking about this a bit; and what I think was going on is one of the things that we do know, is that Agilists are actually more likely to deliver than traditionalists. That type of documentation is something that you would write towards the end of the project as things stabilize. So, I think what’s going on is Agilists are more likely to get to the end, therefore, they are more likely to do the type of work that you would do at the end, which is finish up this documentation. That’s my theory at least. I can’t back that up with real numbers but that’s what the trend appears to… (both quotations from Ambler, 2008)
One of the joys of early Agile development was its return to a mode of working which is natural for humans: small groups, close proximity, verbal communication, shared whiteboard drawing, project management through tactile movement of cards or sticky notes around on a big visual chart.  And what about actually being able to sit and talk issues through completely in one sitting rather than futilely exchanging emails in stolen moments between other meetings?  The success and happiness of these early teams is literally hard-wired into our DNA, if you believe in socio-biology, and it is intuitively beguiling to the point that today (again by Ambler's count) over 76% of American corporations are at least dabbling in Agile techniques, despite very few on-the-ground measures of what the return on investment might be.

The importance of wide-band, face-to-face communication as part of the software development process is undeniable.  But I believe this heady success has led some people to confuse what is "necessary" with what is "sufficient."  As Agile teams embraced this rich and necessary form of communication, proponents began to assert that immediate, personal, and primarily verbal or visual forms of communication were better than their written alternatives in most or all settings and contexts.  See for example this widely distributed chart on the "effectiveness" of communications methods.  It is completely brilliant, and worth reproducing even if I weren't quibbling with it:


Anyone who has done any greenfield software development at all will attest to the validity of this chart in expressing the best way to do "modeling," if that is defined as the creative and iterative process of determining what needs to be done, and how it should be done.

Anyone who has ever done production support of a legacy, or "brownfield" application of any vintage, however, will likely find some fault with the lower curve on the diagram, describing the best way to write lasting documentation.  And in fact, even greenfield developers may disagree that incorporating the visual picture is the sole predictor of either "richness" of communication or overall "effectiveness."
  • If you have a "system down" situation, do you want to bring up a video (if you can find it) and watch it through to figure out what might be going wrong?
  • How many times have you silently mouthed the words "Oh thank goodness!" or the equivalent when someone dug up an old yellowed computer printout for you that had the complete set of metadata translations typed out along with pithy descriptions of what the terms meant in context at the time the application was originally written?
  • And truthfully, how many times have you been grateful that someone took comprehensive minutes during your brainstorming meeting to put context around your architectural diagram, capture intent, and record immediate action items?  How handy was that email to you?  If you were there, do you now want to watch the whole meeting again on video, complete with incomprehensible low-volume discussions you can't hear, nose-blowing, groin-area adjustments, and long-winded tangents from people who aren't usually that annoying (or who are)?  If you weren't able to attend the meeting, how much will you get from the video?
When you introduce a time parameter into the evaluation, the graph looks a little different:
I don't mean to trivialize the achievement of early Agile adopters who have brought people back into the process, nor do I propose to claim that comprehensive written documentation is more important than written code.  Using what Jim Highsmith calls "And" Leadership, I'm saying that we should keep both the baby AND the bath water.

No, wait.

I mean we shouldn't throw out the use of written communication for post-facto documentation just because we want to get rid of big speculative requirements drafting.  For the long term, written communication can do some things that video can't do, and even in-person communication can't.  When you've had that really long, contentious meeting, it's extremely helpful for the consensus you've reached to be neatly summarized in words and whatever pictures are necessary.  Otherwise, you're going to have the argument again tomorrow, in two weeks, and/or next year.

Over time, the delivery of business value depends almost as much on the delivery of comprehensive documentation as it does on the delivery of working software.  Let the light saber wars begin.

Tuesday, November 16, 2010

Agile Business Analyst Is Not An Oxymoron

I just read a very angry blog post entitled Business Analysts And The Million Dollar Question - What Would You Say You Do Here?  The author quotes Scott Ambler's famous line, "Remember, 'BA' is also the abbreviation for band-aid," and goes on to say that if you hire a typical BA, chances are high that "you're not just wasting your money; you are, as a matter of fact, risking your project by introducing an additional monkey that'll have to be subdued and sedated as your project moves on and the rest of your team starts doing some real work."  Okay, then.

[Image: Tom Smykowski, Office Space BA]
The Agile Manifesto itself seems to be taking a direct swipe at Business Analysts when it talks about valuing "working software over comprehensive documentation" and "customer collaboration over contract negotiation."  Somehow, Business Analysts have ended up as a proxy for everything bad about the "Waterfall" development method:  Big Upfront Design, Big Documentation, Bad Communication, Silos, Cholera, Blunder-headed Filling Out of Templates, the works.  Oh, wait, not cholera.  But you can see how it would accidentally get into the list--that's what happens when you start to write things down unnecessarily.  Next thing you know, you're gold plating.

But let's take another look at this.  When we're pragmatic, rather than dogmatic, we tend to find that business analysts are proving to be as vital for agile project teams as they were for the first struggling dev-only teams in the dark days before they invented "waterfall."  Let's break this down a little bit.
  1. Even On Agile Teams, someone still needs to facilitate discussion, record consensus once reached, and analyze the business/technical flow of data.  The best business analysts were never the people who doggedly filled out every template to completion--BAs at their best have always facilitated discussions, promoted development ideas to "the business," and urged people to talk directly to each other.  Blaming BAs for "Big Upfront Design" is no more appropriate or helpful than deriding developers as pocket-protector-wearing individuals with thick glasses who only communicate in binary.
  2. Time Has Told.  Although the Manifesto and Agile Founding Fathers originally thought of all agile team members as developer generalists (or--dare I say it--old-fashioned "programmer/analysts"?), time and experience have disproven the hypothesis that BAs aren't needed for agile.  Modern Analyst quotes Alistair Cockburn in 2009 saying:  "the early attempts at Agile development tried to do away with the Business Analyst" because of the potential distortion of communication with a go-between. However, given the complexity of organizations, the disparate business languages that various units may use and the time constraints of subject matter experts, "more recent variations are finding good use for someone with deep business knowledge, who has time to spend talking with the programmers."
  3. Corollary:  Time Is At A Premium.  As Cockburn says, some agile teams may gain benefits from a division of labor whereby business analysts stalk, cage, and interview disparate organizational stakeholders, always time-consuming practices, in order to present a coherent set of stories to developers, who can then focus on the actual development.  This is not because the developers are not capable of understanding the stakeholders, but because facilitating consensus is a different activity than writing actual code.
  4. People Have Preferences.  As actual agile teams have been born and gone out into the world, it turned out that not everyone wanted to be a developer.  Speaking generally, some people wanted to work with people, some people wanted to write algorithms, and some people wanted to develop psychotically completist edge cases to guarantee quality.  So although agile teams have often offered everyone the opportunity to do a little bit of everything, gradually the roles of BA, Dev, and QA have emerged as personal expressions of interest by team members.  One could argue that things have now gone too far, and people have hardened into their job titles, but so long as teams (or companies) offer people the chance to continue to learn and grow, it isn't necessary to jump to the conclusion that the presence of BAs or QAs on an agile team is caused by HR departments blindly filling out a template somewhere.
There has been a flood of conversation among my peers recently about the "role of the user experience designer" versus the "role of the BA," and it is beginning to degenerate in a manner very reminiscent of this Dev versus BA conversation.  One colleague posited that the BA position should be eliminated in favor of a combined BA/UX position on each project.  Why must this always be so zero sum?  Some people have excellent design ability, some people have excellent skills for breaking down functionality into easily digested pieces.  Not everyone has both of these skills, and even if they do, not everyone enjoys doing both types of work equally.  So far as I know, the last human who ever knew "all there is to know" was Goethe, and that was a long time ago.  There's a lot more to know these days, a lot more mistakes to be made, and a lot more experience to be hard won.

So to me, the bottom line in all of this discussion is that the fundamental unit of agile production is the team.  The team should combine a number of diverse skills: analytical, communicative, creative, quality-minded, dev/ops, and good-design-informed.  As you put together your team, you should make sure that you have all of the skills needed, from some combination of the interests, experience, and abilities of the team members.  People will have a combination of generalist and specialist capabilities, and people should hopefully be emotionally as well as analytically intelligent.  But you know what?  It's absolutely fine to start by setting up a ratio of one BA, one QA, and two pairs of Devs, and then correct for missing skills.

It is certainly better to do so than to start tinkering with theories like "<my job title here> can be trained to do it all, and the rest of you are monkeys."

Wednesday, November 10, 2010

Quantifying Manual Test Technical Debt

Forced by circumstances (and an especially pragmatic client), I've recently been asking peers and "the blogosphere" the apparently naive question, "is it important to do automated testing and clear up what Mike Cohn calls 'manual test technical debt?'"

Theoretically, if you have no Big Upfront Design and you also have no automated test suite, you're pretty much just building a big unplanned mess and you'll never be able to change anything for fear of breaking something else.  I understand that, and so does my pragmatic client.  But can this issue be quantified?  Aside from feeling somewhat inadequate when reading agile theory, I mean.

The reason it's important to ask this question is that setting up and maintaining automated testing is expensive.  Unlike many agile practices, which can be kicked off with some Sharpies and a clean wall, automated testing requires a large-scale capital expenditure to bring one or more testing packages into an organization, and then some significant strategy development, staging, and training of resources to put those new packages into use and to keep them running.

You need some compelling financial case to show the likely return on an investment which may be half a million dollars for software alone in a medium-sized corporation.  So how much is it worth to clean up your manual test technical debt?

There are some related questions out there with quantified answers.  Gartner and CAST have recently published estimates of how much technical debt is out there globally and on a per-application basis ($500B and $1M, respectively).  "Technical debt" is defined by these studies as "the cost of dealing with delayed and deferred maintenance of the application portfolio."  As most people who worry about technical debt note, this "cost to fix" is not the same as what it costs the business to have a software emergency of some kind, or to be slow in rolling out enhancements to the software.  And of course this "cost to fix" is not exactly the same as the cost accrued by an organization due to the lack of automated testing.  But it's a handy statistic to be able to throw around.

Within your own organization, you may want to look at the following, in terms of making your business case:
  • Regression Testing costs:  this may be your easiest way to quantify a return on investment.  What does it cost you to manually regression test your most expensive features?  How much would it cost you to set up automated testing on those features?  It is likely that you'll have some parts of your application where the one-time cost of setting up an automated regression test will pay for itself within the year, in terms of making your manual testers available to do other important testing of new features as they are written (a simple break-even sketch follows this list).
  • Software development costs:  if you're in a position to do so, quantify the number of points your project teams deliver per iteration, to get a "cost per point."  If your teams are routinely interrupted to fix production bugs and put in patches, you should be able to quantify the cost of these unplanned interruptions, in terms of undelivered feature points.  The costs will likely add up to tens of thousands of dollars quite easily.
  • Insurance analogy:  cost of unexpected down time for your application.  Calculate the cost to your company for down time to your application.  Even if down time is not catastrophic, since manual work-arounds are available, you should be able to quantify the value of the time of staff diverted to doing manual work-arounds during software down time.  If automated tests give you better quality software and reduced down time, a large-scale investment in automated testing may be justified.
  • Business losses due to lack of speed in adding new features:  this may be harder to readily quantify, but as your software gets harder to change, you will lose the agility you intended to gain when your company first took on agile software development techniques.  You may want to put together one or two hypothetical or real cases of changes which were much more expensive to deliver due to the cost of manual regression testing than they would have been, had automated testing been in place.  Depending on what your software does, you should be able to assign a business value to software flexibility, in terms of business lost due to delayed time to market.
  • Total software replacement cost:  at some point your software will become partially or completely impossible to repair, due to the cost of regression testing the whole system.  What is the cost of starting all over?  On the other hand, if your company was planning to totally replace the software within some short time horizon, then maybe your manual testing is fine.
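To make the regression-testing item concrete, here is a minimal break-even sketch in Python.  Every figure in it is a hypothetical placeholder, not data from any actual project; substitute your own estimates before you take it anywhere near your CFO.

```python
# Break-even sketch for automating an existing manual regression suite.
# All dollar amounts are hypothetical placeholders; substitute your own estimates.

manual_cost_per_cycle = 12_000            # tester time to run the manual suite once
cycles_per_year = 12                      # e.g., one regression pass per monthly release
automation_setup_cost = 90_000            # one-time tooling, scripting, and training
automation_maintenance_per_year = 20_000  # keeping the automated suite current

annual_manual_cost = manual_cost_per_cycle * cycles_per_year
annual_savings = annual_manual_cost - automation_maintenance_per_year

if annual_savings > 0:
    breakeven_years = automation_setup_cost / annual_savings
    print(f"Annual savings:   ${annual_savings:,.0f}")
    print(f"Break-even after: {breakeven_years:.1f} years")
else:
    print("With these figures, automation never pays for itself on regression cost alone.")
```

The same shape of calculation works for the cost-per-point and down-time items: estimate what the debt costs you per year, compare it to the one-time and ongoing cost of paying it down, and see how quickly the investment returns itself.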
It's all very well and good for us zealots to say things like "without automated testing, you Just Aren't Agile."  But if you're looking to get your CFO to release significant funds, you'll want a case relevant to the company bottom line, not just an alignment with some invisible Agile Correctness measure.