
Behavior Driven Development and the Agile Analyst

I have been following up on my recent discovery of "Feature Injection" by trying to track down more information about Behavior Driven Development, or "BDD," the software development technique into which feature injection fits.  Handily, someone really good at BDD has put what seems to be a lot of really high-quality information into the Wikipedia entry, so I urge you to go check that out immediately, before someone less good changes it!

But in the meantime, I've been talking with colleagues about the practicalities of BDD, and I'm just enchanted by the helpful structure BDD provides for describing how agile business analysis differs from its handy bête noire and sparring partner, "Big Upfront Design."

So, compare and contrast a "requirement" written in detail well before software development begins (the "BUFD" requirement) with one done "just in time," expressed for perpetuity in business-readable language as a set of acceptance criteria examples linked to an underlying software implementation (the "BDD" requirement).

BUFD requirement (also known as a "system requirement," a "system specification," a "user requirement," or a "non-functional requirement"):
  • is written in great detail, long before a programmer has even heard of it, based on discussions between analysts and "business people."
  • is approved in writing by a slew of people, along with a vast number of other requirements in a multi-page "systems specification," for Sarbanes-Oxley (SOX) compliance.
  • is used by testers as the basis for the "test plan" they put together for testing the software as it is written by development.
  • is accompanied by a "traceability matrix," which links the requirement to the associated written "design" and "test plan" documents on a point-by-point basis.
  • must be kept up to date once the programmers get going (along with the design, the test plan, and the traceability matrix).
  • is used as a point of reference if scope must be cut due to lack of time--the requirement is still in the document, but the behavior isn't in the program.  But at least there's a notation in the traceability matrix for reference when the argument inevitably happens.
  • is put into storage of one kind or another once the software goes live (any resemblance of this "archive" to a waste basket is purely coincidental).
BDD requirement (also known as "a set of acceptance criteria" or "our hero"):
  • is modeled at a high level with very little verbiage during a brief project "discovery" or "inception" phase at the beginning of a project, and recorded in writing on a 3x5 card after a discussion which includes business people, programmers and testers.
  • is explored in detail just before programmers begin work on it, by putting together written business-language scenarios that can be used to test whether the software is doing what the business would expect.  These are called "acceptance criteria," and they are written directly into an automated functional testing tool, where they can be referenced by the people who actually write the corresponding automated functional tests.
  • can be signed off at this time, for SOX compliance.
  • serves as the built-in documentation for the automated tests for as long as the software is operational.  Although the software itself may change, and those changes may require corresponding modifications to the actual automated tests, the business-language description of what those tests do is not likely to change.
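To make the last two bullets concrete, here is a minimal sketch of what a BDD-style acceptance criterion can look like in practice. The scenario, the account-transfer feature, and all the names below are hypothetical illustrations (nothing in this post specifies a particular domain or tool): the business-readable Given/When/Then text comes first, and a tool such as Cucumber, JBehave, or pytest-bdd would normally parse that text and bind each step to code like the test function shown.

```python
# A hypothetical acceptance criterion, written first in business-readable
# Given/When/Then form.  BDD tools parse text like this and map each step
# to executable code; here the mapping is shown inline as a plain test.
SCENARIO = """
Scenario: Transfer between accounts
  Given a checking account with a balance of 100
  When the customer transfers 30 to savings
  Then the checking balance should be 70
  And the savings balance should be 30
"""

class Account:
    """Minimal stand-in for the system under test."""
    def __init__(self, balance=0):
        self.balance = balance

    def transfer_to(self, other, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount

def test_transfer_between_accounts():
    # Given a checking account with a balance of 100
    checking = Account(balance=100)
    savings = Account(balance=0)
    # When the customer transfers 30 to savings
    checking.transfer_to(savings, 30)
    # Then the balances reflect the transfer
    assert checking.balance == 70
    assert savings.balance == 30
```

Note that if the implementation changes (say, transfers start going through a ledger object), only the step code changes; the scenario text above it, which is what the business signs off on, stays the same.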
Vast quantities of excess paper are avoided here:  we don't fully document any requirement unless we are actually going to develop it (avoiding the waste associated with "comprehensive" up-front documentation for a system that will never be built to that scope).  We avoid writing the test plan and the traceability matrix by recording requirement details directly into the automated tool in the form of acceptance criteria.  And we avoid rewriting requirements by not pinning down the details until just before coding.

But meanwhile, we actually improve business people's ability to judge whether the software meets their needs or not, and keep a record of the software's intent forever.  The analysis activity becomes lean, efficient, and immortal!

As an analyst, I like that.
