...copyright Elena Yatzeck, 2010-2017

Tuesday, June 21, 2011

Behavior Driven Development and the Agile Analyst

I have been following up on my recent discovery of "Feature Injection" by trying to track down some more information about Behavior Driven Development, or "BDD," which is the software development technique into which feature injection fits.  Handily, someone really good at BDD has put what seems to be a lot of really high-quality information into the Wikipedia entry, so I urge you to go check that out immediately, before someone less good changes it!

But in the meantime, I've been talking with colleagues about the practicalities of BDD, and I'm just enchanted by the helpful structure BDD provides for describing what's different about agile business analysis from its handy bête noire and sparring partner, "Big Upfront Design."

So, compare and contrast a "requirement" written in detail well before software development begins (the "BUFD" requirement) with one done "just in time," expressed in perpetuity in business-readable language as a set of acceptance criteria examples linked to an underlying software implementation (the "BDD" requirement).

BUFD requirement (also known as a "system requirement," a "system specification," a "user requirement," or a "non-functional requirement"):
  • is written in great detail, long before a programmer has even heard of it, based on discussions between analysts and "business people."
  • is approved in writing by a slew of people, along with a vast number of other requirements in a multi-page "systems specification," for Sarbanes-Oxley compliance (SOX).
  • is used by testers as the basis for the "test plan" they put together for testing the software as it is written by development.
  • is accompanied by a "traceability matrix," which links the requirement to the associated written "design" and "test plan" documents on a point-by-point basis.
  • must be kept up to date once the programmers get going (along with the design, the test plan, and the traceability matrix).
  • is used as a point of reference if scope must be cut due to lack of time--the requirement is still in the document, but the behavior isn't in the program.  But at least there's a notation in the traceability matrix for reference, when an argument needs to happen.
  • is put into storage of one kind or another once the software goes live (the resemblance of the "archive" above to a wastebasket is purely coincidental).
BDD requirement (also known as "a set of acceptance criteria" or "our hero"):
  • is modeled at a high level with very little verbiage during a brief project "discovery" or "inception" phase at the beginning of a project, and recorded in writing on a 3x5 card after a discussion which includes business people, programmers, and testers.
  • is explored in detail just before programmers begin work on it, by putting together written business-language scenarios that can be used to test whether the software is doing what the business would expect.  These are called "acceptance criteria," and they are written directly into an automated functional testing tool, where they can be referenced by the people who actually write the corresponding automated functional tests.
  • can be signed off at this time, for SOX compliance.
  • serves as the built-in documentation for the automated tests for as long as the software is operational.  Although the software itself may change, and those changes may require corresponding modifications to the actual automated tests, the business-language description of what those tests do is not likely to change.
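To make that concrete, here's a hypothetical sketch of what one of those business-readable acceptance criteria might look like, wired directly to an automated check.  (This is my own invented example--a real team would likely use a BDD tool such as Cucumber or behave, and the account names and transfer logic here are purely illustrative.)

```python
# A hypothetical acceptance criterion in Given/When/Then form.  The
# business-readable scenario text lives right next to the automated
# check that verifies it, so the scenario doubles as documentation.
# (Illustrative sketch only; names and behavior are invented.)

SCENARIO = """
Scenario: Transfer between a customer's own accounts
  Given a checking account with a balance of 500
  And a savings account with a balance of 100
  When the customer transfers 200 from checking to savings
  Then the checking balance is 300
  And the savings balance is 300
"""

def transfer(accounts, amount, source, target):
    """The behavior under test: move money between two accounts."""
    accounts[source] -= amount
    accounts[target] += amount
    return accounts

def test_transfer_between_own_accounts():
    # Given: the starting balances from the scenario
    accounts = {"checking": 500, "savings": 100}
    # When: the customer makes the transfer
    accounts = transfer(accounts, 200, "checking", "savings")
    # Then: both balances match the business expectation
    assert accounts["checking"] == 300
    assert accounts["savings"] == 300

test_transfer_between_own_accounts()
print("acceptance criterion passed")
```

If the implementation of `transfer` changes later, the test body may need updating--but the scenario text at the top, which is what the business people read and signed off on, stays put.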
Vast quantities of excess paper are avoided here: we don't fully document any requirement unless we are actually going to develop it (avoiding the waste associated with "comprehensive" up-front documentation for a system which will never be built to this scope).  We avoid writing the test plan and the traceability matrix by recording requirement details directly into the automated tool in the form of acceptance criteria.  And we avoid rewriting requirements by not getting into the details until just before coding.

But meanwhile, we actually improve business people's ability to judge whether the software meets their needs or not, and keep a record of the software's intent forever.  The analysis activity becomes lean, efficient, and immortal!

As an analyst, I like that.