How Agile is Danube?

Danube’s ScrumWorks development team, along with other agile tool vendors, was recently challenged by a well-intentioned fellow named Mark Levison to see just how “Agile” we really are. In other words, do we eat our own dog food? Mr. Levison was making a broader point: “Agile” is too often used to mean “fast and crappy quality” rather than what it means to companies like ours, “high quality with frequent releases.” While I responded in a comment, what follows is a more polished response.

Danube’s ScrumWorks Pro team eats its own dog food. We are lucky not to be plagued by many of the organizational difficulties that inhibit teams moving toward Agile software development. As a result, we get a high-quality product out the door frequently with a relatively small group of contributors.

Key to successfully delivering on the promise of Scrum management (rapidly responding to business needs) are the technical practices that enable fast and furious product change. Most important is the need to “bake quality into the product” from the start. This means making test and QA an integral part of development, early and continuously. For many organizations this is no small feat, so here’s a little rundown of how we use the “Definition of Done” to bake quality into our products from the start:
1) Defining Done. Quality starts with requirements. You must have clarity around your business goals as well as your technical expectations for the implementation so that everyone is on the same page. In a way it’s unfortunate that there’s so much hype around “user stories.” We use user stories, but a far more important component of our product backlog items is the “Definition of Done”: a bulleted list of “agreements” between the product owner and the team relating to the user story. For us, explicit done criteria limit the functional scope of an item. We don’t, however, imply a specific UI implementation; detailed requirements analysis, UI design included, is handled in sprint.

2) Implicit Done Criteria. We have a lengthy set of implicit done criteria in a wiki. It lists all the technical criteria work must meet before it’s accepted as done by the product owner. A wiki is not ideal in that it’s not always visible, but if you’re on our team, you know about this list. What’s on the implicit list? It’s lengthy, but here are the major categories with some examples of each (by no means a thorough list):

  • Coding Standards. Refactor toward unit-testable code, Javadoc comments, consistent coding styles…
  • Design Standards. Prefer composition to inheritance, use model/view/presenter, avoid copy/paste classes…
  • Related to design and coding is TDD. This doesn’t mean we always test first, but we write software from a testability standpoint. The point here is the design benefits of TDD (a small sketch after this list shows the idea).
  • Quality Standards. We automate tests using tools programmers can write and run easily (xUnit and extensions). Code is unit tested to the extent reasonable/possible. We have a UI-intensive app, so devs also write integration tests using JUnit-based testing systems (no record/playback crap that only people with specific tools can run).
  • Continuous Integration. CI is a big part of enabling Agile. We currently split our (overly) large integration test harness across multiple coordinated servers, enabling simultaneous test runs and faster feedback (a rough sketch of how such a split can work appears below).
  • Mandatory code reviews before anything is “done”.
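
To make the design and quality bullets above a bit more concrete, here is a minimal sketch of the model/view/presenter and testability idea. The class names are hypothetical and this is not actual ScrumWorks code: the view is just an interface the presenter composes, so a plain JUnit test can exercise the presenter’s logic against a fake view with no UI involved.

    // Hypothetical illustration only; not ScrumWorks source code.
    // In a real project these would live in separate files; they are
    // shown together here for brevity.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class SprintPresenterTest {

        // The view is an interface, so the presenter never touches real widgets.
        interface SprintView {
            void showTotalPoints(int points);
        }

        // The presenter composes a view rather than extending a UI class
        // (composition over inheritance), which keeps its logic unit-testable.
        static class SprintPresenter {
            private final SprintView view;

            SprintPresenter(SprintView view) {
                this.view = view;
            }

            // Sums the story point estimates and pushes the total to the view.
            void updateTotal(int[] storyPoints) {
                int total = 0;
                for (int points : storyPoints) {
                    total += points;
                }
                view.showTotalPoints(total);
            }
        }

        // A hand-rolled fake stands in for the UI during the test.
        static class FakeView implements SprintView {
            int shownPoints = -1;
            public void showTotalPoints(int points) { shownPoints = points; }
        }

        @Test
        public void totalOfStoryPointsIsShownOnView() {
            FakeView view = new FakeView();
            new SprintPresenter(view).updateTotal(new int[] { 3, 5, 8 });
            assertEquals(16, view.shownPoints);
        }
    }

The payoff is the same whether or not the test is literally written first: because the presenter only knows about an interface, it stays small, composable, and cheap to test.
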
There’s more to this list, of course, but it gives a sense of why done criteria are so important. They are really the hinge on which agility swings. If you’re doing “agile” but not doing something like this to ensure a high-quality product, you won’t be able to keep up the pace once your product grows beyond trivial. That is to say, as your product gets bigger, the effort to test it manually grows far faster than the product itself. In addition, as your code base gets bigger, maintainability goes down unless you work very hard to keep things tidy.
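
To put similar flesh on the continuous integration bullet, here is a rough sketch of one way a large JUnit suite can be split across build servers. The class names are made up and this is not our actual build configuration; it just shows the general technique: each server deterministically picks a disjoint slice of the test classes, so the slices run simultaneously and together still cover the whole suite.

    // Hypothetical sketch of test sharding; not our actual CI setup.
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class TestSharder {

        // Deterministically assigns each test class to one of shardCount shards.
        // Every shard gets a disjoint subset and the union covers the full suite,
        // so N build servers can each run their own slice in parallel.
        static List<String> shard(List<String> testClasses, int shardCount, int shardIndex) {
            List<String> mine = new ArrayList<String>();
            for (String testClass : testClasses) {
                if (Math.floorMod(testClass.hashCode(), shardCount) == shardIndex) {
                    mine.add(testClass);
                }
            }
            return mine;
        }

        public static void main(String[] args) {
            List<String> allTests = Arrays.asList(
                    "LoginIntegrationTest", "BacklogIntegrationTest",
                    "SprintReportIntegrationTest", "ChartingIntegrationTest");
            // Server 0 of 2 prints the classes it is responsible for running.
            System.out.println(shard(allTests, 2, 0));
        }
    }

Because the assignment is deterministic, every server agrees on who runs what without any coordination beyond knowing its own index and the total number of shards.
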
So when the team says “this story is done,” they mean that all of the explicit and implicit done criteria have been satisfied. If any one criterion is not, we toss the story out of the sprint review. With this system we’ve maintained a relatively low defect rate alongside a high rate of release. And we do it all with a single manual/exploratory QA engineer.

Victor Szalvay

Victor Szalvay currently leads product development for CollabNet’s ScrumWorks® product suite. In that capacity, he works closely with customers, stakeholders, and the development teams to deliver high business value each release cycle. With more than 150,000 active users worldwide, ScrumWorks is used by more than half of the Fortune 100 and boasts the largest market share of any Agile management tool.

3 comments on “How Agile is Danube?”
  1. Dave Rooney says:

    Saying Mark was “well-intentioned” is damning him with faint praise.

    I believe he raised a legitimate point, and I’ve been quite pleased by the responses from companies such as yours who have detailed how they deliver their tools. Of course, I’m sure that tool-builders who aren’t really agile likely won’t respond. 🙂

  2. Michael James says:

    In case it wasn’t clear (and I thought it was, but if written words were perfect we wouldn’t need teamrooms), we’ve always had the highest respect for Mark!

    –mj

  3. Victor Szalvay says:

    I don’t know Mark and I wasn’t giving him praise… false, lavish, or otherwise. I meant it as a digest of his post, which was well-intentioned regardless of the “challenging” tone.
