Software Quality

June 12, 2010

Defect-Driven Design (DDD)

Filed under: Practices, Testing — David Allen @ 9:11 pm

I have heard of Responsibility-Driven Design (RDD), Test-Driven Design (TDD), and many other software design approaches. But here is a new one I discovered: Defect-Driven Design (DDD).

How does it work?

You start with a feature that sounds easy but is really complicated. Then you assign it to a programmer who has a record of successfully delivering simpler programs of this type, but who is still very new to Test-Driven Development. That way he will revert to his traditional code-only approach (no automated testing, and certainly not test-first). Be sure to pressure the programmer to get it done quickly, and give him complicated requirements. Make sure to assign his business analyst to several other projects so that he does not have time to give the programmer the quality guidance he requires.

Since the architecture is similar to other programs he has successfully written without test-driven development, he proceeds to code the program with confidence that it will be as successful as the others. Once he has “completed” it, the testers test it and start sending him the defects. He fixes them one at a time, thinking each time that he is “done.” Eventually, the testers discover a HUGE pile of defects. Then you realize, “Uh-oh. It was not really simple after all.” And all of those test cases are requirements that were never given to the programmer in the first place.

This really happened recently. The program was a data transfer from an old system to a new system, as part of a conversion. Several data transfer programs had already been written successfully, using a traditional architecture based upon stored procedures, without automated tests. The actual data transfer program was NOT written using test-driven development (TDD). If it had been, it would have been more easily testable. When the testers reported a large number of defects, the team realized that it had completely missed the requirements.

So we took a step back and redesigned our approach. First, we gathered the requirements that were missed on the first attempt and confirmed that the available test cases were a good representation of them. Then we developed techniques to automate the testing, even with the existing traditional stored-procedure architecture. And we proceeded to follow good test-driven development practices. The approach has been very successful so far, and its benefits could have come straight out of a textbook on test-driven development:

  • By writing the tests first, we uncover ambiguities in the requirements, and we resolve them before we start coding.
  • By translating the higher-level requirements into concrete test scenarios, we now have a more realistic understanding of the scope and magnitude of this feature. For example, a couple of abstract sentences of requirements may drive us to design several tests to verify that the requirement has been met.
  • By producing automated tests, we are able to detect regression as it occurs and correct it promptly.

This problem would never have happened in the first place if a test-driven process had been followed. We are using this story as an opportunity to re-energize our teams about the value of test-driven development. I have always preferred to have requirements expressed as tests, but after this experience I am even more passionate about it. Others in my organization share this passion. That is why our organization has a standard requiring all features to have acceptance criteria. And we have deliberately created a template for acceptance criteria that reads exactly like a test scenario:

Given <initial conditions>
When <you perform the action to be tested>
Expect <the expected results>

Each feature must have at least one such test-like acceptance criterion. In a typical feature, several tests (acceptance criteria) are used to express the requirement. For those of us who enjoy test-driven development because of the many benefits it brings, such test-like acceptance criteria provide a very easy transition to writing the automated tests that begin our development process. This approach works well even for people who will not write automated test cases first. While it is inferior to strict TDD, it is better than the sort of vague and rambling requirements that are sadly common in our industry.
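To make the transition from acceptance criterion to automated test concrete, here is a minimal sketch in Python. The names and transfer rule are hypothetical (the post does not give the actual rules); the point is that the Given/When/Expect structure of the written criterion carries over directly into the test body.

```python
# The acceptance criterion, written first in English using the template:
#
#   Given an account in the old system with a negative balance
#   When the data transfer runs
#   Expect the new system to keep the balance and flag the account for review

def transfer_account(old_record):
    """Hypothetical transfer logic under test."""
    return {
        "balance": old_record["balance"],
        "review": old_record["balance"] < 0,
    }

def test_negative_balance_is_flagged_for_review():
    # Given
    old_record = {"balance": -125.50}
    # When
    new_record = transfer_account(old_record)
    # Expect
    assert new_record["balance"] == -125.50
    assert new_record["review"] is True

test_negative_balance_is_flagged_for_review()
```

Because the test mirrors the criterion line for line, an analyst can read the test and recognize the requirement it came from.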

Of course, none of these standards matter if your staff will not follow them. And they will not follow them unless you give them the encouragement, support, training, and direction to do this. They certainly won’t learn to do this if they are told explicitly that we don’t have time to do it well and that we will call all the missing requirements “defects” and fix them later.


  1. Very interesting!
    When you say you are “writing the tests first”, it sounds like you’re not coding them in C#/Java, but documenting them in the format you described: “Given… When… Expect…”

    Are you coding the tests first (and of course, they can’t even compile)?

    I am a huge proponent of unit tests, but I find it very hard to code them before I understand the class/design under test.

    I would be very interested in your thoughts on this.



    Comment by Robert A. McCarter — June 13, 2010 @ 8:02 pm

    • Oh, and many people find it hard to code before they understand the class/design under test. See the references for tips.
      But essentially, you create your design on the fly, driven by the tests. If a test says, “Given a customer with a balance > 1000, send an email about the high balance,” then you need a class that sends the email. If you don’t have one, you design one. If not now, then when? You will design it anyway once you start coding; the test simply reveals what to code, and when.
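A sketch of that idea in Python (the comment gives only the requirement, so every name here is hypothetical): the test is written before any design exists, and the need for somewhere to send email is what forces a gateway class into existence.

```python
class EmailGateway:
    """Designed on the fly because the test needed a collaborator to send email."""
    def __init__(self):
        self.sent = []

    def send(self, to, subject):
        self.sent.append((to, subject))

def notify_high_balance(customer, gateway, threshold=1000):
    # The requirement under test: given a customer with a balance > 1000,
    # send an email about the high balance.
    if customer["balance"] > threshold:
        gateway.send(customer["email"], "High balance notice")

# The test, written first:
gateway = EmailGateway()
notify_high_balance({"email": "a@example.com", "balance": 1500}, gateway)
assert gateway.sent == [("a@example.com", "High balance notice")]

# And the boundary case the requirement implies:
gateway = EmailGateway()
notify_high_balance({"email": "a@example.com", "balance": 1000}, gateway)
assert gateway.sent == []
```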

      Comment by David Allen — June 15, 2010 @ 7:37 am

  2. Thank you for asking for clarification.

    I am talking about two different but similar things: 1) writing requirements that resemble test cases, 2) writing automated unit tests. They look very similar.

    Our first attempt at coding this feature was based on written requirements in a matrix form that was hard to interpret. Those requirements were not written in a test-like fashion. Further, no automated tests were written either before coding (Test-first) or after coding. This is the worst possible world. But honestly, the approach had been good enough in several easier cases before. So I don’t fault the developer for thinking the approach would work again.

    After we found how badly we missed the mark, we made several changes:
    1) pair program to address the complexities that would easily trip anyone
    2) extend the requirements by writing them in a test-like fashion
    3) write automated unit tests before coding (TDD)
    4) modify the code until the tests pass

    This should answer your first two questions. We are first writing the requirements in the English language, in a format that resembles a test, and then we proceed to write actual tests in C#.

    I think I created confusion by introducing both test driven development and test like requirements in the same case study.

    But your last question is the really interesting one. We also have talented developers like you, who find it hard in many cases to code tests before they have contemplated the class/design. I use test-driven development frequently myself, but there are certain situations where I find it difficult to write the test first. I have not given it enough thought to make a definitive statement, but I think that some of the cases where I avoid test-driven development are cases where I am exploring new techniques and technology. In those cases, I really don’t know enough about the underlying technology to understand how it works; I have to play with it. However, that statement is not 100% true: sometimes I use test-driven development in spike situations as well. Regardless, let us accept that there are situations where a given programmer cannot or will not code tests first. In those situations, writing requirements in a test-like fashion offers many of the benefits of coding the test first.

    In fact, my prescriptive recommendation is that every programmer should always first make sure that the requirements are expressed as test-like acceptance conditions. Then, if possible, code unit tests from those acceptance conditions. I think this will also help people to design their tests in a manner that more closely fits the business problem rather than the technical solution. After all, when you are expressing a requirement as a test, you are less likely to think about the details of the implementation, and more likely to express the test in terms of the business domain. I believe that if you write your requirements in a test-like form, in the language of your business, you may actually find it easier to write automated test cases that drive a testable architecture which better mirrors your business domain.
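A small hypothetical illustration of that naming point (the function and data are invented): the same check can be named after its plumbing or after the written requirement, and only the second reads like the acceptance criterion it came from.

```python
def migrated_customers(old_customers):
    # Hypothetical transfer rule: only active customers are carried over.
    return [c for c in old_customers if c["active"]]

# Implementation-centric: couples the reader to the plumbing.
def test_filter_returns_subset():
    assert len(migrated_customers([{"active": True}, {"active": False}])) == 1

# Business-centric: reads like the written requirement.
def test_every_active_customer_is_carried_over():
    old = [{"name": "A", "active": True}, {"name": "B", "active": False}]
    assert [c["name"] for c in migrated_customers(old)] == ["A"]

test_filter_returns_subset()
test_every_active_customer_is_carried_over()
```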

    Anyway, writing these test like requirements is a cheap and easy experiment. If you give it a try, please let me know how it goes for you. I would be very interested in your experience and feedback.

    Comment by David Allen — June 14, 2010 @ 7:05 am
