I have heard of Responsibility-Driven Design (RDD), Test-Driven Development (TDD), and many other software design approaches. But here is a new one I discovered: Defect-Driven Design (DDD).
How does it work?
You start with a feature that sounds easy but is really complicated. Then you assign it to a programmer who has a record of successfully delivering simpler programs of the same type, but who is still very new to Test-Driven Development. That way he will revert to his traditional code-only approach (no automated testing, and certainly no test-first). Be sure to pressure the programmer to finish quickly, and give him complicated requirements. Make sure to assign his business analyst to several other projects so that he does not have time to give the programmer the quality guidance he needs.
Since the architecture is similar to other programs he has successfully written without test driven development, he codes the program confident that it will be as successful as the others. Once he has “completed” it, the testers test it and start sending him defects. He fixes them one at a time, thinking each time that he is “done.” Eventually, the testers uncover a HUGE pile of defects, and you realize, “Uh-oh. It was not really simple after all.” All of those test cases turn out to be requirements that nobody ever gave the programmer in the first place.
This really happened recently. The program was a data transfer from an old system to a new system, as part of a conversion. Several data transfer programs had already been written successfully, using a traditional architecture based upon stored procedures, without automated tests. The actual data transfer program was NOT written using test driven design (TDD). If it had been, it would have been far more testable. When the testers reported a large number of defects, the team realized that it had completely missed the requirements.
So we took a step back and redesigned our approach. First, we gathered the requirements that were missed on the first attempt and confirmed that the available test cases were a good representation of them. Then we developed techniques to automate the testing, even with the existing traditional stored-procedure architecture, and we proceeded to follow good test driven design practices. Applying the test driven development approach has been very successful so far. The benefits could have come straight out of a textbook on test driven design. They include:
- By writing the tests first, we uncover ambiguities in the requirements, and we resolve them before we start coding.
- By translating the higher-level requirements into concrete test scenarios, we now have a more realistic understanding of the scope and magnitude of this feature. For example, a couple of abstract sentences of requirements may drive us to design several tests to verify that the requirement has been met.
- By producing automated tests, we are able to detect regressions as they occur and correct them promptly.
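The regression-detection benefit is worth making concrete. Here is a minimal sketch of what an automated test against a data-transfer program can look like. The table names and the `transfer_customers` function are invented for illustration, and `sqlite3` stands in for the real database and the stored-procedure call under test; the original project's schema and logic are not shown in the article.

```python
import sqlite3

def transfer_customers(conn):
    # Stand-in for the real stored-procedure call: copy active
    # customers into the new table, upper-casing their names.
    conn.execute(
        "INSERT INTO new_customers (id, name) "
        "SELECT id, UPPER(name) FROM old_customers WHERE active = 1"
    )

def test_active_customers_transferred():
    # Build a small, known starting state in an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE old_customers (id INTEGER, name TEXT, active INTEGER)")
    conn.execute("CREATE TABLE new_customers (id INTEGER, name TEXT)")
    conn.executemany(
        "INSERT INTO old_customers VALUES (?, ?, ?)",
        [(1, "alice", 1), (2, "bob", 0), (3, "carol", 1)],
    )

    transfer_customers(conn)

    rows = conn.execute("SELECT id, name FROM new_customers ORDER BY id").fetchall()
    # Only active customers should arrive, with names transformed.
    assert rows == [(1, "ALICE"), (3, "CAROL")]

test_active_customers_transferred()
```

Because the test builds its own starting data and checks the exact resulting rows, rerunning it after every change catches a regression the moment it is introduced, rather than weeks later in a testing cycle.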
This problem would never have happened in the first place if a test driven design process had been followed. We are using this story as an opportunity to re-energize our teams with respect to the value of test driven design. I have always preferred to have requirements expressed as tests, but after experiencing this, I am even more passionate about it. Others in my organization share this passion. That is why our organization has a standard that requires all features to have acceptance criteria. And we have deliberately created a template for acceptance criteria that reads exactly like a test scenario:
Given <initial conditions>
When <you perform the action to be tested>
Expect <the expected results>.
Each feature must have at least one such test-like acceptance criterion. In a typical feature, several tests (acceptance criteria) express the requirement. For those of us who enjoy test driven development because of the many benefits it brings, such test-like acceptance criteria provide a very easy transition to writing the automated tests that begin our development process. This approach works well even for people who will not write automated tests first. While that is inferior to strict TDD, it is still better than the vague and rambling requirements that are sadly common in our industry.
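The transition from acceptance criterion to automated test is almost mechanical: each clause of the template becomes a section of the test. A small sketch of that mapping, using an invented discount feature (the `Customer` class, `generate_invoice` function, and the dollar amounts are all hypothetical, not from the original project):

```python
from dataclasses import dataclass

# Hypothetical acceptance criterion, written in the template above:
#   Given   an order totaling $100 for a customer with a 10% discount
#   When    the invoice is generated
#   Expect  the invoice total to be $90

@dataclass
class Customer:
    discount_rate: float  # e.g. 0.10 for a 10% discount

def generate_invoice(customer, order_total):
    """Apply the customer's discount to the order total."""
    return order_total * (1 - customer.discount_rate)

def test_discounted_invoice_total():
    # Given: an order totaling $100 for a customer with a 10% discount
    customer = Customer(discount_rate=0.10)
    # When: the invoice is generated
    total = generate_invoice(customer, order_total=100.0)
    # Expect: the invoice total to be $90
    assert abs(total - 90.0) < 1e-9

test_discounted_invoice_total()
```

Because the acceptance criterion and the test share the same three-part shape, a reviewer can check the test against the requirement clause by clause, which is exactly the traceability that was missing in the failed first attempt.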
Of course, none of these standards matter if your staff will not follow them. And they will not follow them unless you give them the encouragement, support, training, and direction to do so. They certainly won’t learn to do it if they are told explicitly that we don’t have time to do it well, and that we will call all the missing requirements “defects” and fix them later.