Test-driven development (TDD) is a simple concept. The TDD cycle is: write a small test and ensure it fails; write just enough code and ensure all tests pass; clean up any new deficiencies in code quality; finally, ensure all tests still pass. Repeat. Beyond this definition, there are few rules for how to apply TDD, only nuances such as:
- Don’t check in code unless all tests are passing.
- Test everything.
The latter rule is simply a restatement of the technique.
Learning to do TDD necessitates learning how to build mocks or stubs, techniques that involve faking out collaborator code to enable easy unit testing**. Although mocking isn’t a day-one technique, neither is it an advanced concept. Mocking is a workhorse for many programmers, a tactic that is an essential part of learning how to test effectively. Mocking concepts are taught as part of any comprehensive introductory training on TDD.
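As a minimal Python sketch of the idea (the `PriceCalculator` and rate-service names are hypothetical, purely for illustration): a stub stands in for a slow or volatile collaborator so the class under test can be exercised in isolation.

```python
# A sketch of faking out a collaborator to enable easy unit testing;
# PriceCalculator and the rate service are hypothetical names.
from unittest.mock import Mock

class PriceCalculator:
    """Computes a total using a collaborator that might be slow or remote."""
    def __init__(self, rate_service):
        self.rate_service = rate_service

    def total(self, amount):
        # The volatile lookup is delegated to the collaborator.
        return amount * self.rate_service.current_rate()

# In a unit test, the real service is replaced with a stub:
stub_service = Mock()
stub_service.current_rate.return_value = 1.1
calc = PriceCalculator(stub_service)
result = calc.total(100)   # uses the stubbed rate; no real service needed
```

The test runs fast and deterministically because nothing slow or external is ever touched.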
I’m often asked for training on advanced TDD. “What do you consider advanced TDD?” I ask. Typical answers are, “how do I test databases,” “how do I test GUIs,” “how do I test multithreaded code,” and so on.
Yet, these aren’t advanced TDD challenges. Solving them involves applying fundamental TDD concepts. The basic TDD technique itself promotes decomposing larger problems, such as how to test database interaction, into smaller chunks. Why? Because smaller chunks are easier to test-drive! Interacting with databases can be broken into smaller problems, such as:
- Generating proper SQL strings
- Obtaining database connections
- Loading configuration information
- Executing SQL statements
- Populating objects from query results
Each of these smaller concerns can easily be unit tested by a developer with a knowledge of basic TDD concepts. Note that some level of integration testing is still necessary in addition to the unit tests. Integration tests ensure that all these database “subsystems” are tied together correctly. They also ensure that the application’s understanding of the database stays in sync with its actual definition.
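For instance, the first concern, generating proper SQL strings, can be unit tested without touching a database at all. Here is a Python sketch; `build_select` is a hypothetical function, not code from any particular system.

```python
# One decomposed concern: building SQL strings is pure string logic,
# testable with no live database. The function is illustrative only.
def build_select(table, columns, where=None):
    """Builds a simple SELECT statement from its parts."""
    sql = "SELECT %s FROM %s" % (", ".join(columns), table)
    if where:
        sql += " WHERE %s" % where
    return sql

# Plain unit tests exercise the string logic directly:
assert build_select("users", ["id", "name"]) == "SELECT id, name FROM users"
assert build_select("users", ["id"], "id = 1") == "SELECT id FROM users WHERE id = 1"
```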
The decomposition concept applies equally well to other complex challenges. Any TDD programmer worth his or her salt can tackle any of these challenges with a bit of tenacity. Ultimately, I don’t believe there is such a thing as “advanced TDD.”
No Advanced TDD?
TDD is a simple technique. It’s not mindless, however, and is something that most developers continue to refine the more they practice it. Learning to do TDD well is much like learning to play the piano. Any beginner is capable of hitting the same notes that the expert hits. Yet the difference in the efficiency of pressing the keys, as well as in the quality of the output, is marked.
Developers who commit to doing TDD become masters by treating it as a discipline. The secrets of becoming a master often boil down to applying simple, common-sense notions. The difference between a master and a journeyman is that a master constantly seeks out and constantly applies these small but valuable secrets.
The remainder of this article contains my collection of tips, elements that I believe have helped me take my practice of TDD to a more advanced level. I’m not sure I’ve mastered it yet!
When I do TDD, I keep in mind three primary ideals: the notion of specification by example, the relationship between testability and design, and incrementalism. I recently wrote about the relationship between testability and design. The interest in specification by example is significant: tests are most sustainable when they not only verify code but also act as highly readable documents that describe class capabilities. Finally, TDD’s idea of incremental growth of a quality system is integral to agile’s notion of consistently delivering new functionality every week or two. This desire for incrementalism can distill all the way down to the most basic code movements and additions.
I also adhere to one overriding ideal, that of simplicity. TDD is a simple but rich technique. In my experience, efforts to introduce advanced techniques and complexity into a TDD-based process have provided only little value for significant cost.
Moving from beginner to master in any arena requires practice. Expert pianists have invested years of effort and diligent practice. Software development is no different. To grow in skills and capabilities, developers must practice their craft.
Unfortunately, most developers practice only “on the job.” They figure that it’s enough to build things once. Perhaps they are right; every experience can bring improvement and progress toward becoming a master. But the best programmers understand that there’s always a better way to do things. Each reworking of a solution is a step toward insights that might not come otherwise.
I highly recommend finding a personal problem, something small enough to complete in at most a few hours, but not so trivial that there’s room for only small improvement. Ron Jeffries has rebuilt the bowling game many times. I’ve built components of the “interacting with databases” problem many times. Each time, I’ve improved upon my solution and its design, upon my technique, and upon my effectiveness with the tools I use.
Two heads are better than one. There’s no better way to learn a new technique, and improve upon an existing one, than to work through it with, and get feedback from, a capable partner. Make sure that you switch partners often enough to get more than one viewpoint.
If tests are to act as specifications by example, they must be readable to someone other than their author. Pairing is a great technique to ensure that at least one other person understands a test. A disengaged third party, however, is a great resource: Ask someone to quickly read each new test, paraphrasing it. If the third party understands it, ship it! Otherwise, clean it up and try again.
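A test written as a specification by example might read like this Python sketch (the `Account` class and the test name are hypothetical): a disengaged reader should be able to paraphrase it from the name and body alone.

```python
# A test meant to be read as a specification by example.
# Account and the test name are illustrative, not from any real system.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

def test_deposit_increases_balance_by_the_deposited_amount():
    account = Account()
    account.deposit(50)
    assert account.balance == 50

test_deposit_increases_balance_by_the_deposited_amount()
```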
Many developers think they’re smart enough to know what need not be tested. Some developers will make excuses for what they deem trivial; others will give up on tougher testing challenges. At least two significant downsides of these decisions exist: The untested code will not be adequately documented, and it will not support high-confidence refactoring. Each of these downsides can create significant long-term costs.
Not all tests must be unit tests. You should use integration tests if necessary to fill holes in the unit tests. Minimize these, however, because they generally are more volatile and costly to maintain.
** Some rare few developers have the luxury of building applications that don’t talk to databases, external APIs, or other such slow/volatile collaborators. These developers can proudly say they don’t have to use mocks.
The TDD Cycle
Ensure that you follow the rule of first running your test through your test tool, to observe it fail. Skipping this step takes you out of the TDD rhythm. I’ve seen people burn an hour because they weren’t recompiling before running their tests. They thought the constant green bar was a good thing.
Take smaller steps than you are taking today. Rapid feedback is part of what makes TDD work. The shorter the time between introducing a problem and receiving the corresponding negative feedback, the easier the problem is to correct, and thus the more rapidly a quality solution can be built.
Adhere to the ten minute rule: If you haven’t seen a green bar in the last ten minutes, your solution is going awry. Discard code until you revert to the last green bar, and start over again. Take smaller steps this time.
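As a sketch of what smaller steps look like in practice (Python; the fizz example is illustrative, not from this article): pass the first assertion with the simplest possible code, then let the next assertion force the generalization.

```python
# Small TDD steps: the first assertion passes with a hard-coded result,
# and the next assertion drives the general solution.

# Step 1: simplest code that passes the only test so far.
def fizz_v1(n):
    return "fizz"            # hard-coded: enough for the single test

assert fizz_v1(3) == "fizz"  # green bar within minutes, not hours

# Step 2: a second assertion would fail against v1, forcing generalization.
def fizz_v2(n):
    return "fizz" if n % 3 == 0 else str(n)

assert fizz_v2(3) == "fizz"
assert fizz_v2(4) == "4"
```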
Run All the Tests
Don’t skimp on the number of tests you run for each change you make. Many developers will run only the tests for the current class they’re developing. Others will run only the tests for the current package. Few run “all the tests, all the time,” as the TDD mantra goes. It’s seemingly faster this way.
But, not running all those other tests can temporarily mask problems. The longer it takes before you find out that you’ve introduced a problem in another area of the system, the longer it generally takes to resolve. Further, the insistence on running all the tests may go a long way toward ensuring that the tests do run fast, and toward ensuring that they are closer to true unit tests.
Keep the Build Green
I’ve seen some teams allow developers to check in code without running a complete suite of tests. The premise is that the continuous integration server will kick off soon enough, notifying the team so that any mishaps can be corrected. Although this may save time for the individual, everyone else who happens to check out the offending code loses time whenever the integrated code is broken. Further, any effort the offending developer expends between checkin and failure notification is often wasted.
The mark of an expert TDD developer is not so much the quality of their tests as the vehemence with which they attack problems in code during the refactoring portion of the TDD cycle. Newly introduced duplication and unexpressive code are corrected immediately; the master developer doesn’t let the code base get any worse. A single instance of two duplicated lines isn’t too trivial to eliminate. A construct’s name can often be improved, and that might happen several times.
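Even a two-line case is worth attacking. A Python sketch, with hypothetical report functions, of the kind of immediate extraction this implies:

```python
# Before refactoring, two functions repeated the same formatting:
#
#   def report_title(t):  return "== " + t.upper() + " =="
#   def summary_title(t): return "== " + t.upper() + " =="
#
# Immediately after the green bar, the duplication gets one named home:
def banner(text):
    """Single, well-named home for the shared formatting."""
    return "== " + text.upper() + " =="

def report_title(t):
    return banner(t)

def summary_title(t):
    return banner(t)

assert report_title("sales") == "== SALES =="
assert report_title("sales") == summary_title("sales")
```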
Don’t Forget OO 101
Fundamental design concepts, such as the notions of high cohesion and low coupling, aren’t discarded when practicing TDD. The refactoring portion of the TDD cycle allows the more experienced developers to take advantage of all such design tools and concepts in their background.
Heed Cohesion in Tests
A single unit test is a test case—a single scenario. A given public method might have a half dozen or more tests around it. Tests should be short and concise, and should verify one thing; their names should accurately describe the goal of each test. Too many asserts indicate either a test or a design that’s doing too much.
Don’t hesitate to split a test class into two, if that allows you to take the most advantage of setup common to a subset of the tests.
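A Python sketch of such a split, assuming a hypothetical Inventory class: each test class keeps only the setup its own scenarios need.

```python
# Splitting a test class so each half shares only its own setup.
# Inventory and the test classes are illustrative names.
import unittest

class Inventory:
    def __init__(self, items=None):
        self.items = list(items or [])

    def count(self):
        return len(self.items)

class EmptyInventoryTest(unittest.TestCase):
    def setUp(self):
        self.inventory = Inventory()          # common to "empty" scenarios

    def test_count_is_zero(self):
        self.assertEqual(0, self.inventory.count())

class StockedInventoryTest(unittest.TestCase):
    def setUp(self):
        self.inventory = Inventory(["hammer", "saw"])   # "stocked" scenarios

    def test_count_reflects_items(self):
        self.assertEqual(2, self.inventory.count())
```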
Rename Tests Continually
Don’t dwell on the first name for a test, because you should revisit it frequently. Upon completing the test, revisit its name, and improve upon it. Revisit the name within the context of other tests for the same class, and ensure that the names are consistent. Consider naming your tests using BDD (Behavior-Driven Development) naming conventions.
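A before-and-after Python sketch of such renaming (the Stack example and test names are hypothetical):

```python
# Revisiting test names toward behavior-describing, BDD-style conventions.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def is_empty(self):
        return not self._items

# First-draft name, written in the heat of the moment:
def test_stack_1():
    assert Stack().is_empty()

# Revisited names, each stating the behavior the test specifies:
def test_a_new_stack_is_empty():
    assert Stack().is_empty()

def test_a_stack_with_one_push_is_not_empty():
    s = Stack()
    s.push("x")
    assert not s.is_empty()
```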
Don’t Overuse Mocks
Mocks by definition violate encapsulation: They require the test client to know a target class’s implementation details. Used judiciously, mocks are an extremely valuable tool. Used extensively, the violation of encapsulation can actually inhibit refactoring: Changes that merely move implementation specifics around can break tests, requiring developers to take the time to fix them. Faced with that friction, many developers choose not to refactor. Making refactoring more difficult is a rapid path to a rigid, lower-quality system.
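One way to keep mocks judicious is to mock only at out-of-process boundaries and verify everything else through observable state. A Python sketch with hypothetical Order and mailer names:

```python
# Judicious mocking: only the out-of-process boundary (a mailer) is mocked;
# the rest is verified through state. Names are illustrative.
from unittest.mock import Mock

class Order:
    def __init__(self, mailer):
        self.mailer = mailer
        self.confirmed = False

    def confirm(self, address):
        self.confirmed = True                          # state, testable directly
        self.mailer.send(address, "Order confirmed")   # boundary interaction

mailer = Mock()
order = Order(mailer)
order.confirm("a@example.com")

assert order.confirmed                       # state-based check survives refactoring
mailer.send.assert_called_once_with("a@example.com", "Order confirmed")
```

Because only the boundary is mocked, internal refactorings that shuffle implementation details are far less likely to break these tests.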
Master Good Tools
Although this has little to do with TDD, I’ve seen too many TDD developers struggle with their tools. This includes editors/IDEs, languages, build tools, operating environments, the keyboard itself, and so on. Most importantly, a developer must master his or her preferred development environment. Pairing can allow developers to learn tips and shortcuts that they might not learn otherwise. The rapid cycles in TDD are most effective when the programmer is capable of rapid coding through tool mastery.
Use a real tool, and take advantage of it. Eclipse and vi are real tools when mastered. Notepad is not. Generic programmer’s editors like UltraEdit are good, but you can usually do better.
Don’t Code for the Future
Yes, you will need a std::vector in about five minutes, once you’ve gotten the current assertion to pass. For now, a simple counter is sufficient. Sometimes the need for the vector never comes, and adding it early would only have introduced unnecessary complexity (and thus cost) to the system. More importantly, by deferring it you adhere to the true incremental nature of TDD, which forces you to write more assertions in order to drive specific solutions into generic ones.
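A Python sketch of this progression (the basket example is illustrative): the counter survives until an assertion actually demands the collection.

```python
# Letting assertions drive the generalization: a counter suffices until a
# test actually needs the items themselves.

# Step 1: the only assertion so far needs a count, so a counter is enough.
class Basket_v1:
    def __init__(self):
        self.size = 0

    def add(self, item):
        self.size += 1

b = Basket_v1()
b.add("apple")
assert b.size == 1

# Step 2: only when a test asks for the items does the collection appear.
class Basket_v2:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    @property
    def size(self):
        return len(self.items)

b2 = Basket_v2()
b2.add("apple")
assert b2.size == 1
assert b2.items == ["apple"]
```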
TDD is not a mindless technique. Instead of worrying about the near or distant future, you’re best off thinking, and hard, about the present. Relevant actions include digging hard to ferret out all of the code’s problems in the refactoring portion of the TDD cycle.
As you work, you’ll no doubt be reminded of things that you must take care of at some point in the future. Keep a simple to-do list, whether it be in comments or on a napkin. You might check in these to-do items, but resist letting them ship past an iteration.
Never Be Blocked!
Bob Martin repeated this mantra at Agile2007, in his talk about professionalism in software development. Agile and TDD promote the notion that you can build solutions incrementally, without complete details in place before you start. Per Uncle Bob, you can use abstractions and decoupling to avoid waiting for complete definitions. What’s important is to get the process, and its feedback loops, started.
Techniques are changing constantly, and new tools arise on a weekly basis. Keeping up is difficult, but a true professional does what he or she can to stay current. There are wonderful books, magazines, web-zines, podcasts, Yahoo! groups, and of course Internet searches at your disposal.
Work as Part of a Team
Good teams learn how to get on the same page and understand each other. Developers on such a team look for standards that can help remove barriers to understanding, and that can minimize masked problems. Tactics such as pairing and other forms of review can help. The best teams learn about each other and the systems through frequent communication.
TDD challenges experienced developers who have coded in a less effective fashion for their entire career. Learning requires an open mind, one willing to accept that there might be a better way to do things. In fact, the emphasis on feedback means that there will be continual learning and improvement. TDD masters use failures as opportunities to learn. They don’t fret over throwing away bits of code that represent failed approaches.
Practice, Practice, Practice
How do you get good at TDD? Build a firm understanding of the basics of TDD. Do it, and practice it. Repeat.
About the Author
Jeff Langr is a veteran software developer celebrating his 25th year of professional software development. He’s authored two books and dozens of published articles on software development, including Agile Java: Crafting Code With Test-Driven Development (Prentice Hall) in 2005. You can find out more about Jeff at his site, http://langrsoft.com, or you can contact him via email at jeff at langrsoft dot com.