Having worked on quite a few teams, either as the lead responsible for implementing TDD or as an Agile Coach, I've seen a recurring set of habits develop when teams first start using TDD. These are:
1. No test isolation
All of the functionality under test is exercised in one big test, and the tests are heavily over-specified.
How to fix: Spend some time pairing with the developers who are having the issue. Help explain why we want to break tests apart into smaller units, aka "UNIT" tests. Let them know that we do not want to test the entire class in one unit test. Just because it can be done doesn't mean it should be done. Ya know, I COULD drive into oncoming traffic, but it's just not a good idea. :)
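To make this concrete, here's a minimal sketch. The `Calculator` class and all names are hypothetical, not from any real project; the point is one small test per behavior instead of one mega-test.

```java
// Hypothetical class used only for illustration.
class Calculator {
    int add(int a, int b) { return a + b; }
    int divide(int a, int b) { return a / b; }
}

// Instead of one big test that exercises add, divide, and every error
// case at once, each behavior gets its own small test. When one fails,
// the test name alone tells you what broke.
class CalculatorTests {
    static void addsTwoNumbers() {
        if (new Calculator().add(2, 3) != 5) throw new AssertionError("add is broken");
    }
    static void dividesEvenly() {
        if (new Calculator().divide(10, 2) != 5) throw new AssertionError("divide is broken");
    }
}
```

In a real project these would be NUnit/JUnit test methods, but the shape is the same: small, isolated, one assertion of behavior each.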
2. Failure to recognize the TDD Mantra
Most new TDDers will go from Red to Green to - I'm done! Nope. I'm sorry guys, we have one more in that list - Refactor. Let's say it together: Red/Green/Refactor, Red/Green/Refactor :)
How To Fix: Help the developers realize that we refactor for a reason. We have a passing test, so we know what to expect from our results. Now let's clean up our code: maybe introduce some interfaces, change some variable names, introduce some fields, extract some methods, etc. Doing this lets us clean up the code with the confidence that our changes did not muck up the expected result. Run the unit tests after you refactor. If one breaks, oops, let's go figure out WHY it broke. We caught the error early, and it's a lot cheaper now. :)
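Here's a sketch of what the Refactor step looks like, using a hypothetical `PriceFormatter` (all names are mine). The key point: the test that was green before the extract-method refactoring stays green after it.

```java
// Hypothetical class; Red/Green produced a working but blobby format().
class PriceFormatter {
    String format(double price) {
        // Refactor step: the rounding logic was extracted into its own
        // well-named method. Behavior is unchanged, so the test that was
        // passing before this refactor still passes after it.
        return "$" + roundToCents(price);
    }

    private String roundToCents(double price) {
        return String.format(java.util.Locale.US, "%.2f", price);
    }
}
```

Run the tests immediately after each extraction or rename; a green bar is the proof that the refactor changed structure, not behavior.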
3. Insufficient Test Coverage, aka "TDD is my new Debugger"
The previous habit lends itself to this one. Developers will usually do just enough to get the test passing and then move on to the next task. They think of TDD as a new debugger.
How To Fix: Sit down with the developers who have this issue and explain: just because we can now unit test a method/object and step through it with a debugger doesn't mean TDD is the new debugger. That's not the point. Just because the test passes doesn't mean we're done. Just because your method passes when you hand it a valid object doesn't mean it's fully tested. What happens if I pass a null object into it? What happens if I pass a new'd up object with nothing instantiated? Have those cases been tested? Have we tested that our objects interact as we expect them to? Do we know if the Logger got called? How do we know, if we didn't test it? It's the classic question: how do we know it works if we don't have a test to prove it?
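Those questions translate directly into tests. A minimal sketch, with a hypothetical `Greeter` class (the names and the "stranger" fallback are my own choices, not a real API):

```java
// Hypothetical class used to illustrate edge-case coverage.
class Greeter {
    String greet(String name) {
        if (name == null || name.isEmpty()) {
            // The null/empty behavior is now decided and pinned by a test,
            // instead of being discovered in production.
            return "Hello, stranger";
        }
        return "Hello, " + name;
    }
}

class GreeterTests {
    static void run() {
        // The happy path alone is not enough...
        if (!new Greeter().greet("Ada").equals("Hello, Ada")) throw new AssertionError();
        // ...the "what if I pass null/empty?" questions become tests too.
        if (!new Greeter().greet(null).equals("Hello, stranger")) throw new AssertionError();
        if (!new Greeter().greet("").equals("Hello, stranger")) throw new AssertionError();
    }
}
```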
4. No concept of separation of concerns or IoC/dependency injection.
This is not a big deal, and I expect it of most new TDD converts. The first few tests you write while learning TDD usually don't need DI/IoC, but it's something that will help in the future and needs to be understood to really grasp TDD. It really comes into play when we start separating our concerns and breaking dependencies between objects. The refactoring that takes place here can get deep into DI/IoC sometimes.
How To Fix: Give them a good intro to interfaces and how they're used. Show them great articles on DI/IoC written by Martin Fowler, Jeremy D. Miller, and Ayende, and have them join mailing lists such as altdotnet to join in on (or just read) the discussions. On that list alone, DI/IoC discussions happen on a daily basis.
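A minimal sketch of what constructor injection buys you; every name here is hypothetical. Because `OrderService` depends on a `Logger` interface rather than a concrete class, a test can hand it a fake and answer the earlier question "do we know the Logger got called?"

```java
// The dependency is expressed as an interface, not a concrete class.
interface Logger {
    void log(String message);
}

// A hand-rolled fake for tests; no mocking framework needed to start.
class FakeLogger implements Logger {
    final java.util.List<String> messages = new java.util.ArrayList<>();
    public void log(String message) { messages.add(message); }
}

class OrderService {
    private final Logger logger;

    // The dependency is injected instead of new'd up inside the class,
    // which is exactly the coupling-breaking this habit is about.
    OrderService(Logger logger) { this.logger = logger; }

    void placeOrder(String item) {
        logger.log("placed: " + item);
    }
}
```

In production an IoC container (or plain wiring code) supplies the real logger; in tests, the fake records the interaction so we can assert on it.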
5. TDD Complaints
I've seen complaints such as:
- It takes too long to write the tests.
- I write more test code than real code.
- The tests are harder to write than the real code.
- How is this going to help me get my project done? I can just hit F5, run it, and tell if it works with my own eyes.
- Who cares if it passes a unit test? I'm the only person who works on this project.
How To Fix: This is one of those habits I've seen in every place I've coached. These are classic examples of someone who hasn't "seen the light" of TDD yet. In my experience, the easiest way to help them see the light is to pair with them while writing unit tests. They will quickly see why the unit tests help them and the team. Every single time I do this, I help the TDD newcomer realize that by writing this test they've exposed X bugs that they didn't think about before. It usually goes like this: "Ok, that test passes. What would happen if an empty string was passed into this? Would the DB accept an empty string?" Them: "Oh... good point, I didn't think of that." Instantly, lights start turning on.

Another thing that works great to help developers see the light is to make sure you're utilizing Continuous Integration, such as CCNET or TeamCity. Have them create a unit test for a section of code. Check it in. Watch it build and watch the unit tests pass. Then have them change the code under test so it returns a different result. Maybe it returned a user's name, such as "bob"; now have it return the user's name with some characters appended to the end, such as "bob123". Have them check it in and watch the build fail because the unit test failed. Explaining that this helps catch errors before the product is live is crucial. Once they have the "Ohhh!! I get it now" moment, they won't want to stop writing tests.
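The "bob" exercise above can be sketched in a few lines; the class and method names here are hypothetical. Change the return value to "bob123" without touching the test, and the test fails, which is exactly what the CI build would catch and report.

```java
// Hypothetical repository standing in for the code under test.
class UserRepository {
    String getUserName() {
        // Change this to "bob123" and leave the test alone: the test
        // below fails, and so does the CI build that runs it.
        return "bob";
    }
}

class UserRepositoryTest {
    static void returnsPlainUserName() {
        if (!new UserRepository().getUserName().equals("bob")) {
            throw new AssertionError("getUserName changed behavior");
        }
    }
}
```

Seeing the build go red from a one-character change, with the failing test naming the exact behavior that moved, is usually the moment the complaints stop.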