Lesson #3: I wanted unit tests.
I’ve been on a lot of projects where we took a very short-sighted view of the world: we planned for the initial release and did no planning for life after that. It was always blamed on a lack of time in the project. “When we get this done, we’ll do our documentation,” or “Next time, we’ll do it right and write some test cases.” It never works out that way. We all know it, and every time we utter those words, we know we’re lying to ourselves.
If you have enough time after a project to go back and write tests, your business is about to go down the crapper: the sales guys haven’t sold anything new, there’s no backlog of work to get done, and you sure as heck know that your customer, who didn’t plan enough time for you to get your work done in the first place, isn’t going to budget for you to go back and write tests they assumed you were writing all along. If you ever admit to a customer that you didn’t test all of your code, you’ll probably lose that contract. They assume you have tested everything.
I started buying into the idea of Test Driven Development last year on another project that involved building a dynamic search engine written entirely in LINQ to SQL. The engine sat over a huge database, and some of the combinations of search terms they wanted to use involved 23-table joins. There were 50+ possible search terms, range searches, boolean searches, ‘like’ searches, and enum searches. This all took place in a legacy .NET WinForms app built in VS2005. I built a VS2008 DLL with LINQ to SQL and began adding test cases, one at a time, as I added the capability to do each type of search.
The engine itself was rewritten 2 or 3 times as it evolved, sometimes for speed, sometimes for maintainability, but those test cases that I wrote on day 1 were still my measuring stick to see how close the new engine was to returning the right results after each refactoring. To this day, I still trust those cases, even though I haven’t touched that app in almost a year.
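To give a feel for what those measuring-stick tests looked like, here is a minimal sketch in MSTest. The engine, the data, and all the names are made up for illustration (the real engine ran its queries through a LINQ to SQL DataContext over the actual database), but the idea is the same: one test per type of search, each pinning an exact, known result set.

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Illustrative stand-in for the real engine, which built the same kind of
// query against a LINQ to SQL DataContext. Names and data here are made up.
public class Customer
{
    public int Id { get; set; }
    public string LastName { get; set; }
}

public class CustomerSearchEngine
{
    private readonly List<Customer> _customers;

    public CustomerSearchEngine(List<Customer> customers)
    {
        _customers = customers;
    }

    // One of the many supported term types: a 'like' search on last name.
    public List<int> FindByLastNameLike(string fragment)
    {
        return _customers
            .Where(c => c.LastName.Contains(fragment))
            .Select(c => c.Id)
            .ToList();
    }
}

[TestClass]
public class CustomerSearchEngineTests
{
    [TestMethod]
    public void LikeSearch_OnLastName_ReturnsExpectedIds()
    {
        // Arrange: a small, known data set plays the role of the test database.
        var engine = new CustomerSearchEngine(new List<Customer>
        {
            new Customer { Id = 1, LastName = "Smith" },
            new Customer { Id = 2, LastName = "Smythe" },
            new Customer { Id = 3, LastName = "Jones" }
        });

        // Act
        var ids = engine.FindByLastNameLike("Sm");

        // Assert: pin the exact result set, so every rewrite of the engine
        // has to keep returning the same answer.
        CollectionAssert.AreEquivalent(new List<int> { 1, 2 }, ids);
    }
}
```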
At the same time that project was going on, I was working on another project that had no tests. It was a large ASP.NET application with a couple of DLLs. The majority of the code was in the code-behind files of the pages, the user controls, and a few internal classes. It was a pain in the butt to test, and every time I opened the code, there were code smells I wanted to fix but was too scared to touch. I couldn’t afford to regression test the whole app just to make one small change. I’m still scared to work on it.
At the start of the Demo Showcase project, I did a lot of reading about Test Driven Development, looked into mock objects, and tried to get the team to buy into the concept. Unfortunately, I misunderstood a key premise of TDD and made promises to the team that would prove to be inaccurate. I thought that we could have the QA group take the specs and begin to write the test harnesses, and the developers would just have to make the tests turn green. I thought that MSTest and TFS would work together flawlessly and take away any excuses that tests were too hard to write, or that nobody knew what tests were out there. I thought everyone would see the value of mocks and stubs and have an epiphany, and that TDD would sell itself. All I had to do was sit back, show that my tests were working and how great my life was, and everyone else would be jealous and follow suit. Not so much.
It took a conversation with Glenn Block and Jim Newkirk at a NerdDinner in January to help me clear up a key misconception. The QAs could not, and should not, write the tests. At least not the first tests. The first tests were supposed to help the developer flesh out the functionality of the code. When I brought this change back to the group the next day, the QAs were relieved. The developers were disappointed.
The biggest problem we had with our TDD approach after that was depending on MSTest and TFS. MSTest in VS2008 (and 2005) has a really annoying bug (supposedly fixed in VS2010) relating to VSMDI files getting checked out and locked by developers or testers. I’d get our tests all organized and working, then the next thing I knew, the test lists were blank or out of date. I’d get upset with our team and wonder who kept wiping out all my work. It turned out that VS2008 kept creating new versions of the VSMDI file if it couldn’t get a lock on the file, and would jump you into a new version without telling you. Running a specific set of tests became a frustrating step, and people stopped doing it. And when they stopped running tests, they stopped writing them in the first place.
The other big issue I found was that if you wrote a bunch of tests too far in advance and left NotImplementedExceptions in the code, the light on the builds stayed red for days, weeks, or possibly months. People stopped trusting the tests, and stopped looking at them. I fell into this trap as well. I’m not sure how to fix this, except to flag tests that I know are not ready yet as ignored: write them, then ignore them until the code is ready to be worked on, then re-enable them. Put a TODO task in the code to remind yourself to come back and re-enable them, and never take a NotImplementedException out of the code without writing a test for it first.
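As a rough sketch of what I mean, here is the shape of an ignored, written-ahead test using MSTest’s [Ignore] attribute. The class and feature names are invented for the example:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ShippingCalculatorTests // hypothetical, not-yet-built feature
{
    // TODO: re-enable this test once ShippingCalculator is actually built.
    [Ignore]
    [TestMethod]
    public void InternationalOrder_AddsCustomsSurcharge()
    {
        // Written ahead of the code, but marked Ignore so the build light
        // stays green instead of sitting red for weeks on a
        // NotImplementedException.
        throw new NotImplementedException();
    }
}
```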
With regard to mocks and stubs, I chose Rhino Mocks, but I didn’t understand that there were two versions of it floating around, and I started with the older version, which, in my opinion, is much harder to use than the Arrange, Act, Assert style of the 3.5 version. And I didn’t truly understand how to use it correctly until I read Roy Osherove’s book ‘The Art of Unit Testing’ just last month. I highly recommend this book, and I have been pushing it down the throats of all the developers and QAs here.
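To show why the 3.5-style syntax reads so much better, here is a small Arrange, Act, Assert test using Rhino Mocks 3.5. The service and its collaborators are hypothetical, purely to illustrate the shape of the test:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Rhino.Mocks;

// Hypothetical collaborators, invented just for this example.
public class Order { public string CustomerEmail { get; set; } }
public interface IOrderRepository { bool Save(Order order); }
public interface IEmailSender { void SendReceipt(string address); }

public class OrderService
{
    private readonly IOrderRepository _repository;
    private readonly IEmailSender _email;

    public OrderService(IOrderRepository repository, IEmailSender email)
    {
        _repository = repository;
        _email = email;
    }

    public void Place(Order order)
    {
        if (_repository.Save(order))
            _email.SendReceipt(order.CustomerEmail);
    }
}

[TestClass]
public class OrderServiceTests
{
    [TestMethod]
    public void Place_WhenSaveSucceeds_SendsReceipt()
    {
        // Arrange: stub the repository, mock the collaborator we care about.
        var repository = MockRepository.GenerateStub<IOrderRepository>();
        var email = MockRepository.GenerateMock<IEmailSender>();
        repository.Stub(r => r.Save(Arg<Order>.Is.Anything)).Return(true);
        var service = new OrderService(repository, email);

        // Act
        service.Place(new Order { CustomerEmail = "test@example.com" });

        // Assert: verify the interaction we expected.
        email.AssertWasCalled(e => e.SendReceipt("test@example.com"));
    }
}
```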
Finally, trying to break the cycle of developers not thinking they have enough time to write tests is a huge effort. It does take time to write good tests. It takes a lot more time when you don’t know what you are doing, when the technology and the concepts are all new. Learning TDD and Rhino Mocks on top of the other new technologies that were part of this project became a daunting task for everyone. Learning TDD got pushed to the bottom of the list as we strove for results in the early stages of the project, and the unit tests suffered. Later in the project we came back around and made a few more valiant attempts at it, but we definitely could have done better, and I know we will on the next one. It’s about educating ourselves, a little at a time, on processes that do and don’t work, and making adjustments as we go. We tried a lot of big changes on this project, and it may have been too much, too fast. But TDD is definitely something I am not giving up on, and we will use it wherever we can in the future.