Geeks With Blogs

Tim Murphy

Tim is a Solutions Architect for PSC Group, LLC. He has been an IT consultant since 1999, specializing in Microsoft technologies. Along with running the Chicago Information Technology Architects Group and speaking on Microsoft and architecture topics, he was also a contributing author of "The Definitive Guide to the Microsoft Enterprise Library".


Tim Murphy's .NET Software Architecture Blog: Adventures in Architecting and Developing .NET

Lately I have had quite an awakening to the realities of waterfall testing processes.  It has come in the form of writing tests for my current client as they prepare for a major release.

So what have I learned?  First, large development groups end up with large testing teams, which makes coordinating efforts within the group a challenge.  To overcome this I believe there need to be well-defined expectations and a single owner of the effort.

The second thing I have learned is that such testing generates volumes of documentation, all of which needs to be constantly revised and reviewed.

Of course, the ever-present question of our times is: why are you not using TDD?  To answer that, I think we need to break it down into further questions.

Digging into this issue brings two main questions to mind.  First, how do we assure business partners that thorough testing has been done?  Second, and this is very important, what does the environment we are in dictate?

The first question is exactly what the waterfall method is intended to address.  The main problem is that it is very costly.  If a missed requirement is found during testing, not only does the problem have to be corrected, but all of the testing documents need to be scanned for conditions that might be affected.  Cost is also increased by the fact that these are not automated tests.  There is no button that can be pushed to rerun all the tests.  Weeks can be spent doing a proper regression test.
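By contrast, an automated test suite is exactly that button.  As a minimal sketch, consider a hypothetical InvoiceCalculator class (the class name and the discount rule are my invention for illustration, not anything from the client's system) with a handful of assertions against it:

```csharp
using System;

// Hypothetical system under test -- stands in for any business rule
// that a waterfall test script would otherwise verify by hand.
public class InvoiceCalculator
{
    // Illustrative rule: orders of $100 or more get a 10% discount.
    public decimal Total(decimal subtotal) =>
        subtotal >= 100m ? subtotal * 0.90m : subtotal;
}

public static class InvoiceCalculatorTests
{
    public static void Main()
    {
        var calc = new InvoiceCalculator();

        // Each assertion is a regression check that reruns in milliseconds,
        // instead of a manual script that takes weeks to re-execute.
        Assert(calc.Total(50m) == 50m, "No discount below $100");
        Assert(calc.Total(100m) == 90m, "10% discount at $100");
        Assert(calc.Total(200m) == 180m, "Discount applies above $100");

        Console.WriteLine("All tests passed");
    }

    static void Assert(bool condition, string name)
    {
        if (!condition) throw new Exception("FAILED: " + name);
    }
}
```

In a real project these checks would live in a test framework such as NUnit rather than a console Main, but the point is the same: when a missed requirement turns up, you fix it and push the button again.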

For a consultant, the second question is a key consideration.  A contractor can advise, but the client still has the final say on what methodology will be used.  The best thing you can do is present the facts objectively in a way that highlights the benefits.

The one main drawback that I see with TDD is that while it self-documents for the developer, it does not do the same for the business stakeholder.  The stakeholder's one assurance is that they have been seeing the product in regular increments.  Tools may help address this concern, but there are no silver bullets.

I am sure I am oversimplifying the problem.  In the end I want to have my cake and eat it too.  I want to cover my butt with documentation, but I also want the benefit of finding out early when that documentation is not accurate.  To that end I will continue to learn from experiences such as this.

Posted on Friday, February 2, 2007 3:48 AM in Development

Comments on this post: Thinking About Testing

# re: Thinking About Testing
TDD tests are developer tests that aren't really meant to be business facing, and they're usually at a much lower level than the requirements. Do TDD, yes, but it's not a substitute for testing.

I think what you're looking for is automated acceptance tests, a la FitNesse, that can be and should be customer facing.
Left by Jeremy Miller on Feb 02, 2007 11:11 AM

# re: Thinking About Testing
Although Jeremy is correct that there are tools such as FitNesse out there to build the client's confidence, I have found that implementing those types of tools requires that you not only design your code to satisfy your requirements, but also design your logic to be testable by FitNesse, which requires some time that has to be accounted for. In other words, your design process will have to include some thought about implementing FitNesse testing. For example, it does require some knowledge of the fixture model. In addition, FitNesse has an unforgiving editing tool.

My point is that if you either do TDD or (if you're disciplined enough) write your tests afterwards (and I don't mean after the application is done), then you should have the confidence that things will work. However, you should also think about releasing in iterations. That way you begin to get the client used to getting involved early. Getting feedback as soon as you can is important. I'm currently experiencing this within an environment which was following an XP methodology. Ironically, we are now more agile than before, ever since we went away from XP. Not to say that we don't do TDD at times. Just not all the time.
Left by Rolo on Feb 11, 2007 4:44 AM


Copyright © Tim Murphy