Geeks With Blogs
Mark Pearl


The team I have been working with lately uses NDBUnit to control the test database state for integration tests. Today I thought I would blog briefly on one of the issues we had, and on how we plan to structure our solution and work with NDBUnit going forward.

Let me say upfront that I am no expert with NDBUnit, so if you are an expert with this tool and can see glaring mistakes in this post, please add a comment (I would appreciate that). That said, let me highlight the biggest issue we have had so far.

Initially with NDBUnit, the approach we took was to have one central snapshot of the database (a central xsd file and xml file) that all our tests could call to set the database to a known state. This seemed to work fine when we began using NDBUnit; however, after a few months, as our tests covered more and more of our database, the xsd layout became more and more detailed (see the blurred pic below, which gives you a basic idea of just a portion of what the xsd looked like).


[Screenshot: a portion of the central xsd open in Visual Studio]

The issue with this was that every time we had a problem with NDBUnit we were uncertain where the problem was, because of the complexity of the data schema… it had in effect (for our team at least) become too complex, an all-encompassing monster. In addition, we were making use of Proteus, which abstracted a fair amount of the work away, but again the team felt it had abstracted too much away. This has led to my suggested solution below, which really is a partition and isolate strategy.

Suggested Solution – Partition and Isolate

Instead of having an all-encompassing NDBUnit file, we have decided to isolate sections of our database for NDBUnit. These are really intended for smaller integration tests that exercise specific routes of access to the database.

First, when creating the integration tests, we need some sort of grouping mechanism. In our solution we decided to group integration tests using folders, so the following structure makes sense for a specific set of integration tests: a folder named after the section under test, with the related files under that folder (see pic below).

Grouping Tests


Whenever we use NDBUnit, we include the xml and xsd files directly in the folder, with the naming convention:

  • [name of test] + Database.xsd
  • [name of test] + TestData.xml

So, if we had a test called ApplicationVersionLogRepository:

  • the test file would be called ApplicationVersionLogRepositoryTest.cs
  • the xsd file would be called ApplicationVersionLogRepositoryDatabase.xsd
  • the xml file would be called ApplicationVersionLogRepositoryTestData.xml
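Putting the convention together, the folder layout for this test would look something like the following sketch (the parent folder name here is illustrative, not from our actual solution):

```
IntegrationTests/
    ApplicationVersionLogRepository/
        ApplicationVersionLogRepositoryTest.cs
        ApplicationVersionLogRepositoryDatabase.xsd
        ApplicationVersionLogRepositoryTestData.xml
```

Everything a test needs to set up its own database state lives next to the test itself.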

Within the xsd file we only include the tables related to what we are testing in that instance. In this particular test all I am checking is whether the data is being saved to the ApplicationVersionLog table, so my xsd file would look as follows…
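A minimal single-table xsd along these lines might look like the following sketch (the element and column names are assumptions based on the ApplicationVersionLogInfo constructor used later in this post):

```xml
<?xml version="1.0" encoding="utf-8"?>
<xs:schema id="ApplicationVersionLogRepositoryDatabase"
           xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="ApplicationVersionLogRepositoryDatabase">
    <xs:complexType>
      <xs:choice minOccurs="0" maxOccurs="unbounded">
        <!-- Only the one table this test touches -->
        <xs:element name="ApplicationVersionLog">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Id" type="xs:string" />
              <xs:element name="EnvironmentName" type="xs:string" />
              <xs:element name="ApplicationName" type="xs:string" />
              <xs:element name="MachineName" type="xs:string" />
              <xs:element name="UserName" type="xs:string" />
              <xs:element name="VersionNo" type="xs:string" />
              <xs:element name="DateCreated" type="xs:dateTime" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:choice>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

In practice you would generate this from a typed DataSet in Visual Studio rather than hand-write it, but the point is that it contains a single table.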


This is a lot simpler than the original xsd file – one table versus potentially hundreds. Likewise, the xml file contains only data relevant to the test…

<?xml version="1.0" standalone="yes"?>
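The full file is not reproduced above; a minimal test-data file matching this layout might look like the following sketch (the root and table element names are assumptions matching the xsd sketch, and the row values mirror what the load test asserts):

```xml
<?xml version="1.0" standalone="yes"?>
<ApplicationVersionLogRepositoryDatabase>
  <!-- One seed row, loaded by ShouldBeAbleToLoadSavedItems -->
  <ApplicationVersionLog>
    <Id>00000000-0000-0000-0000-000000000001</Id>
    <EnvironmentName>EnvironmentName</EnvironmentName>
    <ApplicationName>ApplicationName</ApplicationName>
    <MachineName>MachineName</MachineName>
    <UserName>UserName</UserName>
    <VersionNo>VersionNo</VersionNo>
    <DateCreated>2011-12-30T09:00:00</DateCreated>
  </ApplicationVersionLog>
</ApplicationVersionLogRepositoryDatabase>
```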

Compare this to the 700 lines of xml in our original setup.

And there you go… you would apply this pattern to every integration test where NDBUnit is used. The benefit of this approach is that you no longer have a heart attack when an error occurs and you think it is somewhere within the realms of NDBUnit, as everything is simplified and manageable to consume.

Taking Proteus out of the Equation

Another aspect we took out of the equation was Proteus. Not that I have any strong feelings against Proteus, but as a team we felt it abstracted too much away. Without it, our integration tests have a setup section followed by the specific tests, and look much like the following (using NUnit)…

using System;
using NDbUnit.Core;
using NDbUnit.Core.SqlClient;
using NUnit.Framework;

[TestFixture]
public class ApplicationVersionLogRepositoryTests : IntegrationDatabaseUnitTestBase
{
        private IApplicationVersionLogRepository _repository;
        private bool _result;

        [SetUp]
        public void BeforeEach()
        {
            // Reset the database to a known state before each test
            SetupTestDatabase();
            // _container comes from the IntegrationDatabaseUnitTestBase base class
            _repository = _container.Resolve<IApplicationVersionLogRepository>();
        }

        private void SetupTestDatabase()
        {
            // DatabaseConnectionString also comes from the base class
            INDbUnitTest database = new SqlDbUnitTest(DatabaseConnectionString);
            database.ReadXmlSchema(@"ApplicationVersionLogRepositoryDatabase.xsd");
            database.ReadXml(@"ApplicationVersionLogRepositoryTestData.xml");
            database.PerformDbOperation(DbOperationFlag.CleanInsertIdentity);
        }

        [Test]
        public void ShouldBeAbleToSaveData()
        {
            var guidId = new Guid("00000000-0000-0000-0000-000000000002");
            var applicationVersionLogInfo = new ApplicationVersionLogInfo(guidId, "EnvironmentName", "ApplicationName", "MachineName", "UserName", "Version", DateTime.Now);
            _result = _repository.SaveItem(applicationVersionLogInfo);
            Assert.That(_result, Is.True);
        }

        [Test]
        public void ShouldBeAbleToLoadSavedItems()
        {
            var guidId = new Guid("00000000-0000-0000-0000-000000000001");
            var loadedItem = _repository.GetItem(guidId);

            Assert.That(loadedItem.Id, Is.EqualTo(guidId));
            Assert.That(loadedItem.EnvironmentName, Is.EqualTo("EnvironmentName"));
            Assert.That(loadedItem.ApplicationName, Is.EqualTo("ApplicationName"));
            Assert.That(loadedItem.MachineName, Is.EqualTo("MachineName"));
            Assert.That(loadedItem.UserName, Is.EqualTo("UserName"));
            Assert.That(loadedItem.VersionNo, Is.EqualTo("VersionNo"));
        }
}

Is this the way to go?

The question needs to be asked, is this the way to go?

The benefit of this approach is simplicity. However, I can also see a downside: when you have far-reaching integration tests that exercise a large section of the database, you might have extra work. Also, if you have a number of tests covering large sections of your database, each using a different xsd file, then when your schema changes you will have some work involved in updating them all.

From my side it is still early days to say which approach is best. I can see a situation where we might end up with a combined approach: a central xsd for mega tests, and smaller xsd's for smaller tests.

Any suggestions / ideas would be much appreciated.

Posted on Friday, December 30, 2011 10:07 AM C# , Useful Tools | Back to top

Comments on this post: Working with NDBUnit – my suggested project structure for removing complexity.

# re: Working with NDBUnit – my suggested project structure for removing complexity.
Hello and thanks for your great article.
I have a question and I want an advice about my project.
I have a big database (SQL Server), and I am working with VS2010. I want to test my database, but I don't know how to go about this work. Please help me with this subject. Excuse me, I couldn't find your mail.
Left by M.Bagheri on Jan 03, 2012 5:38 PM


Copyright © MarkPearl