A little while back I was trying to add some automated unit tests to a continuous build/deployment solution, and doing so brought me up against a structural issue in the way the solution had been designed. A full solution build produced a set of output folders, one for each project. Some of these projects additionally contained test assemblies; some didn't, but where test assemblies were present, I clearly wanted to run the tests inside them.
When working within a build pipeline like this, I find it very helpful to stage my content in a way that's appropriate to each build activity: staging checked-out content for compilation, public-facing assemblies for documentation, assemblies for unit testing, web resources for deployment, installables along with external dependencies (release notes, change sets) for archiving, and so on. Each staging area contains the minimal resources required to complete a specific build activity, decoupling the various activities and making it easy to accommodate slight differences in the way things need to be laid out and configured.
Here it wouldn't have made sense to flatten the directory structure and copy all the unit tests and assemblies into a single folder. Firstly, the directory structure was significant; secondly, there were instances of late binding through dynamic assembly loading, so it was important not to make assumptions about which assemblies were present. I knew, however, that every test assembly had the suffix ".Tests.dll", so this was sufficient to let me stage only the output folders that contained test assemblies, ignoring all the others. I then copied in the NUnit and Moq dependencies before running the NUnit tests via the console runner.
I opted for target batching, using the SDC CopyFolder task to recursively copy each folder tree containing files that matched a specified filter. Initially I tried to achieve the same effect using MSBuild's Copy task, without batching. The problem was that while I could transform the item collection of files into a collection of folders, the Copy task only accepts file (rather than folder) paths as input. Meanwhile, the SDC CopyFolder task implements its Source and Destination properties as strings rather than ITaskItems, so target batching was the only way to iterate through a collection of folders.
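For concreteness, my first (non-batching) attempt looked roughly like this; the property and item names here are hypothetical, not from the original script. The transform from files to their containing folders works, but the Copy task then rejects the folder paths:

```xml
<ItemGroup>
  <!-- Hypothetical $(OutputRoot): the root of the solution's build output -->
  <TestAssemblyFiles Include="$(OutputRoot)\**\*.Tests.dll"/>
  <!-- Transforming the files into their containing folders is easy enough... -->
  <TestAssemblyFolders Include="@(TestAssemblyFiles->'%(RootDir)%(Directory)')"/>
</ItemGroup>
<!-- ...but Copy expects files in SourceFiles, so folder paths fail here -->
<Copy SourceFiles="@(TestAssemblyFolders)" DestinationFolder="$(TestRoot)"/>
```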
<!-- Create test directory -->
<MakeDir Directories="$(TestRoot)"/>
<!-- Copy NUnit files locally -->
<ItemGroup>
  <PrerequisiteAssemblies Include="$(NUnitToolsPath)**\*.exe;$(NUnitToolsPath)**\*.dll;$(NUnitToolsPath)**\*.config" Exclude="$(NUnitToolsPath)**\*.tests.dll"/>
</ItemGroup>
<Copy SourceFiles="@(PrerequisiteAssemblies)" DestinationFolder="$(TestRoot)"/>
<!-- Set up item groups later used in target batching; we want to copy recursively every folder that contains a test assembly -->
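The item-group definitions themselves were trimmed from the snippet; a minimal sketch of what they might look like, assuming a hypothetical $(OutputRoot) property pointing at the solution's build output (the SrcFoldersToCopy/DstFoldersToCopy names match the target below, everything else is illustrative):

```xml
<ItemGroup>
  <!-- Every test assembly anywhere under the build output -->
  <TestAssemblies Include="$(OutputRoot)\**\*.Tests.dll"/>
  <!-- The folders that contain them: these drive the target batching -->
  <SrcFoldersToCopy Include="@(TestAssemblies->'%(RootDir)%(Directory)')"/>
  <!-- Corresponding destinations under the staging root, for the target's Outputs -->
  <DstFoldersToCopy Include="@(TestAssemblies->'$(TestRoot)\%(RecursiveDir)')"/>
</ItemGroup>
```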
<Target Name="CopyFolderTreesWithTestAssemblies" Inputs="@(SrcFoldersToCopy)" Outputs="@(DstFoldersToCopy)" DependsOnTargets="SetupUnitTests">
  <!-- Copy recursively the contents of any folder that contains a test assembly -->
  <CopyFolder Source="%(SrcFoldersToCopy.FullPath)" Destination="$(TestRoot)"/>
</Target>
<Target Name="RunUnitTests" DependsOnTargets="CopyFolderTreesWithTestAssemblies">
  <!-- Invoke the NUnit console runner against the staged test assemblies -->
</Target>
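The body of that target only needs to hand the staged assemblies to the console runner. A minimal sketch, assuming the staged test assemblies can be re-globbed under $(TestRoot) and that nunit-console.exe was among the prerequisites copied there earlier (the item name is my own invention):

```xml
<ItemGroup>
  <!-- Hypothetical: pick up every test assembly that was staged above -->
  <StagedTestAssemblies Include="$(TestRoot)\**\*.Tests.dll"/>
</ItemGroup>
<!-- Quote each path and pass the lot to the runner in one invocation -->
<Exec Command="&quot;$(TestRoot)\nunit-console.exe&quot; @(StagedTestAssemblies->'&quot;%(FullPath)&quot;', ' ')"
      WorkingDirectory="$(TestRoot)"/>
```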
It also helps me answer the question "Why/when should one use target batching?" One answer: "Use target batching when you need tasks with simple input types to process item collections."
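To make that answer concrete, here is a minimal, self-contained project (all names invented) showing the pattern: the Message task takes a simple string, not an item collection, but batching the target over %(Thing.Identity) runs it once per item:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="ShowEach">
  <ItemGroup>
    <Thing Include="alpha;beta;gamma"/>
  </ItemGroup>
  <!-- The %(Thing.Identity) reference in Outputs triggers target batching:
       the target executes once per Thing item -->
  <Target Name="ShowEach" Inputs="@(Thing)" Outputs="%(Thing.Identity).done">
    <Message Text="Processing %(Thing.Identity)" Importance="high"/>
  </Target>
</Project>
```

Running this with msbuild should log one "Processing" message per item, since each batch's declared outputs don't yet exist.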