August 2011 - Posts
This post is part II of the Moles Framework series. If you did not read part I (Moles overview and how to use Stubs), you can find it here.
In my previous post I explained how we can use Stubs to fake a class that implements an interface.
We also learned that Stubs use the virtual-method technique (plain OOP) to achieve isolation.
Moles Framework - Moles
But what if our DAL class does not implement an interface…?
Of course we don't use interfaces everywhere in our code, so how can we still gain isolation without an interface? Moles! Or M :)
Remember that after adding Moles assembly we had 2 types of classes: S and M:
The M is for Mole.
The Mole class is the one that does the real trick!
First, it does not use the virtual-method technique like Stubs, but instrumentation, just like code coverage tools do.
When a test that uses a Mole class runs, the Mole object redirects the method's address to a new fake method we implemented.
We don’t need to create a class for that (like a Stub or a Mock object), we use AllInstances object property instead:
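Since the original screenshot is not shown here, this is roughly what the test looked like. This is a sketch: the Manager constructor, the GetInfo signature and the assertion are assumptions taken from the scenario described in the previous post; the MResourceReader type and GetFileContentString delegate come from the generated Moles assembly.

```csharp
using Capito.BranchUtility.DAL.Moles;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ManagerTests
{
    [TestMethod]
    [HostType("Moles")] // tells MSTest to run the test under the Moles host
    public void GetInfo_ReturnsValidResult_WhenFileContentIsMoled()
    {
        // Detour GetFileContent on ALL ResourceReader instances:
        MResourceReader.AllInstances.GetFileContentString =
            (dal, s) => "fake file content";

        var manager = new Manager();          // the real BL object, untouched
        var result = manager.GetInfo("key");  // internally calls GetFileContent

        Assert.IsNotNull(result);
    }
}
```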
This unit test can be a little confusing because we don't really change the dal property of the Manager object; moreover, I even used Manager itself and not its accessor.
Here I used the MResourceReader object and its AllInstances property. It means that when the test is executed, every instance of ResourceReader loaded into memory within the test environment will have its GetFileContent method address redirected to my new one…
· In order to tell the MSTest framework to use Moles and do its trick, we use the HostType attribute.
· In the last unit test above, the GetFileContentString delegate signature accepts 2 parameters. The first one is of type Capito.BranchUtility.DAL.ResourceReader, unlike the real GetFileContent method, which accepts only one string parameter. Because we use the AllInstances property, Moles comes to help again and passes the DAL object as a variable to the delegate (see above: (dal, s) => …) so we can use its properties\methods if we want.
· You can also Mole static methods – in this case you don’t use AllInstances but the M class alone:
MResourceReader.StaticMethodString = ….
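As a sketch, detouring a hypothetical static method StaticMethod(string) on ResourceReader would look like this (the method name is a placeholder; the real delegate name depends on the generated Moles assembly):

```csharp
using Capito.BranchUtility.DAL.Moles;

// No AllInstances here: for static methods the delegate
// hangs directly off the generated M class.
MResourceReader.StaticMethodString = s => "faked static result";
```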
Unit testing is something we all do, even if we don't use a framework for it such as MSTest or NUnit. If we're writing a class library, we usually create another console application project to test the methods and behavior of the new classes.
You can read my previous post about writing TDD and unit tests in Microsoft Visual Studio 2010 here.
One of the goals of unit testing is isolation. When we unit test a method, we usually want to check that it behaves as expected, without any other objects interfering. For example, let's take a very simple scenario:
We want to test GetInfo method (in Manager Class inside our BL) that calls GetFileContent method in DAL library.
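The original code screenshot is not available here, so this is a minimal sketch of the two classes under discussion. The method bodies and the m_dal field initialization are assumptions; only the type and member names come from this post.

```csharp
namespace Capito.BranchUtility.DAL
{
    public interface IResourceReader
    {
        string GetFileContent(string fileName);
    }

    public class ResourceReader : IResourceReader
    {
        // Reads content from a resource file - this is the call
        // that may fail at runtime and that we want to detour.
        public string GetFileContent(string fileName)
        {
            return System.IO.File.ReadAllText(fileName);
        }
    }
}

namespace Capito.BranchUtility.BL
{
    using Capito.BranchUtility.DAL;

    public class Manager
    {
        private IResourceReader m_dal = new ResourceReader();

        // The method we want to unit test in isolation.
        public string GetInfo(string fileName)
        {
            string content = m_dal.GetFileContent(fileName);
            return string.IsNullOrEmpty(content) ? "empty" : content;
        }
    }
}
```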
When we test GetInfo, we want to check that it returns a valid result according to our inputs, but we also want to achieve isolation – meaning, we don't want to depend on the ResourceReader object. We don't want our test to pass or fail due to GetFileContent's result. How can we achieve this? How can we achieve isolation?
Before Moles Framework
In order to detour the DAL.ResourceReader.GetFileContent method, we need to somehow replace the real DAL object (m_dal) inside the BL.Manager object with a fake one whose implementation of GetFileContent is no more than one safe line… By "safe" I mean that it won't do anything "dangerous" such as reading from a resource that might fail. Because our ResourceReader class implements an interface (IResourceReader), we can create a new class in the test project that implements IResourceReader and its GetFileContent. Then we can use the Private Accessor of the Manager object to set m_dal to our fake object.
This technique is called Mocking. Here is an example:
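The original example screenshot is missing here; the fake class and the test looked roughly like this. The FakeResourceReader name and the Manager_Accessor usage are assumptions (the accessor type is generated by Visual Studio's Private Accessor feature).

```csharp
// A hand-written mock: a class in the test project that
// implements IResourceReader with one "safe" line.
public class FakeResourceReader : IResourceReader
{
    public string GetFileContent(string fileName)
    {
        return string.Empty; // never touches a real resource
    }
}

[TestMethod]
public void GetInfo_WithEmptyFileContent()
{
    var manager = new Manager();

    // Replace the private m_dal field through the generated private accessor:
    var accessor = new Manager_Accessor(manager);
    accessor.m_dal = new FakeResourceReader();

    Assert.IsNotNull(manager.GetInfo("any.txt"));
}
```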
One disadvantage of regular Mock objects is that we'll need to create a number of mock objects that implement the GetFileContent method differently. The example above implements a GetFileContent that returns an empty string; we'd need other implementations that return NULL, the file name and so on in order to test GetInfo completely. That's where Moles comes in…
A new pet in town! With the Moles framework we can easily test objects and methods, and detour other methods and objects to achieve isolation.
Moles is a test stub and detour framework that can be downloaded for free here
Moles offers 2 techniques to achieve isolation:
Moles Framework - Stubs
In order to create a fake DAL, let's first add Moles Assembly to the project we want to detour, in our case the DAL:
After compilation, a new assembly is referenced in our test project: Capito.BranchUtility.DAL.Moles.
Check this assembly's namespace and you'll see that it has 2 major kinds of classes, whose names start with S and M. The S is for Stub:
Stubs use the virtual-method technique – that means they use plain OOP to achieve isolation… pretty similar to what we just did with Mock objects.
S classes are used mostly for interfaces. The SIResourceReader is a class that looks like the real one, but all its properties and methods are actually delegates! We can create a new "implementation" for any method we want on that object without creating a new class for it. For example:
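Since the code screenshot is not shown here, a sketch of what the stub usage looked like follows. The Manager_Accessor usage is an assumption; SIResourceReader and the GetFileContentString delegate property come from the generated Moles assembly.

```csharp
using Capito.BranchUtility.DAL.Moles;

// Create the generated stub and give GetFileContent an
// "implementation" via a delegate - no new class needed:
var stub = new SIResourceReader
{
    GetFileContentString = fileName => "stubbed content"
};

// SIResourceReader implements IResourceReader, so it can be
// injected into the Manager through its private accessor:
var accessor = new Manager_Accessor(new Manager());
accessor.m_dal = stub;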
The selected code demonstrates how I created a fake stub object and, with a lambda expression (delegate), gave an implementation to the GetFileContent method.
Note: The GetFileContentString delegate's name ends with String – that's meant to support method overloading.
But what about M? Read the next post about Moles here.
There is an option to configure and run code coverage manually for manual test runs, such as test cases executed with Microsoft Test Manager. It can be done with a few command line actions. Here they are:
· Run VSInstr.exe to replace the DLL\PDB files with the instrumented ones
· Run VSPerfCmd.exe to start the code coverage
· Run VSPerfCmd.exe to stop the code coverage
In order to instrument our DLL(s), let's run the command:
· VSInstr.exe "AssemblyFile.dll" /coverage
After running, you can see that VSInstr.exe created the following files:
Actually, VSInstr renamed the original AssemblyFile.dll to AssemblyFile.dll.orig and created new instrumented AssemblyFile.dll (See its size – much larger…)
Also, it created an instrumented pdb.
Note: You can run the VSInstr command multiple times, once for each assembly (DLL\EXE)
To start logging the coverage, we use the VSPerfCmd command with the start\shutdown arguments:
· VSPerfCmd.exe /start:coverage /output:Test.coverage
That's it! Now we are ready to run our tests. When finished, we need to shutdown the coverage action:
· VSPerfCmd.exe /shutdown
You can now find the coverage file (in our case Test.coverage); just drag it into Visual Studio and it will open in the Code Coverage window.
Important: Usually you can open the Visual Studio Command Prompt and run the commands from there, but if you are on a 64-bit machine, the Test.coverage file will be empty and you might get this error: "Empty results generated: none of the instrumented binary was used"
That's because you need to run the commands from the x64 folder:
· For 32-bit the folder is: <VSDir>\Team Tools\Performance Tools
· For x64 the folder is: <VSDir>\Team Tools\Performance Tools\x64
I've created a small console application that loads all the information from XML files and executes the commands above. It may help if you have more than one assembly to instrument: instead of running the VSInstr action multiple times, you can just update the XML and run the app. Here it is, have fun
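A rough sketch of how such a console app could be structured: it reads assembly paths from an XML file and shells out to VSInstr.exe for each one. The XML schema (assemblies.xml with Assembly elements) and the assumption that VSInstr.exe is on the PATH are illustrative choices, not necessarily how the downloadable tool works.

```csharp
using System;
using System.Diagnostics;
using System.Xml.Linq;

class InstrumentAll
{
    static void Main(string[] args)
    {
        // assemblies.xml: <Assemblies><Assembly>path\to\file.dll</Assembly>...</Assemblies>
        var config = XDocument.Load("assemblies.xml");

        foreach (var asm in config.Descendants("Assembly"))
        {
            var psi = new ProcessStartInfo
            {
                FileName = "VSInstr.exe", // assumed to be on the PATH
                Arguments = "\"" + asm.Value + "\" /coverage",
                UseShellExecute = false
            };

            using (var process = Process.Start(psi))
            {
                process.WaitForExit();
                Console.WriteLine("{0} -> exit code {1}", asm.Value, process.ExitCode);
            }
        }
    }
}
```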
Configuration managers in companies that use TFS need the ability to manipulate and change work Item templates according to the company's methodology.
With the Microsoft Process Template Editor that comes with the Visual Studio Team Foundation Power Tools, we can edit work item templates: add new fields, change the UI and the state machine of work items, and fit them to the company's needs. The full set of customized work items, source control rules and build templates is gathered together and called a process template.
The best practice is to have one process template in the organization, so all development teams use it and obey all the rules inside it. Multiple process templates in medium-sized companies can cause disorder in development. Developers who work across teams, or move from one team to another, will need to learn the new team's methodology as if they had moved to a new company…
In large scale companies, configuration managers usually have more than one process template.
When changes are made in the organization's methodology and we need to update work items, we first need to compare the current work items across Team Projects to view the differences between them. Compare tools like the one that comes with Visual Studio can compare work items because they are XML based, but still cannot give a full comparison overview and explain what the differences are.
Process Template Comparer
Process Template Comparer is a tool that was written to ease the comparison that configuration managers usually do with primitive tools. The tool offers the following features:
1. Compare between Team Projects work items
2. Full comparison overview that shows where the differences are
3. Drill-down option that shows the changes in 4 categories:
Choose the base team project to compare against the rest of the Team Projects.
Ignore option – we can ignore fields that we don't want to include in the comparison.
After clicking compare, a list box with results is filled.
The result view displays all the team projects that were chosen. For each team project it explains what the differences are: how many work items are different, and how many are the same or missing, compared with the base Team Project.
When expanding a team project node, we can view the work items that are missing, different or the same. Clicking the View button displays the result in more detail:
About Next Version
The next version of Process Template comparer will have the following features:
· Comparing by process template – the current version can compare between Team Projects; the next version will have the option to compare Team Projects with Process Templates defined in TFS (i.e. compare a Team Project with MSF for CMMI Process Improvement v5).
· Coloring options for compare results
· Export compare results to Excel
Managers and team leaders that work with TFS use Work Items to track and monitor the team's work and progress.
Work Items are work packages that define a scope of work for a team member; a work item can be a developer task, an investigation task or a test task.
With work items, team leaders can watch and track their team's and the project's progress. Two important fields I'd like to mention are Remaining Work and Completed Work; these fields show the real status of a task's progress.
When a developer finishes his task and wants to check in the changed code, he links the work item that describes the work that was done to the code. After that (or before), he needs to update the 2 fields to report how much time is left to completely finish the task and how many hours were completed.
It's not an easy task for developers… because these fields cannot be updated during check-in, only before or after. That's why I created a custom check-in policy that allows developers to update these fields while checking in. My custom check-in policy is a small window that pops up after pressing the check-in button in the Pending Changes window. The window displays all the work items that are linked to the check-in as tabs; each tab contains the Remaining Work and Completed Work fields. The developer must update his work by changing the fields in all tabs.
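A heavily simplified sketch of how such a policy can be wired up, assuming the standard TFS check-in policy API (PolicyBase in Microsoft.TeamFoundation.VersionControl.Client). The real policy shows a window with a tab per linked work item; here Evaluate() only hints at where the field check would go, and the class/member bodies are illustrative assumptions, not the actual downloadable policy.

```csharp
using System;
using Microsoft.TeamFoundation.VersionControl.Client;

[Serializable]
public class UpdateWorkItemPolicy : PolicyBase
{
    public override string Description
    {
        get { return "Update Remaining Work / Completed Work on check-in"; }
    }

    public override string Type
    {
        get { return "Update Work Item Fields"; }
    }

    public override string TypeDescription
    {
        get { return "Requires developers to update Remaining Work and Completed Work while checking in."; }
    }

    public override string InstallationInstructions
    {
        get { return "Install the policy assembly on each client machine."; }
    }

    public override bool Edit(IPolicyEditArgs args)
    {
        return true; // nothing to configure in this sketch
    }

    public override PolicyFailure[] Evaluate()
    {
        // Real implementation: pop up a window with one tab per work item
        // linked to this check-in and require both fields to be updated.
        foreach (var workItem in PendingCheckin.WorkItems.CheckedWorkItems)
        {
            // ... verify Remaining Work / Completed Work were changed ...
        }
        return new PolicyFailure[0]; // no failures -> check-in allowed
    }
}
```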
After using my Update Work Item check-in policy, team leaders can use and read reports such as Requirement Progress, Burndown and Burn Rate, and Status on All Iterations.
To read more and download