Using Microsoft Fakes to test code that sends email

In the past I used Neptune to test any code that sent email. Neptune is still very useful with Web Tests, but for unit testing I should be able to use Fakes. Below is a typical block of code that uses the SmtpClient class to send an email message.

public static void SendMessage(string subject, string body, string to)
{
   try
   {
      var server = ConfigurationManager.AppSettings["MailServer"];
      var serverPort = ConfigurationManager.AppSettings["MailServerPort"];
      var password = ConfigurationManager.AppSettings["MailServerPassword"];
      var username = ConfigurationManager.AppSettings["MailServerUsername"];
      var fromAddress = ConfigurationManager.AppSettings["MailServerEmail"];

      if (string.IsNullOrEmpty(fromAddress))
      {
         fromAddress = "info@didiwinthelotto.com";
      }

      var msg = new MailMessage(new MailAddress(fromAddress, "Did I Win the Lotto"),
                                new MailAddress(to))
                {
                   IsBodyHtml = true,
                   Priority = MailPriority.High,
                   Subject = subject,
                   Body = body
                };

      var smtp = new SmtpClient(server);
      if (!string.IsNullOrEmpty(password))
      {
         smtp.Credentials = new NetworkCredential(username, password);
      }
      if (!string.IsNullOrEmpty(serverPort))
      {
         int port;
         if (int.TryParse(serverPort, out port))
         {
            smtp.Port = port;
         }
      }
      smtp.Send(msg);
   }
   catch (Exception e)
   {
      var msg = string.Format("Could not send message to {0}\r\n{1}", to, e);
      WriteToEventLog(msg, EventLogEntryType.Error);
      if (e.InnerException != null)
      {
         WriteToEventLog(e.ToString(), EventLogEntryType.Error);
      }
   }
}

Of course I could refactor this code to use a generic interface for sending messages, but I want to test the code as it was originally written. One of the most powerful features of Fakes is the ability to detour concrete classes using Shims. Shims allow you to unit test legacy code in isolation without having to refactor it first. If you are not familiar with Microsoft Fakes, I suggest you first watch these short getting-started videos before you continue: http://tinyurl.com/MSFakesIntro.

If you have an app.config file in your test project with the correct AppSettings values, you don't have to fake the calls to ConfigurationManager.AppSettings. But as a learning opportunity we are going to fake those as well. First we need to create a Fakes assembly for System.dll, where SmtpClient is defined, and for System.configuration.dll, where ConfigurationManager is defined. Expand the References folder of your test project, right-click on System.dll and select Add Fakes Assembly. Now repeat for System.configuration.dll. A Fakes folder will be added to your project with mscorlib.fakes, System.fakes and System.configuration.fakes files in it. By default Fakes attempts to create Shims and Stubs for every type in each assembly, which can lead to warnings and increase your build time. We are going to override this default behavior and only fake the classes we need.
Open the mscorlib.fakes file and replace its contents with the following XML:

<Fakes xmlns="http://schemas.microsoft.com/fakes/2011/" Diagnostic="true">
   <Assembly Name="mscorlib" Version="4.0.0.0" />
   <StubGeneration>
      <Clear />
   </StubGeneration>
   <ShimGeneration>
      <Clear />
   </ShimGeneration>
</Fakes>

Now replace the contents of System.fakes with:

<Fakes xmlns="http://schemas.microsoft.com/fakes/2011/" Diagnostic="true">
   <Assembly Name="System" Version="4.0.0.0"/>
   <StubGeneration>
      <Clear />
   </StubGeneration>
   <ShimGeneration>
      <Clear />
      <Add FullName="System.Net.Mail.SmtpClient"/>
   </ShimGeneration>
</Fakes>

Finally, replace the contents of System.configuration.fakes with the following:

<Fakes xmlns="http://schemas.microsoft.com/fakes/2011/" Diagnostic="true">
   <Assembly Name="System.configuration" Version="4.0.0.0"/>
   <StubGeneration>
      <Clear/>
   </StubGeneration>
   <ShimGeneration>
      <Clear/>
      <Add FullName="System.Configuration.ConfigurationManager"/>
   </ShimGeneration>
</Fakes>

This makes sure Fakes only creates Shims for the two classes we need. When you are using Shims you must create a ShimsContext, so let's begin with the boilerplate code for using Shims:

[TestMethod]
public void Util_SendMessage()
{
   using (ShimsContext.Create())
   {
      // Arrange

      // Act

      // Assert
   }
}

Now we need to use the ShimConfigurationManager class created by Fakes to detour all the calls to AppSettings. AppSettings returns a NameValueCollection, so we are going to do the same, prepopulated with the values needed for our test. Fakes adds a Fakes namespace under the System.Configuration namespace that contains all the generated fake types, and it uses delegates to let the test provide its own implementation of a property or method. For AppSettingsGet we simply provide an anonymous function that returns a NameValueCollection.

The only other call we need to detour is the Send call of SmtpClient. We have two options. We could detour the constructor of the SmtpClient class so that each time one is created during the execution of our test we get to provide our own implementation. That approach requires more work, so instead we are going to take the second option and detour the Send method for all instances of SmtpClient. When Fakes generated the Shim it also implemented an AllInstances property that allows us to provide a custom implementation for all instances of a particular type. The delegate we provide for Send is passed two parameters: the actual instance of the SmtpClient class the call is being made on, and the message to be sent. Within this delegate we can use the Assert class to test any conditions we like. However, if the code under test never calls the Send method, our test could pass in error because none of the Asserts defined in the Send delegate would run. So we set a Boolean flag to true when the Send delegate is called, which lets us verify that we actually sent the email.
Once you put it all together your test will look like this:

[TestMethod]
public void Util_SendMessage()
{
   using (ShimsContext.Create())
   {
      // Arrange
      System.Configuration.Fakes.ShimConfigurationManager.AppSettingsGet =
        () => new NameValueCollection {{ "MailServer", "127.0.0.1" },
                                       { "MailServerEmail", "info@diwtl.com" },
                                       { "MailServerPort", "25" },
                                       { "MailServerPassword", "unittest" },
                                       { "MailServerUsername", "info@diwtl.com" }};

      var msgSent = false;
      System.Net.Mail.Fakes.ShimSmtpClient.AllInstances.SendMailMessage =
        (client, message) =>
        {
           msgSent = true;
           Assert.AreEqual("unitTest@nowhere.com", message.To[0].Address);
        };

      // Act
      Util.SendMessage("Testing", "This is a test", "unitTest@nowhere.com");

      // Assert
      Assert.IsTrue(msgSent);
   }
}

As you can see, Fakes is an extremely flexible and powerful Isolation framework.

How to data bind a Visual Studio 2013 Coded UI Test

Problem: I need to run the same Coded UI Test with different data.

Solution: Data bind your Coded UI Test.

To data bind a test in Visual Studio you just need access to the data source and a couple of attributes on the test. For this example we are going to use a simple CSV file. Add a new text file to your project with a .csv extension and create a comma-delimited file of the desired data. When you save it, first select "Advanced Save Options" from the File menu and choose "Unicode (UTF-8 without signature) – Codepage 65001". Now right-click the item in Solution Explorer and select Properties. In the Properties window change "Copy to Output Directory" to "Copy always".

You will need to add two attributes to your test. The first is the DeploymentItem attribute, which takes a single string argument: the name of the CSV file. The second is the DataSource attribute, where you define the class used to read the data, the table to read from, and how the data should be accessed. For a CSV file the first argument is "Microsoft.VisualStudio.TestTools.DataSource.CSV", which identifies the correct class to use to load the CSV file. Next we let the test know where to find the data with "|DataDirectory|\\data.csv". Then we identify the table to read the data from with "data#csv". Finally we give it an access method, DataAccessMethod.Sequential. The final attribute will look like this:

[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\data.csv", "data#csv", DataAccessMethod.Sequential)]

With the attributes in place you can now use the DataRow property of the TestContext to access the columns in your CSV, for example:

TestContext.DataRow["FirstName"].ToString();

Good luck.
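Putting the pieces together, here is a minimal sketch of what a data-bound Coded UI test could look like. The class name, method name and the commented UIMap call are hypothetical, and it assumes data.csv contains the FirstName column used above:

using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class DataDrivenTests
{
   // Populated by the test framework before each test runs.
   public TestContext TestContext { get; set; }

   [TestMethod]
   [DeploymentItem("data.csv")]
   [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
               "|DataDirectory|\\data.csv", "data#csv",
               DataAccessMethod.Sequential)]
   public void EnterFirstName_FromCsv()
   {
      // The test method is executed once per row in data.csv.
      var firstName = TestContext.DataRow["FirstName"].ToString();

      // Use the value to drive your recorded UI actions, for example:
      // this.UIMap.EnterFirstName(firstName);
   }
}

Because the access method is Sequential, a five-row CSV file produces five iterations of the same test, one per row.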

How to run Microsoft Test Manager Suite in vNext Deployment

While preparing to speak at TechEd Europe I really had to put Release Management and Desired State Configuration (DSC) through their paces. I already blogged about one of my challenges, implementing tokenization in a DSC-based deployment, here. I also wanted to run Coded UI tests as part of my release. To do this I simply ran the attached PowerShell script during my release. The script is very similar to the tool I wrote for the agent-based deployment here. In this post I simply share the attached ps1 file and explain how to add it to your pipeline. For this post I will be using Update 4 of Release Management.

The first thing you have to do is make sure the ps1 file is in the drop location of your team build. There are several ways to do this; I simply added a Configurations folder to my Coded UI Test project in Visual Studio. The most important part is to make sure that "Copy to Output Directory" is set to either "Copy if newer" or "Copy always". This will ensure the file is in the drop location of the build. Now queue a build and examine the drop location. You will need to know the relative path of your ps1 script from the root of the drop location for use in the "Deploy Using PS/DSC" action of your vNext release template.

Now let's add a new vNext component in Release Management. Select the "Builds with application" radio button and simply enter a "\" for the value. On the "Configuration Variables" tab you have to set up all the parameters for the ps1 file. Add each variable as a Standard configuration variable (Collection, TeamProject, PlanId, SuiteId, ConfigId, BuildDirectory, TestEnvironment, Title and TestRunWaitDelay). You can refer to my previous post for where to locate the values for each configuration variable. One of the new features of Update 4 is the ability to define default values for variables that do not change very often. As you can see in the image below, I set default values for Collection, TeamProject, TestEnvironment and TestRunWaitDelay.

With our component created we can move on to the vNext release template. Create a new one and add your components. Now drag the "Deploy Using PS/DSC" action onto the deployment sequence. Select the desired server from the ServerName dropdown. That server must already have a Test Agent configured on it with tcm.exe. The UserName and Password provided are for an account that has permission to establish a Remote PowerShell connection to the target machine. Please note that in Update 3 of Release Management this is NOT the account that will execute your ps1; in Update 3 the ps1 is run by Local System, which means that for this to work Local System must have access to your TFS or you will get an access denied error. Select your test component in the ComponentName dropdown. Now enter the relative path to the ps1 file in the drop location. You can leave the rest of the values blank.

We now have to add the custom configuration variables to the action. Click the plus button or use the down arrow and select "Standard variable". On the newly added row select the configuration variable you want to set and enter the correct value. For the BuildDirectory variable enter "$applicationPath". This variable is provided by Release Management and points to a location that contains the files of our component.

Now you should be able to run a new release and execute automated tests.

RunTests.ps1 (4.94 kb)
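The attached RunTests.ps1 does the actual work. For orientation only, here is a rough sketch of the shape such a script can take. This is not the contents of the attached file; the tcm.exe path, exact arguments and completion handling are assumptions:

param(
   [string]$Collection,
   [string]$TeamProject,
   [int]$PlanId,
   [int]$SuiteId,
   [int]$ConfigId,
   [string]$BuildDirectory,
   [string]$TestEnvironment,
   [string]$Title,
   [int]$TestRunWaitDelay = 10
)

# Assumed location of tcm.exe on a machine with Visual Studio 2013 test tools installed.
$tcm = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\tcm.exe"

# Create a test run from the MTM plan/suite/configuration against the deployed binaries.
& $tcm run /create /title:"$Title" /planid:$PlanId /suiteid:$SuiteId /configid:$ConfigId `
   /collection:"$Collection" /teamproject:"$TeamProject" `
   /builddir:"$BuildDirectory" /testenvironment:"$TestEnvironment"

# Pause so the run has time to start; a real script should poll the run until it
# completes and fail the deployment if any tests fail.
Start-Sleep -Seconds $TestRunWaitDelay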

Speaking at St. Louis Days of .NET 2014

Shortly after returning from speaking at TechEd Europe I will be speaking at St. Louis Days of .NET 2014.

A Practical View of Release Management for Visual Studio 2013
This session provides a detailed technical presentation of the functionality and architecture of Release Management for Visual Studio 2013. We start by presenting an overview of the key concepts, architecture, and configuration of the various components. We discuss the out-of-the-box deployment actions available to compose automations for common deployment scenarios and how to use extensibility to cover the not-so-common scenarios. In more detail, we discuss the mechanism to manage variables across environments, how to include manual intervention, automated tests and rollbacks as part of a deployment, how to trigger a release as part of a build, how to leverage logs to diagnose failed releases, and finally how to add custom actions to the inventory. These are presented through specific scenarios encountered in the field.
Level: Intermediate

Automating a Manual Test using Microsoft Test Manager and Coded UI Test
This session demonstrates taking a manual test from Microsoft Test Manager (MTM) and automating it with Coded UI testing in Visual Studio.
Level: Beginner

Cross-Platform Development with Team Foundation Server 2013
This session provides an overview of how cross-platform teams can utilize the power of Team Foundation Server (TFS). The power of TFS can be leveraged by developers of any language and platform. We will demonstrate how to use and access TFS from Eclipse and deploy to Linux with continuous deployment.
Level: Intermediate

Running Microsoft Test Manager Suite with Release Management

Goal: I want to run a Microsoft Test Manager suite after my bits are deployed by Microsoft Release Management (InRelease).

Solution: There appears to be a bug in the current "MTM Automated Tests Manager" tool that causes it to fail if your build definition name contains spaces. So I took this opportunity to write my own PowerShell script to run as part of my deployment within Release Management. Attached to this post is a PowerShell script that corrects the space issue. The remainder of this post explains how to add a new tool to Release Management and add a testing component to your release template.

Begin by starting Release Management and clicking the Inventory tab, then Tools. Once on the Tools tab under Inventory, click the New button. Feel free to enter any Name and Description you like. For the command simply type "powershell". For the Arguments value enter:

-command ./MyTcmExec.ps1 -Collection __Collection__ -TeamProject __TeamProject__ -PlanId __PlanID__ -SuiteId __SuiteID__ -ConfigId __ConfigID__ -BuildDirectory __BuildDirectory__ -Title __Title__

If you take the time to read the MyTcmExec.ps1 file, you may notice that not all of its parameters are listed in the arguments above. That is because not all of them are required for a successful test run. However, if you need to pass in additional parameters for your situation, simply add them to the Arguments. Each value that begins and ends with a double underscore "__" will be assignable on each stage.

Now that we have our tool in Release Management, let's create a component that uses it. Click the Configure Apps tab, then Components. Once on the Components tab under Configure Apps, click the New button. Feel free to enter any Name and Description you like. On the Source tab simply enter a \ for the "Path to package" value. The Deployment tab is where we select our tool from the Tool dropdown. Unless you would like to adjust the arguments for this component, simply click Save & Close. Note that any changes to the arguments will only affect this component and not the tool definition.

The next step is to add the component to our release template. Click the Release Template tab under Configure Apps. Right-click on Components in the Toolbox and select Add from the context menu. Select the desired component and click the Link button, or simply double-click the component to add it to the toolbox. Now drag and drop the new component to the desired location in your stage's workflow.

Release Management has helper variables such as $(PackageLocation), $(TfsUrlWithCollection) and $(TeamProject), but I discovered that the $(TfsUrlWithCollection) and $(TeamProject) variables do not expand to the expected values: $(TfsUrlWithCollection) includes the team project and $(TeamProject) expands to an empty string. Therefore, I suggest you simply hardcode those values, with the exception of $(PackageLocation), which appears to work as expected.

You will have to harvest the PlanID, ConfigID and SuiteID from Microsoft Test Manager. Use the images below to locate those values. You will also need the URL to the team project collection and the name of the team project. Using the values from Microsoft Test Manager and TFS you can now fill in the values of your component. You can enter any value you like for the Title. For the BuildDirectory enter $(PackageLocation).
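The space issue ultimately comes down to how the argument values reach tcm.exe, and quoting each value is the usual way to handle it. As a minimal sketch of the idea only — this is not the contents of the attached MyTcmExec.ps1, and the tcm.exe path is an assumption — the script can build the tcm.exe call with every value quoted so paths and names that contain spaces arrive as a single argument:

# Quoting each value keeps arguments that contain spaces (such as the build
# directory under the package location) from being split into separate tokens.
$tcm = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\tcm.exe"

& $tcm run /create /title:"$Title" /planid:$PlanId /suiteid:$SuiteId /configid:$ConfigId `
   /collection:"$Collection" /teamproject:"$TeamProject" /builddir:"$BuildDirectory"

Here $Title, $PlanId and the rest stand for the parameters passed in through the tool arguments shown above.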
The final step is to make sure you have automation associated with the test cases in your selected suite, and that the assembly containing that automation is built by the solution being released. Good luck.

MyTcmExec.ps1 (6.17 kb)

I keep getting an error when I run tcm.exe

Problem: I keep getting the following error when I attempt to run tcm.exe from the command line: "A test run must be created with at least one test case."

Solution: Open MTM and check the status of the test case on the Test tab. Make sure the test case state is not Error. If it is, reset it to Active and try your command again.