What are the System variables for PS / DSC deployments in Release Management?

by Donovan Brown 23. January 2015 00:58

By default, when you execute a “Deploy Using PS/DSC” action in Release Management Update 4 (U4), you have access to the following system variables in your PowerShell or DSC scripts:

  • $applicationPathRoot - Application Path Root* 
  • $buildNumber - Build Number (for component in the release) 
  • $buildDefinition - Build Definition (for component) 
  • $tfsUrl - TFS URL (for component) 
  • $teamProject - Team Project (for component) 
  • $tag - Tag (for server which is running the action) 
  • $applicationPath - Application Path (destination path where component is copied) 
  • $environment - Environment (for stage) 
  • $stage - Stage (for release path)
  • $packageLocation - Package Location (for component)
  • $releaseName - Release Name (the name of the release; assigned by Release Management)
  • $releaseId - Release ID (the numeric ID of the release)

*The ApplicationPathRoot variable defaults to an empty string, which instructs Release Management to copy the files to the C:\Windows\DtlDownload folder. If you set the value to a different location, Release Management will copy the files there instead.
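For example, a deployment script can consume these variables directly. This is only a sketch: Release Management injects the values at run time when the script runs under the “Deploy Using PS/DSC” action, and the 'Web' subfolder and wwwroot destination below are illustrative assumptions, not part of Release Management itself.

```powershell
# Sketch: consume the injected system variables in a deploy script.
# These variables only exist when run by the "Deploy Using PS/DSC" action.
Write-Verbose "Deploying build $buildNumber of $buildDefinition to $environment ($stage)"

# $applicationPath is the destination Release Management copied the component to.
# The 'Web' subfolder and IIS wwwroot path are illustrative assumptions.
Copy-Item -Path (Join-Path $applicationPath 'Web') `
          -Destination 'C:\inetpub\wwwroot' -Recurse -Force
```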

Don’t forget that U4 also has configuration variables at three other levels:

  • Global
  • Server
  • Component
If you define the same configuration variable at multiple levels, the value set closest to the Release Template is used. Think of Cascading Style Sheets: the configuration variables cascade the same way.

You can get more information from this MSDN article.

Tags:

Work

There appears to be a bug in the XCopy Deployer tool for Release Management

by Donovan Brown 20. January 2015 05:38

I was looking around on the User Voice page for Release Management and noticed a feature request about using the /exclude flag with the XCopy Deployer tool. The user was requesting that the tool be updated to support that option.

My knee-jerk reaction was: just add a custom tool yourself that does it!  So that is exactly what I am going to show you how to do in this post.  Instead of starting from scratch, I thought I would leverage the work of the Release Management team and harvest the irxcopy.cmd file to see what they had done.  See my previous post on how to harvest the out-of-the-box tool resources if you need help finding the irxcopy.cmd file.

Once I had a copy of the file I noticed that the tool did appear to support the /exclude flag of xcopy.  All the user has to do is provide a file named excludedfileslist.txt in the component's source folder.  However, after some testing and further investigation it appears the irxcopy.cmd file has a bug that overwrites the original contents of the file instead of appending to it as intended.

My original excludedfileslist.txt file is below.

This is what it looks like after it is used with irxcopy.cmd.

The irxcopy.cmd file contained the following commands.

As you can see on line 3, the single greater-than '>' should be a double greater-than '>>'. The single '>' replaces the entire contents of the file with 'irxcopy.cmd' instead of appending 'irxcopy.cmd' to the end.
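The redirection bug is easy to demonstrate. In the sketch below, '*.pdb' is only a stand-in for whatever patterns your real excludedfileslist.txt contains; the redirection syntax is the same in cmd and bash.

```shell
# Illustrative demo of the redirection bug.
echo '*.pdb' > excludedfileslist.txt        # the user's original exclude list

# What irxcopy.cmd effectively does with a single '>': the list is replaced
echo 'irxcopy.cmd' > excludedfileslist.txt

# What it should do with '>>': the entry is appended and the list survives
echo '*.pdb' > excludedfileslist.txt        # restore the original list
echo 'myxcopy.cmd' >> excludedfileslist.txt
cat excludedfileslist.txt
```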

We are going to write our own cmd file named myxcopy.cmd below.

Now all we have to do is add it to the tools inventory and set our component to use our tool instead.

Now our excludedfileslist.txt file looks correct and the xcopy excludes the desired files.

Tags:

Work

How to harvest the out of the box Agent Based tools from Release Management

by Donovan Brown 20. January 2015 00:24

One of the great features of Release Management is the ability to add your own custom tools.  Many times the custom tool I want to add is similar to one of the tools that comes out of the box.  Yet, there is no way from within Release Management to download the resources that make up the out of the box tools.

However, during the execution of a release the Release Management Server copies all of the tools’ resource files to the agent machine. Therefore, all you have to do is create a component that uses the desired tool and search the agent machine for the resource name.  For example, let’s locate the irxcopy.cmd resource of the XCopy Deployer tool.  First, open a Command Prompt as Administrator. We need to run as Administrator so we can search the folders of other users, because the files will be under the AppData folder of the user that the Microsoft Deployment Agent runs as.  From the root of your system drive type:

dir irxcopy.cmd /s

This will search all the directories of your system drive recursively for irxcopy.cmd.  Depending on how many releases you have processed the list might be very long. 

Nevertheless, all the files are the same. Simply open one and modify as needed.

You will have to add your modified file as a new custom tool in Release Management; there is no way to overwrite the out-of-the-box tool resources.

Tags:

Work

How to trigger a rollback based on user input from Release Management

by Donovan Brown 19. January 2015 19:56

Quite often I am asked whether a rollback can be triggered in Release Management when a user rejects a step of the release.

By default, rejecting a step simply stops the release but does not roll back the system to a previous state.  However, you can use the Manual Intervention action to achieve the desired effect.

To trigger a rollback based on user input simply add a Manual Intervention action at the bottom of your stage’s Deployment Sequence.

If the Recipient rejects this Manual Intervention all the identified Rollback sequences will be executed.

Tags:

Work

How to get my reports in Chef Server working on Ubuntu

by Donovan Brown 18. January 2015 17:54

Problem:

I installed the Chef Server but my Reports are not working.

Solution:

The reporting piece of Chef is actually installed separately from the server.  To get reports to work simply issue these commands.

sudo chef-server-ctl install opscode-reporting

sudo opscode-reporting-ctl reconfigure

Tags:

Work

How to install Chef Server on an Ubuntu server in Hyper-V or Azure

by Donovan Brown 17. January 2015 23:56

Today we are going to build a Chef Server on an Ubuntu server in Hyper-V or Azure.

Hyper-V

Step one is to follow the post below and return when you are done.  Set the name of the server in step 19 to "Chef".

Installing Ubuntu in Hyper-V

Azure

If you are using Azure create an Ubuntu 14.04 LTS server.

You will also have to install an SSH client if you are using Azure. I used PuTTY.

Installation

Now that you have an Ubuntu server either connect using Hyper-V or PuTTY and issue the following commands.

Download Chef Server
wget https://web-dl.packagecloud.io/chef/stable/packages/ubuntu/trusty/chef-server-core_12.0.1-1_amd64.deb

Install the package
sudo dpkg -i chef-server-core_12.0.1-1_amd64.deb

Now we must configure the server.

sudo chef-server-ctl reconfigure
sudo chef-server-ctl install opscode-manage
sudo opscode-manage-ctl reconfigure

Now navigate to http://[chefServerDNS]/ from your client machine and you should see the following screen. Note: you might get a certificate error; just select “Continue to this website”.

Tags:

Work

How to resolve Windows machine names from Ubuntu server

by Donovan Brown 17. January 2015 17:54

Problem:

I can’t resolve my Windows machine names on my Ubuntu Linux box.

Solution:

Install libnss-winbind and then update your /etc/nsswitch.conf file.
To install libnss-winbind run the following command: 
sudo apt-get install libnss-winbind

When asked “Do you want to continue? [Y/n]” type Y and press enter.

Now we have to update your /etc/nsswitch.conf file. We are going to use vim.

Type sudo vim /etc/nsswitch.conf
Type i (to enter insert mode)
Add “wins” to the “hosts:       files dns” line so that it is “hosts:      files dns wins”
Press Esc (to exit insert mode)
Type :wq (to save and quit)
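If you prefer not to edit the file interactively, the same change can be scripted. This is a sketch run against a local sample copy; on the real system you would run sed with sudo against /etc/nsswitch.conf (and the exact whitespace on your hosts: line may differ).

```shell
# Non-interactive alternative to the vim steps above (a sketch).
# On the real system: sudo sed -i '/^hosts:/ s/$/ wins/' /etc/nsswitch.conf
printf 'hosts:          files dns\n' > nsswitch.conf.sample

# Append ' wins' to the end of the hosts: line only
sed -i '/^hosts:/ s/$/ wins/' nsswitch.conf.sample
grep '^hosts:' nsswitch.conf.sample
```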

Now ping your Windows machines by name.

Tags:

Work

Is DSC an upgrade to Agent-based pipelines?

by Donovan Brown 15. January 2015 19:02

There have been a lot of questions from customers and on internal distribution lists lately about Release Management, agent-based vs. vNext, and how best to deploy to PaaS.  So instead of answering the same questions over and over again, I decided to write this post and just point people here.

There are a couple of points I want to clarify.  First, nothing stops you from running a DSC script via the Microsoft Deployment Agent: the agent can run any PowerShell, and DSC is just a PowerShell extension, so it can be executed through the agent.  Second, DSC is not an “agentless” solution.

From a Release Management perspective, some people describe Desired State Configuration (DSC) as an agentless deployment. That is not a true statement.  The Local Configuration Manager (LCM) running on the target machines is the agent.  The nice thing is that if you are targeting Windows Server 2012 R2 or Windows 8.1, the LCM is already installed and ready to go. But don’t kid yourself: it is an agent.  If you are targeting older versions of Windows, you have to install Windows Management Framework 4.0 before you can use DSC. Therefore, agent-based and vNext pipelines (I prefer calling the latter DSC pipelines and will for the rest of this post) both require installing an “agent” on the target machine.

Many users of Release Management see DSC as an “upgrade” to, or replacement for, the agent-based solution.  I could not disagree more.  There are situations DSC simply does not handle well and others it is great for.  If you really look at DSC through its Get/Set/Test resource model, that model limits its use: a resource that is hard-coded to return false from its Test method has no business being a resource, so running tests via DSC makes no sense.

The same can be said of the agent-based solution: there are some things it does great and others where it does not. Many people are running to DSC because it is new and shiny, but it is not a panacea.  Don’t get me wrong, I am a big fan of DSC and could not be more excited about getting it running on Linux, but it is simply a tool in my toolbox.

I don’t see the DevOps world as a one or the other situation. DevOps is about People, Process and Products and getting them to work and communicate better while automating your pipeline with whatever makes sense for your desired result.  If it is DSC, great.  If it is PowerShell, Chef, Docker or Puppet, fine. Or maybe it is a combination of all of the above.

The goal is a way to easily track, manage and automate the promotion of our code from one environment to another.

The agent-based solution is alive and well.  The goal of deploying to PaaS, for example, can be achieved today using an agent-based solution that scales much better than the DSC alternative. Let me explain why.  In a previous blog post I describe a technique of using a DSC pipeline to deploy to a PaaS website.  In that post I deploy to a single stage using an IaaS VM as a proxy to execute my DSC.  Release Management today does not allow you to have the same server in multiple environments for a DSC pipeline, which means I would have to stand up a proxy for each stage of my pipeline.  Compare this to the agent-based pipeline, where the same machine can appear in multiple environments, allowing you to reuse a single proxy machine to target all your stages.

I don’t feel DSC is the answer to all our problems.  I feel very confident that it is not.  We are not in a DSC or bust situation.

Solve your problem with the best tools you have which might not necessarily be the newest tool you have.

Tags:

Work

So sick of Microsoft.WebApplication.targets was not found build errors!

by Donovan Brown 15. January 2015 02:40

Problem:

I was recently connecting an on premises build server to my Visual Studio Online account (that is crazy easy by the way) but my first build failed with the following error.

The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.

Solution: 

This is not the first time I have come across this error; just replace “v11.0” with whatever version you like and you have probably been there too.  In the past I would copy files onto my build machine or install countless SDK versions trying to make the build machine happy.

Not this time: I wanted a much cleaner solution. I connected to the build machine and found the desired file in a “v12.0” folder instead of the “v11.0” folder being referenced.   So how can I make the build use the correct version?

It turns out you can simply pass the Visual Studio version on the Process tab of your build definition.  Under the Advanced section, just add the following text to the MSBuild Arguments:

/p:VisualStudioVersion=12.0
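Alternatively, the default can live in the project file itself. This is a sketch: the Condition attribute ensures that a value passed explicitly on the command line still wins over the default baked into the project.

```xml
<!-- In the .csproj, above the <Import> of Microsoft.WebApplication.targets -->
<PropertyGroup>
  <VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">12.0</VisualStudioVersion>
</PropertyGroup>
```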

Problem solved and I don’t feel all dirty after. 

Tags:

Work

Achieving Continuous Delivery with VSO and RMO (it is easier than you think)

by Donovan Brown 14. January 2015 10:26

Release Management Online (RMO) monitors the build definition configured on a release template. When the build completes, a release is automatically kicked off. This is a great step forward from the past, where we had to use custom build templates or resort to the CLI or REST API to trigger a release.  This works regardless of whether you are using a hosted or on-premises build controller.

It has never been easier to achieve continuous delivery than it is today with the combination of Visual Studio Online (VSO) and RMO.  Simply check a project into VSO and add a new build definition.  Then, in Release Management, create a new release template associated with that build and check the box "Can Trigger a Release from a Build?" That is it! The next completed build will start a release.

One thing I noticed was missing, and the product team appears aware of, is the ability to set a target stage.  As it sits today, the target stage will always be the final stage of your release path.  That is a relatively small compromise for how easy they have made it to trigger a release from a build in VSO.

Tags:

Work

About the author

My name is Donovan Brown and I am a Technology Specialist for DevTools at Microsoft with a background in application development.  I also run DLBRacing.com, one of the nation’s fastest-growing online registration sites for motorsports events.  When I am not writing software I race cars for fun; DLBRacing.com has given me the opportunity to combine my two passions, writing software and racing cars.
