
SharePoint and TFS: Automating Deployment to Separate Domains with Team Foundation Server 2010 Team Build

Here’s the situation: you have a Team Build server on your main domain, but you have separate development environments that aren’t on the same domain as the Team Build server (for whatever reason). You want to be able to use the automated drop functionality built into the Team Build workflow, because it’s robust and heavily integrated into the workflow.

The Problem

You want to use the drop functionality, but, being on different domains, the credentials of the build service are less than useless on the development environment. There are two ways around this:

The Easy Way

If your domain administrator will allow you to have a one-way external trust between your environment and the build server’s environment, you can completely ignore this post. You’ll be able to give the identity of the build service access to a drop folder on your deployment environment.

The Other Way

If you’re not fortunate enough to have a willing domain administrator who will set up external trusts on your corporate domain for every environment that needs one, you’ll need another way around this problem. The solution is actually rather simple: the WScript.Network COM object provides a MapNetworkDrive method that takes a set of credentials and maps a drive on the local file system.
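A minimal sketch of that call from PowerShell (the drive letter, share path, and credentials here are placeholders, not values from any real build environment):

```powershell
# Create the WScript.Network COM object and map a drive with explicit credentials.
# Signature: MapNetworkDrive(localName, remoteName, updateProfile, user, password)
$network = New-Object -ComObject WScript.Network
$network.MapNetworkDrive("X:", "\\deployserver\drop", $false, "DEPLOYDOMAIN\builduser", "P@ssw0rd")
```

Because the credentials are supplied explicitly, the identity of the build service never needs to be recognized by the deployment environment’s domain.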

The Hard Way

The hardest way is to complete the steps described below in code and then create a workflow step based on them. This requires access to the build server and rights to add a new class library to the build set. This is not recommended unless you are the build administrator for the enterprise.

The Solution

You’ll need a mapping script and an unmapping script to add and remove the mapped drive on the build server at the beginning and end of the build workflow. This is actually relatively straightforward, and I’ll walk you through it. You also need those scripts to execute as part of the workflow. PowerShell works much better than the Command Prompt here because it’s object-oriented and can create COM objects directly.

Modifying the Build Process

In the root of your team project in the Source Control Explorer, you’ll find a folder named "BuildProcessTemplates". You don’t need to worry about this right now, just make sure that it’s there.

The first thing you need is a build definition. In Team Explorer, right-click the Builds node of the target Team Project and select New Build Definition…. Give the definition a name and a workspace, then go to the Process pane. It will show which process template you’re using (usually the Default Template). Select New… and be sure to choose "Copy an existing XAML file" so the original template stays intact in case something goes terribly wrong. Give the file a name and click OK. The file will be created, added to source control, and opened in Visual Studio.

Add arguments to process and Metadata

Before we can add the Mapping or Unmapping steps, we need to add a couple of arguments to the template. Select the Arguments tab at the bottom of the workflow designer and click on Create Argument at the very bottom of the list of arguments. Here you’ll add the arguments MapScript and UnmapScript. These will hold the TFS locations of the mapping and unmapping scripts respectively.

In order to change the arguments at build time, we need to expose them through the build definition window. To do this, select the Metadata argument and click the button in its Default Value field. In the Process Parameter Metadata Editor, click Add once for each argument you want to expose. The parameter name must match the name of the argument. Display Name is used for readability, while Category controls placement on the form. Description provides information about what should go in the field. Don’t worry about Editor, Required, or "View this parameter when".

Add Mapping Process

Next, navigate to the "Sequence" in the outermost "Try/Catch" inside the Run on Agent sequence. At the top of the sequence, add an If statement from the toolbox. Give it a name like Check for Mapping Script. In the Condition field, type (Not String.IsNullOrWhiteSpace(MapScript)) And (Not String.IsNullOrWhiteSpace(UnmapScript)).

In the Then field, drop in a new Sequence and give it the name Map Drive.

Then, drop in a ConvertWorkspaceItem element and give it the name FindMapScript. This item converts a server address to a file system address on the Team Build server. For Input, type MapScript. For output, type MapScriptFile. Workspace should be Workspace.

Now we need to change the scope of the MapScriptFile variable that you just created. Click on the Variables tab and select MapScriptFile from the list. If it’s not there, click on Create Variable at the bottom and create it as type String. Change the Scope value to Map Drive and close the Variables tab.

The next step is to add an InvokeProcess task. Give it the name MapRemoteDrive. The two important properties to change are FileName, which should be "PowerShell.exe" (with quotes), and Arguments, which should be String.Format("& '{0}'", MapScriptFile). If your map script takes any arguments of its own, append them here.
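For example, if you pass extra values into the script through additional process arguments (RemoteShare and RemoteUser here are hypothetical names, created the same way as MapScript), the Arguments expression simply grows:

```
String.Format("& '{0}' '{1}' '{2}'", MapScriptFile, RemoteShare, RemoteUser)
```

Each value lands in the script as a positional parameter, so the script’s param() block should declare them in the same order.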

Finally, drag over WriteBuildMessage and WriteBuildError tasks and place them in the Standard Output and Error Output handlers respectively. You’ll need to change the Message property to stdOutput or errOutput accordingly. This allows any status messages returned by your script to be included in the build log.

Add Unmapping Process

Adding the process to unmap the drive is identical to adding the mapping step, so I won’t repeat it. What is important is where the step goes: it must go in the Finally block of the "Try/Catch". This ensures that, no matter what happens with the build, the drive gets unmapped and is ready for the next build.

Creating the Scripts

Once you have the process steps in place, it’s time to create the scripts you’ll be using. The map script is a relatively simple script that creates a COM object and uses it to map a network drive based on a username and password. The unmap script does the exact opposite, creating the COM object and unmapping the drive. I use several additional process arguments to pass into the scripts, but you don’t have to if you don’t want to. These arguments are created in the exact same way as the MapScript and UnmapScript arguments. They’re passed to the scripts as shown in the InvokeProcess task step.

These are my map and unmap scripts.
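The original listings were embedded on the page; here is a minimal sketch of what such a pair of scripts looks like, assuming the drive letter, share path, and credentials arrive as positional script parameters (the file and parameter names below are my own, not the originals):

```powershell
# MapDrive.ps1 -- map a drive letter to a remote share using explicit credentials.
param(
    [string]$DriveLetter,   # e.g. "X:"
    [string]$SharePath,     # e.g. "\\deployserver\drop"
    [string]$UserName,      # e.g. "DEPLOYDOMAIN\builduser"
    [string]$Password
)

$network = New-Object -ComObject WScript.Network
$network.MapNetworkDrive($DriveLetter, $SharePath, $false, $UserName, $Password)
Write-Output "Mapped $DriveLetter to $SharePath"
```

```powershell
# UnmapDrive.ps1 -- remove a previously mapped drive.
param([string]$DriveLetter)

$network = New-Object -ComObject WScript.Network
# The second argument forces removal even if the drive is in use.
$network.RemoveNetworkDrive($DriveLetter, $true)
Write-Output "Unmapped $DriveLetter"
```

The Write-Output lines are what the WriteBuildMessage handler on the InvokeProcess task picks up, so the mapping status shows up in the build log.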

Check Scripts into Source Control

Once the scripts are created, you need to check them into source control. The location needs to be a part of the Workspace you’re building in, basically the Team Project. I like to create a separate folder titled DeploymentFiles and to place the files in there. The location is important because you’ll need it for the next step: referencing the files in the build process definition.

Add arguments to Metadata and Test

Now that your scripts and build process template are checked in to source control, you’re ready to update your build definition to match the template. Make sure the template is selected from the drop down and click Refresh. This will reload the template from source control and populate the parameters list.

From here, expand the Deployment section (or whatever you named your metadata category) and fill in the necessary parameters to run the map and unmap scripts.

Now you’re ready to save the build definition and queue a build. If all goes well, your project will be built and deployed to the folder you specified.


Andrew Schwenker

Andrew Schwenker is a Sr. Technical Consultant within Perficient’s Microsoft National Business Unit East's SharePoint practice. Andrew has nearly 2 years of experience in consulting and has participated in projects that have touched nearly every aspect of SharePoint 2010. Andrew earned his Bachelor’s degree in Computer Science as well as Master’s degrees in Computer Science and Information Systems from Indiana University. He’s interested in creating winning solutions to generate business innovation using SharePoint. Prior to starting at Perficient, Andrew completed internships with General Electric, ExactTarget, and Great American Financial Resources. During his studies, he actively participated in Alpha Phi Omega National Service Fraternity and competed in the first annual Cluster Challenge at SC07 in Reno, NV. Andrew was a part of the dynamic PointBridge team that was acquired by Perficient.
