Web Deployment Builds in TFS 2010 - Part 2

by Aaron 29. December 2010 23:00

I've been patting myself on the back for a few days now because of our QA builds.  The next step in creating our automated builds is creating the release builds.

For our release builds, we don't want to automatically deploy the code.  Sounds simple, right?  I thought so too, but I found that the build doesn't apply the web.config transformations.  I accept the challenge!

After a bunch of trial and error, and staring at the targets files "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" and "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets", I finally found the answer in the former targets file:

  <!--
 ============================================================
 _CopyWebApplication

 This target will copy the build outputs along with the
 content files into a _PublishedWebsites folder.
 
 This Task is only necessary when $(OutDir) has been redirected
 to a folder other than ~\bin such as is the case with Team Build.
 
  The original _CopyWebApplication is now a Legacy, you can still use it by setting $(UseWPP_CopyWebApplication) to true.
  By default, it now change to use _WPPCopyWebApplication target in Microsoft.Web.Publish.targets.   It allow to leverage the web.config trsnaformation.
 ============================================================
 -->

So, I set the UseWPP_CopyWebApplication to True in the MSBuild arguments.  I then got this error:

C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets (868): These two Properties are not compatable: $(UseWPP_CopyWebApplication) and $(PipelineDependsOnBuild) both are True. Please correct the problem by either set $(Disable_CopyWebApplication) to true or set $(PipelineDependsOnBuild) to false to break the circular build dependency. Detail: $(UseWPP_CopyWebApplication) make the publsih pipeline (WPP) to be Part of the build and $(PipelineDependsOnBuild) make the WPP depend on build thus cause the build circular build dependency.

I took the advice of the error message and set PipelineDependsOnBuild to False.

My MSBuild Arguments for a simple release build that applies the web.config transformations are now:

  • /p:UseWPP_CopyWebApplication=True
  • /p:PipelineDependsOnBuild=False
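
If you want to sanity-check these two switches before touching the build definition, you can run the same properties through MSBuild locally.  This is just a sketch with placeholder paths; redirecting OutDir is what makes the _CopyWebApplication behavior kick in, the same way Team Build does:

  msbuild MySolution.sln /p:Configuration=Release /p:OutDir=C:\Drops\MyApp\ ^
      /p:UseWPP_CopyWebApplication=True /p:PipelineDependsOnBuild=False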

I can then get the web application from the drop folder under the _PublishedWebsites folder.


Web Deployment Builds in TFS 2010

by Aaron 26. December 2010 02:01

This has been a great Christmas!  Not because I finally got the pony I've always wanted, and certainly not because I finally got my two front teeth.  It's because I got to see the new Doctor Who Christmas special, and I now know that it's only a few more months until the new season starts.  It feels like it's been forever since the last season ended, and based on the Christmas special, it seems like Matt Smith might really be starting to own the role.

Oh yeah!  It's also because I finally started working with the TFS 2010 build service!  I also have a couple of working builds that do automated deployments too.  Fantastic!  But it wasn't easy for me.

Background

Now, I know that I should have done this a long time ago.  Anytime you go through the steps of a completely repeatable process more than once or twice, you should find a way to automate it.  I've been busy, and I've also been analyzing my process to see what I do for these builds.

My other issue is more personal.  I've been feeling like a constant failure.  My clients want quality work done.  But, they want it done right now.  I put a lot of pressure on myself to get my work done, but as a result, my quality has gone down.  When I start taking the time to put the pieces in place to increase the quality, my productivity goes down.  I'm constantly trying to find that balance, and I feel like I'm failing miserably.

Now I have several projects in flight that I'm leading the efforts on.  I have a junior developer that I'm trying to mentor, assigning him work that will challenge him and help him get better.  I need to review his work from time to time to help pick out areas for improvement.  I have multiple developers on a couple of the projects and more than that on another one.  They're constantly making changes that need to be deployed for QA.  I have a QA resource who asks me for deployments on a regular basis, and my response time to his requests isn't fast enough.  I have client requests that I need to turn around answers for quickly.  I have-  ENOUGH!

I have a lot of excuses.  I don't have enough solutions.  At least, I haven't done enough to solve my problems.  That's where automated builds come in.  There are lots of benefits, but I won't get into all of those here; you can find them elsewhere.  This is where I tell you about my experience in automating the builds.

Getting Started

We're using TFS 2010.  We have multiple project collections.  Natively, TFS 2010 only allows you to set up a single build service per machine, and each build service can support a single project collection.  That's great if you have multiple servers that you can set up as build machines, and/or you're a large enough organization that you really need multiple servers like that.  We don't fit either of those categories, so I found a hack (courtesy of Jim Lamb) where I can support multiple build services on a single box.  I followed the instructions here to actually implement the hack because there were pictures (courtesy of Mark Nichols).

Once I got the second build service running, I created a new controller and an agent.

Web.Config Transformation

I had already performed this step a week or two prior, but I'm going to talk about it as if this was the next thing I did.

I needed to set up the web.config files for transformation.  I only needed the debug and release transformations, but you can set up as many as you need (Integration/Debug, QA, UAT, Production/Release, etc.).  I have really simple transformations where I just replace some key/value settings in the appSettings section and a couple of connection strings.  Setting these up for your web application (web site?) is a necessary step if you want the deployment to be completely hands off.
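
For reference, here's roughly the shape of one of my simple transforms.  Think of it as a sketch of a Web.Release.config; the key and connection string names below are made-up examples, but the xdt:Transform and xdt:Locator attributes are the standard transformation syntax:

  <?xml version="1.0"?>
  <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
    <appSettings>
      <!-- Replace the value of a key that already exists in web.config -->
      <add key="Environment" value="Release"
           xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
    </appSettings>
    <connectionStrings>
      <!-- Swap in the release connection string -->
      <add name="MainDb"
           connectionString="Data Source=ProdSql;Initial Catalog=MyApp;Integrated Security=True"
           xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
    </connectionStrings>
  </configuration>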

Need more information on web.config transformations?  Go here.

Creating the Build

The next step was relatively easy.  I created a new build.  I pointed the build to my solution file.  I specified some things like whether or not to run the unit tests.  I even specified where to copy the build assets to.  That part was a "gotcha".

There were two reasons it was a "gotcha".  The first reason is that it doesn't copy over what you think it would copy over.  I don't remember the details, but it essentially copies over the binary assemblies, then a package folder with the web application in it as a subfolder.  The root folder itself is named after the build.  Not what I was hoping for.  So I decided to tell it not to copy the files to the drop folder.  I unchecked the box and tried again.  That was the second reason it was a "gotcha".

Apparently, the standard build template requires it even though it's presented as optional.  There's a second configuration entry for whether or not to copy the build results to the drop folder; that one is for the actual build assets.  But the process still needs a place to put the log files for the build, so I ended up specifying a generic location for the builds to go.  Now on to the web application deployment.

Web Deployment

For the sake of brevity, I'm going to cut out a significant amount of the details of all of the things that happened to get me to a successful build.  Let's just say, 30-some failed builds to get me to this point.  Also, keep in mind that the server I deploy to is internal only.  It's not public-facing, nor does it have any hope of ever being public.

To set up the web server:

  1. Enable WMSvc on the web server.
  2. Install the Web Deployment Tool on the web server.  You may have to update the installation after installing it the first time to enable the "Management Service Delegation" in IIS.
  3. Make sure the Web Management Service (WMSvc.exe) is running.
  4. Make sure the Web Deployment Agent Service (MsDepSvc.exe) is running.
  5. Set up the Management Service to use just Windows Credentials if you'll be deploying with a Windows account, or to also allow IIS Manager Credentials if you're using an IIS Manager User like I am.
  6. Allow at least an IP address, maybe even a range, in the IIS Management Service configuration.  I allowed all IP addresses on a specific subnet.
  7. In Management Service Delegation, add a rule for "Deploy Applications with Content".  I also had to add a rule with "createApp" and "setAcl" because I used an IIS Manager User.
  8. Make sure that the account the Web Management Service is running as has full access to the file system for the web site that is being deployed.
  9. In the web site, under "IIS Manager Permissions", allow the user you're using for deployment.  In my case, I have my "webdeployment" IIS Manager User that I allowed on the individual web sites.

I think that's it for the web server.  I may have forgotten something, but one of the places I got good troubleshooting info was the 4th post down on this page.
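
A quick way to double-check steps 3 and 4 is to query both services from a command prompt on the web server.  These are the service names as I understand them, so adjust if yours differ:

  sc query WMSvc
  sc query MsDepSvc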

Now for the build.  There are a series of arguments that I added to the "MSBuild Arguments" in the advanced section of the process area of the build definition.  I started with the info from this page on Johan Danforth's blog, and updated them to their final settings as follows:

  1. /p:DeployOnBuild=True
  2. /p:DeployTarget=MsDeployPublish
  3. /p:MSDeployPublishMethod=WMSVC
    I'm deploying to a remote server using Web Management Service.
  4. /p:CreatePackageOnPublish=True
  5. /p:DeployIisAppPath=WebSiteNameinIIS
  6. /p:MsDeployServiceUrl=WebServerName
  7. /p:AllowUntrustedCertificate=True
    Web Management Service created an SSL certificate but prefixed the server name with "wmsvc-", so the certificate wasn't valid.  This argument tells MSDeploy to ignore it.  Safe for internal use, but probably risky for a public-facing server.
  8. /p:UserName=DavidHasselhoff
    This is my IIS Manager User, but even when I was using Windows credentials, I had to specify the domain\username because the credentials didn't carry forward to the server.
  9. /p:Password=ForPresident
    Again, this is for my IIS Manager User, but even when using the Windows credentials, I had to specify the password because the credentials didn't carry forward to the server.
  10. /p:SkipExtraFilesOnServer=True
    This prevents the deployment process from deleting everything first.  If you have extra files that the site instance uses that aren't part of the deployment, they'll get deleted unless you specify this argument.

These arguments work for me and my scenario.  This may not be the best configuration for deployments to a production server, but it works for us on our internal server.
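
For what it's worth, here's the whole set pulled together the way I'd test it locally before pasting the arguments into the build definition.  The project, site, server, and credential values are placeholders:

  msbuild MyWebApp.csproj /p:Configuration=Release /p:DeployOnBuild=True ^
      /p:DeployTarget=MsDeployPublish /p:MSDeployPublishMethod=WMSVC ^
      /p:CreatePackageOnPublish=True /p:DeployIisAppPath=WebSiteNameInIIS ^
      /p:MsDeployServiceUrl=WebServerName /p:AllowUntrustedCertificate=True ^
      /p:UserName=DavidHasselhoff /p:Password=ForPresident /p:SkipExtraFilesOnServer=True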

Also, you should note that the MSBuild arguments for MSDeploy are not well documented, if they're documented at all.  I had to dig through the targets file located (for me) at "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets".

Now that I'm automating our builds and have a template to work from, I can move on to other areas that are in desperate need of attention.


Migration to TFS 2010

by Aaron 17. May 2010 09:10

We recently migrated from Team Foundation Server 2008 to Team Foundation Server 2010.  Here's the scenario that we had:

  • New Active Directory domain
    • The domain was built new and not migrated from the old domain
    • As a result, the users were brand new user accounts with no SID history
  • New server hosting TFS
  • The TFS data was now going to be stored on a separate SQL Server instance instead of having everything contained on a single server
    • The old instance of TFS was using SQL Server 2005
    • The new instance of TFS is using SQL Server 2008 with an instance of SQL Server Reporting Services
  • We weren't concerned about retaining version history, work item history, or closed work items
  • We didn't want to keep all of the projects, just some of them
  • We wanted to move from Scrum for Team Systems 2.2 to Scrum for Team Systems 3.0
  • We weren't concerned about the data in SharePoint
  • We weren't concerned about the reports in SQL Reporting Services

I read through this after writing it and decided that some people may be interested in the "short" version.  So, to save you from having to read my monologue, I've created a bulleted summary.

To summarize what we did to accomplish our migration:

  • Install the new instance of TFS 2010
  • Wire up the new instance to a separate SQL Server 2008 instance with SQL Server Reporting Services
  • Back up the databases from the TFS 2008 instance, and export the SQL Server Reporting Services encryption key
  • Restore the databases on the new SQL Server instance
  • Run Tfsconfig import /collectionName:UpgradedCollection /sqlinstance:<SQL Instance Name> where UpgradedCollection is a project collection that doesn't yet exist, and is the name you want to use for your new project collection
  • You may have to configure SharePoint and set up the site in the TFS configuration manager for web access.
  • You'll have to configure SQL Server Reporting Services with the encryption key you exported from the old instance
  • If you want to reorganize the projects into different project collections, the TFS Integration Platform is supposed to be very good, if you can get it to work properly.  Take the time to understand it, though; that might help you succeed in using it.  I didn't have the time.

And now...the LONG version...

The first thing that we did was stand up the new instance of TFS 2010.  I should clarify that I didn't do that part.  My boss did that part.  I still had a role in the installation though.  I nagged until he finished it, and the whole time I had to listen to the excuses and problems he was having.  It's tough being me.

Once he got that part done, it was up to me to figure out how to get the source code from the old TFS 2008 server into the shiny new TFS 2010 server.  I looked into the different methods that were available to me for migrating the data.  Probably the best information that I had available to me for this was the TFS installation documentation.  It's a bit extensive and makes reference to a lot of OTHER pieces of documentation, but it was helpful.

At first, I looked into using the TFS Integration Platform to migrate the code from TFS 2008 to TFS 2010.  After trying several times, I discovered that I wasn't going to be able to connect to both the database in the old domain and the database in the new domain at the same time.  I began looking into other options.

I settled on putting the data in the new TFS installation and upgrading it to the 2010 format.  I did some searching around and found it's not difficult to do.  The TFS documentation links to a bunch of different places to tell you how to do it.  In a nutshell, back up all of the databases on the TFS 2008 database server.  You'll recognize which ones they are by the "tfs" prefix on each one.  The databases will probably also be named after their corresponding project.  There are SharePoint databases as well; if you have SharePoint data, you'll need those too.  And finally, if you want to keep the reports from SQL Server Reporting Services, those databases are fairly easy to recognize by name.

The easy way to get all of the databases is to create an on-demand maintenance job that lets you choose all of the databases for a complete backup.  If you're backing up the Reporting Services data to restore in the new TFS instance (we didn't), you'll also need to back up the encryption key in the Reporting Services configuration console.
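
If you'd rather script the backups than click through a maintenance plan, a plain BACKUP DATABASE statement per database works just as well, and the Reporting Services key can be exported from the command line with rskeymgmt instead of the configuration console.  This is only a sketch for one database; the server, database, path, and password values are examples:

  sqlcmd -S OldTfsSqlServer -E -Q "BACKUP DATABASE TfsVersionControl TO DISK = 'D:\Backups\TfsVersionControl.bak' WITH INIT"
  rskeymgmt -e -f D:\Backups\rskey.snk -p SomePassword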

I restored the databases to the new database server.  I even restored the SharePoint databases, just in case.  I ignored the Reporting Services databases.

After the databases were restored, I went through the process of upgrading those databases.  I referenced a post on Brian Harry's blog for migrating the data.  The tfsconfig import command changed a little bit since that article.  It should be something like:

Tfsconfig import /collectionName:UpgradedCollection
/sqlinstance:<SQL Instance Name>

The connectionString is no longer a valid option.  Also, the "UpgradedCollection" named by the "collectionName" parameter must be a collection that doesn't exist.  I tried doing this with an empty project collection that I created, but of course it failed.  It failed because you're creating the project collection with the command.  So...don't do it.

You should note that the project collection is made up of all of the projects that you had in the old TFS instance.  I'm not sure if you can specify individual projects to put into the project collection.

Because we only wanted some of the projects, I again tried to use the TFS Integration Platform.  Again, it was without success.  Actually, it told me success, but it lied.  The only way it could lie more is if it told me how cool and good-looking I am.  I'm on to you TFS Integration Platform...

So as a last resort, I decided to get the code out of the old project and add it to the new one.  That means you have to get the latest version in Visual Studio, remove the source control bindings, switch to the new project, and then add all of your files as new files.
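
If you'd rather do that part from the command line than from Visual Studio, the rough tf.exe equivalent looks like this.  The paths and project names are placeholders, and you'd still want to clean out the old binding files (*.vssscc, *.vspscc) before adding:

  rem In a workspace mapped to the old server:
  tf get $/OldProject /recursive
  rem Then, in a workspace mapped to the new collection:
  tf add C:\src\NewProject /recursive
  tf checkin /comment:"Initial add from the old TFS project"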

The next thing was moving the work items.  Since TFS Integration Platform let me down, I decided to try the witadmin command line utility.  After exporting all of the work items out of TFS, I had to go through some clean up on the XML to get the import to work.  Or at least, that's what I thought.  Again, it was giving me success messages, but it wasn't working.  I think it must have partnered up with the TFS Integration Platform in an attempt to make my life miserable.  I won't say if their plan worked or not.

Finally, I decided to simply use Excel.  TFS integrates with Excel for managing work items.  If you've never used it before, it's less painful than using Team Explorer 2008, but only slightly.  To actually create the work items as new work items, I had to make some updates to them.  They were finally migrated though.  Fortunately, I didn't have to work with many.  Maybe 100 work items.

That was our migration.  While it was a fairly bumpy road, it's definitely been worthwhile in the long run.  I highly recommend moving into a TFS 2010 environment if you have the opportunity and the resources.

You'll notice that I didn't talk about connecting the SharePoint data into the new TFS 2010 instance.  You can do that through traditional SharePoint configuration.  You may have to use the stsadm command to attach the database.  SharePoint is out of the scope of what I want to talk about.  The information is out there, and if you have a SharePoint guru, you've got somebody with the knowledge to do it.
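
If it helps, that stsadm step usually boils down to attaching the restored content database.  This is a sketch; the URL and database names are placeholders:

  stsadm -o addcontentdb -url http://sharepointserver/sites/tfs -databasename WSS_Content_TFS -databaseserver NewSqlServer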

I also didn't talk about attaching the Reporting Services databases.  Again, once you've restored the databases, you'll have to go through the SSRS configuration to point it at them and import the encryption key you exported from the old instance.

Good luck!


It's...coming alive!

by Aaron 30. April 2010 09:56

So this is my first blog post.  Not ever, just on here.

I sat on this site and hosting for almost two years.  I kept saying, "I really need to get that done."  I'm still saying that right now actually, because this isn't the look that I want to keep, nor is this the content that I want to push to everybody.  I'm better than that, and you deserve better than that too.

This is going to be primarily a technical blog.  Sure, I'll post the occasional personal post, but I'll tag it as such so that you can ignore it or revel in the fact that I have a real personality.

There were a couple of things that prompted me to finally just put something on the internet for the public to read.  The first was an internet business that my wife was going to take on.  I want to discuss some of the things that I took part in (from a technical perspective, not from an abandoned-husband perspective) up until she decided that it was going to be too much effort for no gain.

The other thing was Jeremiah Peschka asking me to blog about my experience with a Team Foundation Server 2008 migration to 2010.  I promise, besides this post, that will be the first thing I post.  Hopefully sometime this weekend, but maybe early next week.

From a blog perspective, there were a couple of engines that I was considering using.  I settled on BlogEngine.NET after getting some feedback and doing a little research.  It seemed like it might be an easier engine for me to skin with a theme that I chose and will adapt.  Of course, there are a couple of things that I would like to change with the engine.  At least I wanted to.  I might not need to anymore, but we'll see once I get back into customizing the look and feel.

I chose Godaddy for my domain registration and hosting because I got to purchase the stemen.me domain.  That allows me to host a site for me, my wife (if she ever decides she wants to take advantage of it), my son, the family dog, and even the creepy kid down the street that eats nothing but saltines and ketchup.  If his last name were Stemen, that is.

There are some Windows hosting issues with Godaddy though.  One is that if there's a problem, their support is great, but their turnaround time kinda sucks.  The other is that subdomains point to virtual directories on the web root, if that's how you choose to set it up.  That wouldn't be a problem, except there's a strange issue where the subdirectory is still accessible from the subdomain.  For example, http://lucas.stemen.me is the URL I keep for my son's blog.  The subdomain points to a folder accessible under http://stemen.me/lucas.  Not a problem, except that http://lucas.stemen.me/lucas/ also works.  It shouldn't.  But it does.

This wasn't a problem until I started to set up this blog.  If I went to http://aaron.stemen.me/blog, I was redirected to http://aaron.stemen.me/aaron/blog/.  That isn't happening anymore because I have aaron.stemen.me pointing to the root.

I have three options: live with it, move on to a better hosting provider, or come up with a solution.  I actually think I'm going to come up with a solution.  While Godaddy isn't my favorite provider right now, they're reasonably priced, and I think I can solve it myself.  In fact, it'll be a blog post complete with source code.

The solution that I plan on implementing will be two-fold.  One part will be to make sure that incoming requests get redirected to the appropriate folder and not to the extended "wrong" URL.  The other part will be to make sure that URLs written out in the pages don't contain the subdomain's folder.  That was the other part of the problem that I forgot to mention: ASP.NET applications were saying that the application root (~/) resolved to /aaron/blog/ instead of /blog/.
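
To give you an idea of the shape of the first part (and this is not necessarily the fix I'll end up writing), if the host has the IIS URL Rewrite module available, a redirect rule in web.config could bounce the "wrong" URLs back to the right ones:

  <system.webServer>
    <rewrite>
      <rules>
        <!-- Redirect http://lucas.stemen.me/lucas/... back to http://lucas.stemen.me/... -->
        <rule name="StripSubdomainFolder" stopProcessing="true">
          <match url="^lucas/(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^lucas\.stemen\.me$" />
          </conditions>
          <action type="Redirect" url="http://lucas.stemen.me/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>

The application root (~/) resolution is the part that would still need code, and that's what the future post will cover.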

Please come back often, or subscribe to the RSS feed.  I may be visiting your blog too.  I'm a developer, I'm lazy, and I prefer to plagiarize the good work of others.  Of course, if I do that with anything of yours, you'll get credit (and a link) for it.  If you find that I copied your work or article but didn't give you credit, just let me know.  Sometimes I forget.

That's all for tonight.
