Donovan Brown

Technology blog

Do you, Release Management, take this feature, Deployment Slots, to be your DevOps partner?

Abstract

When I first started reading about Deployment Slots I had more questions than answers. My most obvious concern was what was swapped and what was not. The original documentation made statements like the following: “A slot that you intend to swap into production needs to be configured exactly as you want to have it in production.” This was a nonstarter for me. There is no way I intend to have my Dev, QA, or even my Staging code pointing at a production database.

I am sure we all agree that pointing Dev or QA at a production database is foolish, but some might argue that Staging is production code just waiting to be swapped into production. I would argue that there are times when this is not true. If the changes in Staging require “breaking” database schema changes, things fall apart quickly. Some companies address this issue by requiring that no release ever break the previous release; that way, if you need to quickly swap back to the previous version, your application will still be able to run. Nevertheless, I prefer Staging to point at a different database than Production until the moment I want to swap them. When I perform the swap, I don’t want my connection strings from Staging to follow me into Production. I need the Production connection strings to stick with the Production slot as I update the Virtual IP (VIP) address.

A feature introduced in version 0.8.10 of the Azure PowerShell Tools allows just that: the SlotStickyConnectionStringNames switch prevents connection strings from being moved during swap operations. With this new feature in place, I decided to see if I could combine Web Deployment Slots with Release Management. I will use Visual Studio Online (VSO) and Release Management Online (RMO) to manage the movement of the code from slot to slot, with database changes being promoted with SSDT via Proxy Servers. A Proxy Server is a machine that sits between RMO and the target compute instance (PaaS Website, Cloud Service, Linux machine, etc.).
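As a quick preview of the feature that makes this whole approach viable, marking a connection string as slot-sticky is a single command. This is a sketch; the site name and connection string name below are placeholders, and the command is covered in context later in the walkthrough.

```powershell
# Mark the named connection string as "sticky" so it stays with the
# production slot during a swap instead of traveling with the code.
# "mysite" and "DemoDBEntities" are placeholder names.
Set-AzureWebsite -Name mysite -SlotStickyConnectionStringNames @("DemoDBEntities")
```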
I will deploy to the first slot and simply swap my way into production.

Outline

[  ] Add PowerShell to project
[  ] Configure build to create package
[  ] Create Azure Website with Dev and QA slots
[  ] Create 3 Azure SQL Databases
[  ] Create 3 Azure IaaS Proxy VMs (install SSDT, Web Deploy, Azure SDK)
[  ] Add Azure subscription to RM
[  ] Configure environments in RM
[  ] Create release path in RM
[  ] Create components in RM
[  ] Create release template in RM
[  ] Trigger build

Application

The application I am deploying is my People Tracker application, which I first introduced at TechEd North America 2014. It is a simple ASP.NET MVC application backed by SQL Server. I am using Entity Framework Database First with SSDT to deploy. A key point to be aware of is that your SSDT database project must be configured to target Microsoft Azure SQL Database.

My goal is to set up an Azure Website with two deployment slots, Dev and QA. Each slot of the website will be backed by an Azure SQL Database (Production, Dev, and QA).

In the MVC project I create a Configuration folder to hold the PowerShell scripts I need to deploy the database, deploy the website, and swap the slots. It is very important that you set Copy to Output for each of these files to Copy always. This ensures they are copied to the drop location of your build. Below you will find the three files required to deploy using Deployment Slots. Make sure you use -Verbose to produce command output.

All the configuration variables we add to Release Management, along with several system variables, will be available to our PowerShell scripts. $AzureWebsiteName is a global configuration variable I will set under the Administration tab. $Slot is a component-level configuration variable. Finally, $applicationPath and $PackageName are system variables automatically passed in by Release Management. The Publish-AzureWebsiteProject cmdlet copies the files from the drop location to the provided slot.
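Because these variables are injected into the script's session by Release Management rather than declared by the script itself, a useful first step when debugging a stage is to echo them into the deployment log. A minimal sketch, assuming the variables described above are present:

```powershell
# Echo the values Release Management injected so they appear in the stage log.
# -Verbose forces the messages to display regardless of preference settings.
Write-Verbose "AzureWebsiteName: $AzureWebsiteName" -Verbose
Write-Verbose "Slot:             $Slot" -Verbose
Write-Verbose "applicationPath:  $applicationPath" -Verbose
```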
The deployWebSite.ps1 script will only be used during the first stage of our release path.

Publish-AzureWebsiteProject -Name $AzureWebsiteName -Slot $Slot -Package "$applicationPath\_PublishedWebsites\$($ProjectName)_Package\$" -Verbose

To move the code the rest of the way we are simply going to swap it from slot to slot. Make sure you don’t forget -Force; if you do, the cmdlet will fail because it will try to display a dialog asking for confirmation.

Switch-AzureWebsiteSlot -Name $AzureWebsiteName -Slot1 $From -Slot2 $To -Force -Verbose

Finally, we have the PowerShell script that calls SqlPackage.exe to deploy our SSDT dacpac.

& 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe' /Action:Publish /SourceFile:$applicationPath\$FileName /TargetConnectionString:"Data Source=$SqlServer;User ID=$UserID;Password=$Password;Initial Catalog=$DatabaseName"

Now check the code in to VSO. With the solution ready we can turn our attention to the build.

Build

Thanks to the integration of VSO and RMO, you can use the out-of-the-box build process template to trigger a release from a build. The only thing we need to do is add arguments to pass to MSBuild so it creates a package for our web project. Simply set the value of MSBuild Arguments to ‘/p:DeployOnBuild=True’.

With the solution and build configured, we need to create the environments in Azure to deploy to.

Resources

Below are the contents of the resource group I created to put this demo together. The new Resource Group feature of the Preview Portal makes it very easy to see all the related resources. As you can see, it is more than you might first expect. This demo requires a website with two deployment slots, three IaaS virtual machines, three Azure SQL Databases, a virtual network, and a storage account. Your first question might be: if we are using PaaS Websites, why do I have three IaaS virtual machines?
When working with PaaS you need a machine from which you can execute your PowerShell, because you cannot connect to the compute instance for your website. Because Release Management currently has a restriction where a server used in an Azure deployment can only appear in one environment, we are required to create an IaaS virtual machine to act as a Proxy Server for each slot. I have been assured a better solution is coming. An alternative is to forego RMO, use an on-premises installation of Release Management, and use a single agent-based proxy for all the stages.

Azure SQL Databases

I had to select a service tier of at least Standard to get my deployments to work; Basic would always time out. If you intend to connect to your Azure SQL Databases from your development machine, be sure to add your machine's IP address to the database firewall. Otherwise you will never be able to connect.

Azure Websites

Deployment slots are only available in the Standard web hosting plan mode, so make sure you select Standard or you will have to upgrade later. After you have created the Dev and QA slots, make sure you set the desired connection strings for each slot, pointing to the correct Azure SQL Database. Because I am using Entity Framework Database First (the same would be true for Model First), I had to select Custom as the connection string type when setting the connection strings for the slots.

Now we have come to the most important point of this post. I have to use the Set-AzureWebsite cmdlet with the SlotStickyConnectionStringNames switch to make sure the connection strings stick to the slot during a swap.

Set-AzureWebsite -Name mysite -SlotStickyConnectionStringNames @("DemoDBEntities")

You only have to run this command on the production slot. The other slots will share the sticky connection string name settings.

Proxy Servers

We are now at the oddest part of this demo. We cannot connect directly to a PaaS website to install an agent or target it with DSC.
So we have to use another machine to execute our PowerShell scripts, using the cmdlets from the Azure SDK, to deploy our websites and swap our slots. These machines are called Proxy Servers. They sit between RM and the websites and provide a place for us to execute our scripts.

When I create virtual machines in Azure I prefer to first create the Cloud Service and Storage Account that will hold them, so that I can control the names. If you simply use the Virtual Machine wizard, it will create a Cloud Service and Storage Account with crazy names. You can store all three virtual machines in the same Cloud Service and Storage Account.

Proxy Servers do not have to be powerful machines; I created the smallest Windows Server 2012 R2 machine I could. You need to create one for each slot you intend to deploy to. Once the servers are provisioned and running, do yourself a favor and disable IE Enhanced Security Configuration.

Now we need to install the following components on each machine:

Microsoft Azure PowerShell with Microsoft Azure SDK
Microsoft SQL Server Data-Tier Application Framework (DACFx) (June 2014)
Web Deploy

Once the platform installer starts, you can click the back arrow, search for all the components, and install them all at once instead of one at a time.

Once the components are installed we need to configure the Azure PowerShell tools to connect them to your Azure subscription. To begin, run the Add-AzureAccount cmdlet. If you have more than one subscription connected to the account, you will need to set the default subscription. You can see how to do that here.

Deployment

Below is the flow of the code through the Deployment Slots. At time 0, you have simply created the PaaS Website with a Dev and QA slot and three empty Azure SQL Databases; all three slots show the default page. Once the release deploys to the first stage using the deployDb.ps1 and deployWebSite.ps1 files, the environments will look like this.
Now that the code has been copied to the Dev slot, all we have to do is update the QA database and swap the Dev and QA slots. At this point the QA slot will be the only slot that can be accessed. Moving to Production is very similar to moving to QA: simply update the database in Production and swap the QA and Production slots.

I think the images really drive home the fact that it is indeed a swap and not a copy from one environment to another. The next three images show the movement of V2 through the stages. The V2 version of the site has a column for Middle Name. If we elect to enforce the requirement that each release must be backwards compatible with the current version, both the Production and QA slots would be accessible at this point.

Release Management

We now have to connect Release Management to your Azure account. From the Administration tab click on “Manage Azure”. Click the New button to open the “New Azure Subscription” page. Use any name you like. You can locate your “Subscription ID” from the Azure Management Portal: select “Settings”, and the Settings page lists all your subscriptions with their Subscription IDs. Just copy and paste the GUID into Release Management.

Next you will need to get your “Management Certificate Key” from your publish settings file. Just save the file somewhere safe and open it with a text editor. Copy the ManagementCertificate value without the quotes and paste it into Release Management. Finally, you need to provide a Storage Account that Release Management will use to move files to Azure. You can use the same Storage Account we created to hold our virtual machines.

With your Azure account configured in Release Management, we can now create our environments. Click the “New vNext Azure” button on the Environments tab under “Configure Paths”. Then click “Link Azure Environment”. Select your subscription, select the storage account we created, and click Link.
Now click "Link Azure Servers", select our virtual machine, and click Link. You can now Save & Close the environment. You need to repeat this for all three Proxy Servers.

Next we have to create the “vNext Release Path” using our new environments. Using the environments we just created, add as many stages as you need (one for each slot). The bits of our build will be copied to these machines, and the PowerShell will be executed from them. The PowerShell will publish our application to the Azure Website, deploy our database changes, and swap our deployment slots.

Next we need to define a component. I decided to create a component for each PowerShell script I am going to run (DB, Swap, and Website). From the “Configure Apps” tab select “Components”. Click “New vNext”, give it a name, and set the “Path to package” to “\”. You will also need to create some configuration variables. There is one configuration variable that I need in several scripts, so I decided to define it at the global level under Administration Settings.

The final step is creating a “vNext Release Template”. Click “New” from the “vNext Release Templates” page. Give it a name, select the release path we just created, set the build definition to the correct build, and then click Create. Right-click on Components in the Toolbox and click Add. Link the components to the release template. Now drag “Deploy Using PS/DSC” to the Deployment Sequence. Select the server name from the dropdown. Enter the username you selected when you created the virtual machine in .\UserName format. Enter the password, select the component, set PSScriptPath to “Configuration\deployDb.ps1”, and set SkipCaCheck to “True”. Add a custom configuration variable for DatabaseName and set it to the Dev database name.

Now add another “Deploy Using PS/DSC” to the Deployment Sequence. Enter the username you selected when you created the virtual machine in .\UserName format.
Enter the password, select the component, set PSScriptPath to “Configuration\deployWebSite.ps1”, and set SkipCaCheck to “True”. Add a custom configuration variable for Slot and set it to the Dev slot name.

The QA and Production stages are very similar, except that instead of using the deployWebSite.ps1 script in the second action you are going to use the swapSlots.ps1 file and add Slot1 and Slot2 configuration variables.

QA Stage

Production Stage

Now simply trigger a build and your release will begin.

deployDb.ps1 (246.00 bytes)
deployWebSite.ps1 (163.00 bytes)
swapSlots.ps1 (174.00 bytes)
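For reference, swapSlots.ps1 presumably amounts to little more than the swap call shown earlier. A minimal sketch of what the attached file might contain (the actual download may differ; the variable names follow the script listing earlier, so match them to the configuration variables you define on each stage):

```powershell
# Minimal sketch of swapSlots.ps1.
# $AzureWebsiteName, $From, and $To arrive as Release Management
# configuration variables injected into the script's session.
Switch-AzureWebsiteSlot -Name $AzureWebsiteName -Slot1 $From -Slot2 $To -Force -Verbose
```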

How to connect SQL Server Data Tools 2012 to TFS 2013

Problem: I installed SQL Server 2012 Standard Edition with SP1, including SQL Server Data Tools, and I need to version my projects in TFS 2013.

Solution: The IDE for SQL Server Data Tools is the Visual Studio 2010 Shell. By installing Microsoft Visual Studio Team Explorer 2010, Microsoft Visual Studio 2010 Service Pack 1, and the Visual Studio 2010 SP1 Team Foundation Server Compatibility GDR, you will be able to connect to TFS 2013. You can read more about the compatibility between Team Foundation clients and Team Foundation Server here. Please note that when you connect to a more recent version of TFS than that of the client, you only have access to the features supported by your client; you cannot access any features that Visual Studio 2010 does not support.

Neptune (personal SMTP testing server)

Updated! There is a new version.

While working on a current project I found myself faced with testing code that sends email. In the past I would end up with an inbox full of test messages, or unhappy customers wondering why they had just received a flood of emails from my site. I was also frustrated that there was no way to easily automate this testing.

Enter Neptune. Neptune is an SMTP development server targeted for use in automated testing. I simply asked the question, “What if I had an SMTP server that did not relay messages and allowed me to query for messages and their content?” I would be able to use a server of this type as my SMTP server during testing and write custom plug-ins, validation rules, and extraction rules to communicate with the server.

The goals of Neptune were to facilitate automated testing and to be easy to use. I did not want to have to install a service or understand everything there is to know about running an SMTP server. I simply want to start Neptune and run my tests.
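That usage model (start Neptune, point your code's SMTP settings at it, run the test) can be exercised from PowerShell as a quick smoke test. A hedged sketch, assuming Neptune is listening on localhost port 25; the addresses are placeholders:

```powershell
# Send a test message to a locally running Neptune instance.
# Assumes Neptune is listening on localhost:25; adjust if configured otherwise.
Send-MailMessage -SmtpServer "localhost" -Port 25 `
    -From "test@example.com" -To "inbox@example.com" `
    -Subject "Neptune smoke test" -Body "Hello from the test run"
```

Because Neptune does not relay, the message goes nowhere; it simply becomes queryable for the test to inspect.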

Full documentation on how to use Neptune is provided in the MSI, which you can download here. NeptuneSetup.msi (1.27 mb) Original version

Neptune with POP3 NeptuneSetup.msi (1.40 mb) Newer version

If you find Neptune useful, feel free to Donate for future development.


Comments (12) -

  • Karim Kameka

    10/20/2008 7:24:06 AM |

    This tool is awesome and has allowed me to test my code with all the email portions on.  In the past I'd have to comment this code out and then remember to uncomment when we went live.  Also allows me to narrow down the problem to my code, some setting or the email server quickly and easily as this was just taking shots in the dark in the past.  Thanks!

  • Abel

    10/20/2008 8:10:58 AM |

    Awesome tool.  I just downloaded it and it does everything you said.  I was dreading setting up an smtp server and I really didn't want to stuff my mailbox full of test messages.

    Cool stuff.


  • Colin

    10/21/2008 3:35:53 AM |

    thanks, D.  This will solve some my problems and allow me to clean up my code a little.

  • Scott Allender

    10/21/2008 5:08:13 AM |

    slick idea.  i'll definitely be playing with it tonight.

  • David Kemp

    10/22/2008 1:12:40 AM |

    How do I get this to work in a Continuous Integration environment?

  • brady gaster

    10/22/2008 1:38:35 AM |

    DKemp - I would say use Process.Start to fire the app up. Then send your emails.
    DBrown - GREAT idea, sir. I'll be using this stat. Mui Bueno!

  • matthew

    10/22/2008 11:16:48 AM |

    Any chance of this showing up on codeplex (or the like) soon?

  • Simone

    10/27/2008 3:43:45 AM |

    That's awesome tool...
    till now I had a debugging configuration that automatically changed the email of the recipient to the test one so that users are not flooded by testing emails.. but this is completely transparent: just change the SMTP and everything works.

  • biggie9385

    1/12/2009 7:56:02 PM |

    If I put Neptune in the same machine with our test web server and have it point to, can I access Neptune from my desktop (this is a different machine from the web server) and run webtests?

  • smehaffie

    3/17/2009 5:06:39 AM |

    Are you considering posting this to codeplex?  Are you planning on releasing an upgrade?  If the answer is no to any of the questions above, would there be any way to get the source code for this?  The reason I am asking is that I would like to add some features to this.  This is great for automated/unit testing, but it would also be nice to have some additional features so it could be used for general testing.

    1) Ability to persist the messages to disk.  That way you can send the message to other who might want to see them (project managers, BU, etc).  This would also include deleting persisted messages.

    2) Ability to view the email messages.  When I am writing new functionality I want to be able to open the emails I just created to see the content.

    3) Ability to configure the application to load on startup.

    4) UI to make editing the config files easier.


  • EDI Services

    3/28/2009 7:10:30 PM |

    I just make a dummy email that way neither my customers or I are affected by the influx of a thousands emails. But Neptune sounds like a very good idea.

  • Paul Kohler

    5/21/2009 1:32:55 PM |

    Very nice.
    I like the idea of unit testing via the admin port, clever.
    PK  Wink

Pingbacks and trackbacks (5)+

Comments are closed