Using PowerShell to Manage Windows Azure Applications

I was recently asked by the Charlotte PowerShell users group to do a presentation on managing Windows Azure applications with PowerShell. At first I was a bit unsure about doing the presentation – with Jim Christopher, Ed Wilson, and other PowerShell gurus in the group, it can be tough to ramp up to a respectable level of knowledge. But these guys are awesome, and Ed welcomed me to pen a blog post for them, so check it out on the Hey, Scripting Guy! blog. I learned a ton in the process.

While I knew the Azure PowerShell cmdlets cover virtually everything you can do via the management interface, getting hands-on and really putting them through their paces still wows me at what you can do with PowerShell. In the blog post, a sample website is deployed to two different datacenters and then load balanced geographically using the Traffic Manager. Instances can be easily altered, storage accounts created, and so on – and then the whole thing is torn down with only a few lines of code. All in all, it integrates really nicely with any application lifecycle process. Check out the post and try the code out for yourself!

Windows Azure Dev Camps Soon!

It’s that time – Windows Azure Dev Camps are coming really soon. Here’s the schedule:

May 24th, 2012 – Alpharetta, GA (Register)
May 30th, 2012 – Reston, VA (Register)
May 31st, 2012 – Iselin, NJ (Register)

We’re pretty excited to mix up the format a little, with some time to jump into some new areas we haven’t typically talked about in our previous shows:

1. The Azure Platform – An Overview (60 minutes). Let’s start off the day with a dive into Windows Azure. We’ll talk about what Windows Azure offers, from hosting applications to durable storage. We’ll look at Windows Azure role types, hosting web applications and worker processes. We’ll also cover durable storage options, both the traditional relational database offered as SQL Azure and the more cloud-centric offerings in Windows Azure Storage for files, semi-structured data, and queues.

2. Hands on @home with Azure (120 minutes). For this hands-on portion of the day, we’ll work on the @home with Windows Azure project. The @home project will give you a solid understanding of using Windows Azure in a project that contributes back to Stanford’s Folding@home distributed computing project. We’ll walk through the code, provisioning an account, and getting the application deployed and running.

3. Caching – A Scalable Middle Tier (45 minutes). Creating a stateless application is a difficult but fundamental aspect of building a scalable application in the cloud. In this session, we’ll talk about the Windows Azure Cache service and using it as a middle tier to maintain state and cache objects that can be shared by multiple instances.

4. SQL Azure, Data Sync, and Reporting (45 minutes). SQL Azure offers a scalable database as a service without having to configure and maintain hardware. We’ll look at the subtle differences between on-premises SQL Server databases and SQL Azure, and how Data Sync can be used to synchronize data between multiple databases both in the cloud and on premises. We’ll also look at SQL Azure Reporting.

5. Windows 8 and Azure – Better Together (60 minutes). The Consumer Preview of Windows 8 is out, and it’s the perfect time to ramp up on developing native Metro-style applications. In this session, we’ll give an overview of Windows 8 and cover delivering a richer user experience by leveraging a cloud backend.

Windows Azure Trust Center

The Windows Azure team recently posted about the Windows Azure Trust Center. One of the most frequent conversations when discussing moving applications to the cloud revolves around security and compliance, and it’s also one of the most challenging conversations to have. What makes it particularly challenging is that responsibility for compliance is typically shared across the hardware, the platform, and the software. The site has sections that drill down into security, privacy, and compliance information – definitely good material to refer to when evaluating a move into the cloud!

Folding@home with the SMP Client in Windows Azure

Want to run @home with Azure for another team, or use a more powerful CPU? For the true geeks out there, running the Folding@home client involves tweaking, high-performance computing, and knowing the difference between the GPU and CPU clients. We heard from a couple of folks about maximizing their Windows Azure usage, and Jim made some changes to the client piece to accommodate. In truth, we did a little of this the last time we ran @home, but we didn’t draw much attention to it for fear it would just add confusion – so consider this info optional, and not necessary to do @home.

First, when setting up the 90-day trial, accounts have a $0 spending limit – which means that unless you intentionally disable the cap, you will never be charged a dime. It also means your account will be shut down when you reach your monthly usage quota. The 90-day trial allows 750 compute hours per month, which is one small instance running 24x7. If you’d like, you can run either 8 single-core instances or a single 8-core instance; however, you’ll burn 192 hours per day and exhaust the limit in about 4 days. You could also run a dual-core instance for half a month. Visual Studio Ultimate MSDN subscribers, though, receive 1,500 hours per month, so you can run either 2 instances of @home or – preferably – a single dual-core, or a quad-core for half a month. It’s better for @home, and here’s why: Folding@home rewards speed over quantity. The faster a work unit is completed, the more points are awarded. Consequently, you and your team (by default, the Windows Azure team) do better!

To do this, you first need a passkey. A passkey acts as a unique identifier for a user and is required to get these bonus points. In the project properties, you can add the passkey and specify the team number. 184157 is the Windows Azure team, but you can change this if you’re already on another team.

Next, if you have downloaded the bits already, you might need to re-download them. To know for sure, check whether you have both clients in the client folder of the AtHomeWebRole project – specifically, you want to see FAH6.34-win32-SMP.exe. If you don’t have both executables, re-download the solution from the get started page.

Within the project properties, you can now configure the app to use whichever VM size is most appropriate for you. The larger VM will run faster and accumulate more points, but will get shut down sooner. If you aren’t on a trial or don’t have the spending cap in place, monitor your usage carefully and be sure you’re staying within the plan limits – or be willing to pay for the usage!

That’s all there is to it! Make sure your storage account is configured per the instructions – if you had a deployment already, the new code will start and run automatically, as it will pick up the settings from the storage account.

So what kind of results can you expect? My main user – primarily using Azure, but with some folding from my home 4-core i5 box – is pulling in an average of 146 points per WU (and 146 points per CPU). This is actually a tiny bit better than it should be, because my home machine folds at a much higher rate and contributes to this account. I then deployed some 8-core and a few 4-core instances with a different account and a different passkey, and that account is pulling almost 4,000 points per WU! If we assume they were all 8-core boxes (which they weren’t, so these numbers are unfavorable), dividing that by 8 gives 474 points per CPU per WU. The bottom line: CPUs working together pull significantly more points than CPUs working alone.

See? I told you this was geeky stuff. In any event, Folding@home is a fantastic project to contribute to and, hopefully, a fun way to learn Windows Azure in the process. Finally, if you’re not using Windows Vista, 7, or Server 2008, or otherwise just want to download the package files directly, you can do that by reading the instructions here!

Debugging and Troubleshooting in the Cloud

Thursday, April 5th, at noon, we’ll be having our second-to-last @home webcast, this time focusing on debugging and diagnostics in the cloud. While a lot of what we show is in the context of our @home app, much of what we’ll be doing is fairly general in nature, especially some of the diagnostics material we’ll be covering this week. From this week’s abstract:

In this third webcast episode, we talk about debugging your application. We look at debugging locally and how the emulator works for local development, and we talk about configuring diagnostic data to capture logs and performance counters. For the especially tricky troubleshooting issues, we discuss IntelliTrace, an advanced debugging tool, to gather more information about your application – essentially building a timeline of events that can be examined to quickly find the root of a problem. We also look at remote desktop options for troubleshooting.

We’ll talk with you then!

Getting Started with the Windows Azure Cache

Windows Azure has a great caching service that allows applications (whether or not they are hosted in Azure) to share an in-memory cache as a middle tier service. If you followed the ol’ Velocity project, then you’re likely aware it was a distributed cache service you could install on Windows Server to build out a middle tier cache. It was ultimately rolled into the Windows Server AppFabric, and is (with a few exceptions) the same thing offered in Windows Azure.

The problem with a traditional in-memory cache (such as the ASP.NET Cache) is that it doesn’t scale – each instance of an application maintains its own version of a cached object. While this has a huge speed advantage, making sure data is not stale across instances is difficult. Awhile back, I wrote a series of posts on how to do this in Windows Azure, using the internal HTTP endpoints as a means of syncing cache. On the flip side, the problem with building a middle tier cache is the maintenance and hardware overhead, and it introduces another point of failure in an application. Offering the cache as a service alleviates the maintenance and scalability concerns. The Windows Azure cache offers the best of both worlds by providing the in-memory cache as a service, without the maintenance overhead. Out of the box, there are providers for both the cache and session state (the session state provider, though, requires .NET 4.0).

To get started using the Windows Azure cache, we’ll configure a namespace via the Azure portal. This is done the same way as setting up a namespace for Access Control and the Service Bus. Selecting New (upper left) allows you to configure a new namespace – in this case, we’ll do it just for caching. Just like setting up a hosted service, we’ll pick a namespace (in this case, ‘evangelism’) and a location. Obviously, you’d pick a region closest to your application. We also need to select a cache size. The cache will manage its size by flushing the least-used objects when under memory pressure. To make setting up the application easier, there’s a “View Client Configuration” button that creates cut-and-paste settings for the web.config.

In the web application, you’ll need to add a reference to Microsoft.ApplicationServer.Caching.Client and Microsoft.ApplicationServer.Caching.Core. If you’re using the cache for session state, you’ll also need to reference Microsoft.Web.DistributedCache (requires .NET 4.0), and no additional changes (outside of the web.config) need to be made. Using the cache is straightforward:

using (DataCacheFactory dataCacheFactory = new DataCacheFactory())
{
    DataCache dataCache = dataCacheFactory.GetDefaultCache();
    dataCache.Add("somekey", "someobject", TimeSpan.FromMinutes(10));
}

If you look at some of the overloads, you’ll see that some features aren’t supported in Azure.
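Building on that snippet, the typical way to use the cache is the cache-aside pattern: check the cache first and fall back to the data store on a miss. Here’s a minimal sketch under stated assumptions – the Product class and LoadProductFromDatabase method are hypothetical stand-ins for your own types and data access, not part of the Azure API:

using System;
using Microsoft.ApplicationServer.Caching;

[Serializable]   // objects must be serializable to be stored in the cache
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductCache
{
    // DataCacheFactory reads the dataCacheClient settings from web.config
    // and is expensive to create, so hold on to a single instance.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();

    public Product GetProduct(int productId)
    {
        DataCache cache = Factory.GetDefaultCache();
        string key = "product:" + productId;

        // Get returns null on a cache miss.
        Product product = cache.Get(key) as Product;
        if (product == null)
        {
            product = LoadProductFromDatabase(productId);

            // Put upserts, so two instances racing on the same miss won't
            // fail the way Add (which throws on duplicate keys) would.
            cache.Put(key, product, TimeSpan.FromMinutes(10));
        }
        return product;
    }

    private Product LoadProductFromDatabase(int productId)
    {
        // Hypothetical data access - substitute your real query here.
        return new Product { Id = productId, Name = "Sample" };
    }
}

Note the use of Put instead of Add in the fallback path: since multiple instances share the cache, any of them may repopulate the same key at the same time.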
That’s it! Of course, the big question is: what does it cost? The standard pay-as-you-go pricing for caching, at the time of this writing, is:

128 MB cache for $45.00/mo
256 MB cache for $55.00/mo
512 MB cache for $75.00/mo
1 GB cache for $110.00/mo
2 GB cache for $180.00/mo
4 GB cache for $325.00/mo

One additional tip: if you’re using the session state provider locally in the development emulator with multiple instances of the application, be sure to add an applicationName to the session state provider:

<sessionState mode="Custom" customProvider="AppFabricCacheSessionStoreProvider">
  <providers>
    <add name="AppFabricCacheSessionStoreProvider"
         type="Microsoft.Web.DistributedCache.DistributedCacheSessionStateStoreProvider, Microsoft.Web.DistributedCache"
         cacheName="default"
         useBlobMode="true"
         dataCacheClientName="default"
         applicationName="SessionApp"/>
  </providers>
</sessionState>

The reason is that each website, when running locally in IIS, generates a separate session identifier per site. Adding the applicationName ensures the session state is shared across all instances. Happy Caching!

Antimalware in Windows Azure

On most (or perhaps even all?) of the production servers I’ve worked on, antivirus/antimalware detection apps are often not installed, for a variety of reasons: performance, the risk of false positives or certain processes getting shut down unexpectedly, or the simple fact that most production machines are under strict access control and deployment restrictions. Still, it’s a nice option to have, and it’s now possible to set this up easily in Windows Azure roles. Somewhat quietly, the team released a CTP of Microsoft Endpoint Protection for Windows Azure, a plug-in that makes it straightforward to configure your Azure roles to automatically install and configure the Microsoft Endpoint Protection (MEP) software. The download includes the necessary APIs to make it simple to configure. Upon initial startup of the VM, the Microsoft Endpoint Protection software is installed and configured, downloading the binaries from Windows Azure storage in a datacenter of your choosing. Note: *you* don’t have to store anything in Windows Azure Storage; rather, the binaries are kept at each datacenter so the download is fast and bandwidth-free, provided you pick the datacenter your app resides in.

So, to get started, I’ve downloaded and installed the MSI package from the site. Next, I’ve added the antimalware module to the ServiceDefinition file like so:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MEP" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1" vmsize="ExtraSmall">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Antimalware" />
      <Import moduleName="Diagnostics" />
      <Import moduleName="RemoteAccess" />
      <Import moduleName="RemoteForwarder" />
    </Imports>
  </WebRole>
</ServiceDefinition>

Specifically, I added Antimalware to the <Imports> section. The other modules are for diagnostics (not strictly needed, but useful, as you’ll see in a bit) and remote access, so we can log into the server via RDP. Next, the ServiceConfiguration will configure a bunch of options.
Each setting is spelled out in the document on the download page:

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="MEP" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="xxx" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation" value="North Central US" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans" value="1" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans" value="120" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions" value="txt|log" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths" value="e:\approot\custom" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses" value="d:\program files\app.exe" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="xxx" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="xxx" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2013-03-21T23:59:59.000-04:00" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" />
    </ConfigurationSettings>
    <Certificates>
      <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="xxx" thumbprintAlgorithm="sha1" />
    </Certificates>
  </Role>
</ServiceConfiguration>

Many of these settings are self-explanatory, but essentially we’re setting up weekly scans at 2am on Sunday (day 1, 120 minutes after midnight), excluding app.exe and everything in e:\approot\custom, and skipping txt and log files. The MEP bits will be pulled from the North Central US datacenter. It’s not a big deal if your app is outside of North Central – it’s just that the install takes a few moments longer (the default is South Central). (And, technically, since bandwidth going into the datacenter is currently free, the bandwidth isn’t an issue.) If we log into the box (the role must be RDP-enabled to do this), we’ll see these settings reflected in MEP.
In MEP, we can see the weekly scan schedule, the app.exe process exclusion, and the txt/log extension exclusions, all matching the configuration above.

Finally, we can also set up the Windows Azure Diagnostics agent to transfer relevant event log entries to storage – in this example, we’re adding the antimalware entries explicitly, though getting verbose information like this is probably not desirable:

private void ConfigureDiagnosticMonitor()
{
    DiagnosticMonitorConfiguration diagnosticMonitorConfiguration =
        DiagnosticMonitor.GetDefaultInitialConfiguration();

    diagnosticMonitorConfiguration.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1d);
    diagnosticMonitorConfiguration.Directories.BufferQuotaInMB = 100;

    diagnosticMonitorConfiguration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1d);
    diagnosticMonitorConfiguration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

    diagnosticMonitorConfiguration.WindowsEventLog.DataSources.Add("Application!*");
    diagnosticMonitorConfiguration.WindowsEventLog.DataSources.Add("System!*");
    diagnosticMonitorConfiguration.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1d);

    // Antimalware settings:
    diagnosticMonitorConfiguration.WindowsEventLog.DataSources.Add(
        "System!*[System[Provider[@Name='Microsoft Antimalware']]]");
    diagnosticMonitorConfiguration.WindowsEventLog.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1d);

    PerformanceCounterConfiguration performanceCounterConfiguration = new PerformanceCounterConfiguration();
    performanceCounterConfiguration.CounterSpecifier = @"\Processor(_Total)\% Processor Time";
    performanceCounterConfiguration.SampleRate = System.TimeSpan.FromSeconds(10d);
    diagnosticMonitorConfiguration.PerformanceCounters.DataSources.Add(performanceCounterConfiguration);
    diagnosticMonitorConfiguration.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1d);

    // wadConnectionString holds the name of the diagnostics connection string
    // setting from the ServiceConfiguration above, e.g.
    // "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString".
    DiagnosticMonitor.Start(wadConnectionString, diagnosticMonitorConfiguration);
}

To filter the event logs from MEP, we can add some filtering like so (adding Levels 1, 2, and 3 to the filter so we skip the verbose Level 4 entries):

diagnosticMonitorConfiguration.WindowsEventLog.DataSources.Add(
    "System!*[System[Provider[@Name='Microsoft Antimalware'] and (Level=1 or Level=2 or Level=3)]]");

After deploying the role and waiting a few minutes, the entries are written into Azure table storage, in the WADWindowsEventLogsTable. In this case, I’m looking at them using Cloud Storage Studio (although, for diagnostics and performance counters, their Azure Diagnostics Manager product is fantastic for this kind of thing). While not everyone needs or desires this functionality, it’s a great option to have (particularly if the system is part of a file intake or distribution system).
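As a side note, if you’d rather pull those entries programmatically than through a tool, here’s a minimal sketch using the 1.x SDK StorageClient library. The EventLogEntry class below is a hypothetical entity mapping only a few of the table’s columns, and the partition-key filter relies on the WAD convention of prefixing the event’s UTC tick count with "0":

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical entity with a few of the WAD event log columns.
public class EventLogEntry : TableServiceEntity
{
    public string ProviderName { get; set; }
    public int Level { get; set; }
    public string Description { get; set; }
}

public class WadLogReader
{
    public static void DumpAntimalwareEvents(string connectionString)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudTableClient tableClient = account.CreateCloudTableClient();
        TableServiceContext context = tableClient.GetDataServiceContext();

        // WAD partition keys are "0" + the event time's ticks (UTC), so a
        // lexical comparison doubles as a time filter (last hour, here).
        string fromTicks = "0" + DateTime.UtcNow.AddHours(-1).Ticks;

        var query = context.CreateQuery<EventLogEntry>("WADWindowsEventLogsTable")
            .Where(e => e.PartitionKey.CompareTo(fromTicks) >= 0)
            .AsTableServiceQuery();   // handles continuation tokens

        // Filter to the antimalware provider client-side.
        foreach (EventLogEntry e in query.ToList()
            .Where(e => e.ProviderName == "Microsoft Antimalware"))
        {
            Console.WriteLine("{0} (level {1}): {2}", e.ProviderName, e.Level, e.Description);
        }
    }
}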

Webcast: Intro to @home with Windows Azure

Tomorrow (Thursday, 3/15/2012) at noon ET or 9am PT, we have our first webcast in the @home series: an introduction to the @home distributed computing project! This is the first in a series where we’ll dive into various aspects of Windows Azure – in this first webcast, we’ll keep it 100-level, discussing the platform, how to get started, and what the project is about. From the abstract page:

In this 100-level webcast, we introduce Windows Azure. We look at signing up for a new account, evaluate the offers, and give you a tour of the platform and what it's all about. Throughout this workshop, we use a real-world application that uses Windows Azure compute cycles to contribute back to Stanford's Folding@home distributed computing project. We walk through the application, how it works in a Windows Azure virtual machine and makes use of Windows Azure storage, and deploying and monitoring the solution in the cloud.

If you can’t make this one, be sure to check out the rest of the series by watching the @home website – we’ll be diving deeper into various features as the weeks progress, and we’ll post links to the recordings as they become available.

Launching “Learn the Cloud, Make a Difference” DC Effort

Two years ago, Jim O’Neil and I developed a quick Azure training program called “@home with Windows Azure” – a way to learn Windows Azure and have some fun contributing to a well-known distributed computing effort, Folding@home. A few months later, Peter Laudati joined the cloud team, and we developed the game RockPaperAzure. RockPaperAzure was a lot of fun and is still active, but we decided to re-launch the @home with Windows Azure project because of all the changes in the cloud since that 2010 effort.

So, having said all that, welcome to our “Learn the Cloud. Make a Difference” distributed computing project! It’s been updated, as you can see on the page – a much cleaner and nicer layout, maintaining our great stats from the 2010 effort, in which a cumulative 6,200+ virtual machines completed 188k work units! (Of course, as happy as I am with those numbers, the Folding@home project has over 400k active CPUs delivering over 8 petaFLOPS of processing power!)

Stanford University’s Pande Lab has been sponsoring Folding@home for nearly 12 years, during which they’ve used the results of their protein folding simulations (running on thousands of machines worldwide) to provide insight into the causes of diseases such as Alzheimer’s, Mad Cow disease, ALS, and some cancer-related syndromes. When you participate in @home with Windows Azure, you’ll leverage a free 3-month Windows Azure trial (or your MSDN benefits) to deploy Stanford’s Folding@home application to Windows Azure, where it will execute the protein folding simulations in the cloud, thus contributing to the research effort. Additionally, Microsoft is donating $10 (up to a maximum of $5,000) to Stanford’s Pande Lab for everyone who participates.

We’ve provided a lot of information to get you started, including four short screencasts that will lead you through the process of getting an Azure account, downloading the @home with Windows Azure software, and deploying it to the cloud. And we won’t stop there! We also have a series of webcasts planned to go into more detail about the application and other aspects of Windows Azure that we leveraged to make this effort possible. Here’s the webcast schedule – and of course, you can jump in at any time:

3/15/2012 12pm EDT – @home with Azure Overview
3/22/2012 12pm EDT – Windows Azure Roles
3/29/2012 12pm EDT – Azure Storage Options
4/05/2012 12pm EDT – Debugging in the Cloud
4/12/2012 12pm EDT – Async Cloud Patterns

One Azure Web Role, Multiple Websites

Windows Azure has been capable of running multiple websites in a single web role for some time now, but I recently found myself with 2 separate Azure solutions and was looking to combine them into a single deployment. Just like in IIS, this is most often done via host headers, so incoming requests can be forwarded to the correct site.

The Goal

The fine folks at Infragistics created a really cool Silverlight-based reporting dashboard for Worldmaps. Until now, each was running as its own Azure hosted service. Options to consolidate included folding the code into the Worldmaps site, which would involve actual work, or converting the solution to use IIS instead of the Hostable Web Core (HWC), which was originally the only way to host Azure deployments prior to version 1.3 of the SDK. Under IIS, host headers can be used to direct traffic to the desired site.

Preconsiderations

Inside the ServiceDefinition file, the <Sites> section is used to define the websites and virtual directories, like so:

<Sites>
  <Site name="Web" physicalDirectory="..\WorldmapsSite">
    <Bindings>
      <Binding name="HttpIn" endpointName="HttpIn" />
    </Bindings>
  </Site>
  <Site name="Reporting" physicalDirectory="..\..\igWorldmaps\WorldmapsDemo.Web">
    <Bindings>
      <Binding name="HttpIn" endpointName="HttpIn" hostHeader="reporting.myworldmaps.net" />
    </Bindings>
  </Site>
</Sites>

Nothing too crazy in there, but I’ll talk about the paths later. The first problem is that I was using the WebRole.cs file in the Worldmaps application, overriding the Run method to do some background work:

public class WebRole : RoleEntryPoint
{
    public override void Run()
    {
        // I'm doing stuff!
    }
}

The Run method is called from a different thread, and it did a lot of background processing for the site (logging data, drawing maps, etc.). This is a great technique, by the way, to add “workers” to your website. This is, by itself, not a problem under either IIS or HWC, except that under the HWC version, the thread runs in the same process. I could write to an in-memory queue from the website and process that queue in WebRole.cs without problem, provided the usual thread-safety rules are obeyed. Likewise, the worker could read/write an in-memory cache used by the website. Under IIS, though, the site and the role entry point run in separate processes, so it wasn’t possible to do this without re-engineering things a bit – for example, by moving the shared structures out of process, as in the sketch below. You don’t need to worry about this if you aren’t doing anything “shared” in your WebRole.cs file.
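To illustrate the kind of re-engineering involved, here’s a minimal sketch of replacing an in-memory queue with a Windows Azure storage queue, which works across process (and instance) boundaries. This assumes the 1.x SDK StorageClient library; the queue name and the "DataConnectionString" setting name are assumptions for illustration:

using System;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WebRole : RoleEntryPoint
{
    public override void Run()
    {
        // "DataConnectionString" is an assumed setting name, defined in
        // the ServiceConfiguration and pointing at a storage account.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("worklog");
        queue.CreateIfNotExist();

        while (true)
        {
            // The website enqueues work with queue.AddMessage(...);
            // this background thread drains the queue, even though IIS
            // hosts the site in a different process.
            CloudQueueMessage message = queue.GetMessage();
            if (message != null)
            {
                // Process the message (log data, draw maps, etc.),
                // then remove it so it isn't picked up again.
                queue.DeleteMessage(message);
            }
            else
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));
            }
        }
    }
}

The website side would enqueue with the same queue reference, and a shared in-memory cache could similarly move out of process – for example, to the Windows Azure Cache service described earlier.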
Add the Project

In my existing Worldmaps solution, I added the Infragistics WorldmapsReporting project by adding the project to the solution (right-click the solution and choose Add Existing Project).

Hook it Up

The <Sites> tag (seen above) is pretty self-explanatory, as it defines each site in the deployment. For the first and main site, I didn’t provide a host header because I want it to respond to pretty much anything (www, etc.). For the second site, I gave it the reporting.myworldmaps.net host header. Here’s the tricky part, which in retrospect seems so straightforward: the physicalDirectory path is the path to the web project, relative to the cloud project’s directory. When I first created the Worldmaps solution (WorldmapsCloudApp4 is when I converted it to .NET 4), I had the cloud project, the website itself, and a library project in the same directory. So, the path to WorldmapsSite is up one level. To get to the Infragistics website, it’s up two levels, then into the igWorldmaps folder and into the WorldmapsDemo.Web folder. We can ignore the other folders.

DNS

The project in Windows Azure is hosted at myworldmaps.cloudapp.net, but I own the myworldmaps.net domain. In my DNS, I added CNAMEs for both www and reporting, both pointing to the Azure myworldmaps.cloudapp.net URL (the exact steps will vary depending on who your DNS provider is).

Testing Locally

To test host headers on a local machine, you’d need to add the DNS names to your hosts file (C:\Windows\System32\drivers\etc\hosts), like so:

127.0.0.1    myworldmaps.net
127.0.0.1    www.myworldmaps.net
127.0.0.1    reporting.myworldmaps.net

Overall, this is a fairly straightforward and easy way to add existing websites to a single deployment. It can save money and/or increase reliability by running multiple instances of the deployment.

Links

http://www.myworldmaps.net
http://reporting.myworldmaps.net
