
What are your options to host your web application with Microsoft Azure?

Microsoft Azure is growing, and its services are seeing huge adoption. Currently, there are three types of Microsoft Azure services that our customers can use to host their web applications.

    1. Hosting your web application inside IIS, on an Azure Virtual Machine.
    2. Hosting as a Web Role using Cloud Services.
    3. Hosting as an Azure Website.

I’ve tried all of the above for my personal hosting needs, and I thought of putting together this little page that lists most (but not all) of the features associated with each service, and what you can do with them. I’m not going to suggest which one to use for your hosting needs, because these three offerings are created for specific sets of customers with specific sets of needs. Depending on how much you want to pay versus how much control you need, you could select one of them. Here is one more blog which compares the features available.

Hosting your web application inside IIS, from Azure Virtual Machines

This service offering gives you all the power that you had with your on-premises deployments. You own the VM, so you own the updates to the VM as well. You manage the host names inside IIS, and you have to set up the host headers within IIS, just like you do in your on-premises hosting. It’s your box: anything that runs perfectly on your corporate server should run here. So, if you have additional software that you need to install, and want to manage the Operating System updates yourself, this is it! Go for it. At the time of writing, a standard Medium VM instance (2 x 1.6GHz CPU, 3.5GB RAM, 490GB storage) will cost you $0.18/hr, or $133.92/month.

Hosting as a Web Role using Cloud Services

Hosting as a Web Role in a Cloud Service saves you from managing the Operating System updates. Still, you own the application – you can RDP into the VMs and do anything that you might do on an on-premises server. The only thing to remember is that these are stateless Virtual Machines allocated to you, so if you stop and start your deployment, you might be allocated a different VM, and so on. Here, you own the code, the application updates, and so forth, but leave the OS updates to Microsoft. At the time of writing, the same standard Medium VM instance (2 x 1.6GHz CPU, 3.5GB RAM, 490GB storage) will cost you $0.16/hr, or $119.04/month.

Hosting as an Azure Website

Azure Websites is another way that you can host your websites with Microsoft Azure. It is by far the easiest way to host websites, and you don’t have to worry about Operating System updates. To some extent, you also need not worry about your application updates, since there are ways to connect your deployment to your source control, or even to a Dropbox folder that syncs to production. It has one-click swap between production and staging slots, and many more cool features. The newest addition, WebJobs, adds background processing power to your websites. It’s like adding a new Worker Role to your existing Cloud Services offering, except that adding WebJobs doesn’t cost you anything extra, since they run on the same machine as your website. At the time of writing, the same standard Medium VM instance (2 x 1.6GHz CPU, 3.5GB RAM) will cost you $0.20/hr, or $148.80/month.

For all your pricing-related queries, try this page on the official site. There are various tier offerings you should take a look at to match your needs and your budget.

Hope this helps!

Azure Websites SCM site updated to have Single Sign-on

If you haven’t noticed, your Azure Website’s SCM site has been updated to work with Azure single sign-on. Earlier, it showed a basic authentication prompt where you entered your publishing credentials, but now you can just log in using the Microsoft Account that you use for your Azure subscription.

For example, if I go to my site https://<mysite>.scm.azurewebsites.net, it shows me the below:

[Screenshot: the Azure single sign-on page shown by the SCM site]

If I’m already logged into my Microsoft Account – say, I’m already on the Azure Management Portal – I don’t even see this when going to the SCM site; it takes me directly into it. However, if you need basic authentication, you can append /basicauth to the URL, like https://<yourwebsite>.scm.azurewebsites.net/basicauth, to get the basic authentication prompt.

Azure Websites – Viewing Site Configuration files from Kudu

Azure Websites, the PaaS offering from Microsoft, gives you more power. You can change a few of the default configurations that come with the platform by various means – for example, for PHP you can set a few settings in .user.ini, and for custom values in ApplicationHost.config you can use XDT transforms (a small sketch follows below). A lot of the time, you will want to check the default configuration files that are used for your Azure Website, and you can easily get those files from the Kudu Console. To access the Kudu Console, navigate to https://yoursite.scm.azurewebsites.net, and log in with your deployment credentials.
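As a quick aside on the XDT route: the transform is a small XML file (typically applicationHost.xdt under d:\home\site) using the standard XML-Document-Transform syntax. Here is a minimal sketch – the serverRuntime example is purely illustrative, and which sections the platform lets you touch is something to verify against the feature’s documentation:

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.webServer>
    <!-- Illustrative only: set one attribute on an existing section -->
    <serverRuntime xdt:Transform="SetAttributes(frequentHitThreshold)"
                   frequentHitThreshold="1" />
  </system.webServer>
</configuration>

Anyway, back to inspecting the defaults from Kudu.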

Once you are in the site, click on the “Debug console” menu, and choose “CMD” or “PowerShell”. Both open a file explorer, with a console window below. From the file explorer, click on the “Site Root” icon (the middle one), and then click on the “Config” folder.

[Screenshot: the Kudu debug console file explorer, with the Site Root icon and the Config folder]

This is the hidden place (well, not anymore) where you will find all the configuration files related to your website instance. There you have the applicationHost.config file, the rootweb.config file, and also the php.ini files for the various PHP versions available for the website.

You can either view a file right there by clicking on the edit icon, or download it by clicking on the download icon. Of course, the console below is powerful too – you can try your favorite command-line utility in there. For example, say I want to print out the last 10 lines of a configuration file.
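For instance, in the PowerShell flavour of the console, once you are inside the Config folder, something like this should do it (a sketch – swap in whichever file you are after):

Get-Content .\applicationhost.config -Tail 10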

[Screenshot: console output showing the last 10 lines of the configuration file]

And, yes, you can edit a few files right from Kudu as well.

Well, you get the point! Have fun.

Kudu for Azure Websites updated with ‘Process Explorer’ tab

I’m sure everyone appreciates the pace at which the Azure Websites team is releasing cool features. Azure Websites was all over the announcements at the recent //build. The team has updated the Kudu console with a new tab named ‘Process Explorer’; you will see it in the list of options available in the site. To access the Kudu console, go to https://yourwebsite.scm.azurewebsites.net (note the https, and the .scm in the URL).

[Screenshot: the Kudu console top bar with the new ‘Process Explorer’ tab]

If you have used the Kudu console before, you would have seen that there are REST APIs available for a lot of things, including “Processes and mini-dumps”, which – when used in Google Chrome with a JSON viewer extension – was an easy way to get mini-dumps of the w3wp.exe process, or a gcdump of the process. This new “Process Explorer” tab gives you a cool UI for doing the same.
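For reference, those process APIs live under the same SCM endpoint. The routes look roughly like this (deployment credentials required; check the Kudu project wiki for the authoritative list):

GET https://yourwebsite.scm.azurewebsites.net/api/processes             – list the processes
GET https://yourwebsite.scm.azurewebsites.net/api/processes/{id}        – details of one process
GET https://yourwebsite.scm.azurewebsites.net/api/processes/{id}/dump   – download a mini-dump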

[Screenshot: the Process Explorer tab listing the running processes]

It lists all the running processes under your site’s context – including the w3wp.exe that serves your main website (as well as the one for this Kudu site, and any site extensions like the Monaco editor), any WebJobs your site might have, and even the cmd.exe/powershell.exe that gets launched when you open the debug console. You can easily see things like the memory usage of the process, how many threads it has, the handles within the process, and more.

[Screenshot: process details showing memory usage, thread count, and handles]

Getting memory dumps of the worker process is one of the main post-mortem debugging techniques we often use in Microsoft Support while helping customers with common issues like hangs, slow responses, and memory leaks. It’s good to see this easy way to get dumps from the Kudu console.

Happy Debugging!

Site Extension Gallery in Windows Azure Websites

I’m no ScottGu or Scott Hanselman, but I had my own privilege of introducing the Site Extensions gallery to the world during my talk ‘Deepdive with Windows Azure Websites’, along with Puneet, at India’s first-ever Windows Azure Conference in Bangalore, March 20–21. David Ebbo was kind enough to let me do this – watch his blog/Twitter for more updates on this in the coming days. This gallery is part of the Kudu console of your Windows Azure Website. It’s like NuGet for Site Extensions. This should provide an amazing opportunity for people with diagnostics products, or helper consoles for websites, to get people using them with Azure Websites. Needless to say, you will hear more about this soon, but for now, this is how it appears.

[Screenshot: the Site Extensions gallery in the Kudu console]

There are only a few available right now, but I’m sure this list is going to grow. By the way, this feature is live! So log in to the SCM site of your Windows Azure Website, and you will see a new option in the top bar, ‘Site Extensions’. You can install a Site Extension listed there by just clicking “Install”. Once you have installed it, you have to restart the site to make it functional – there is a “Restart Site” button at the top. You will also see the one you installed in the “Installed” tab; from there, you can click the “Launch” button to see your new Site Extension in action. The Azure Storage Explorer site extension is really cool.

I’m sure I will write more about this soon. Come back again later. You can follow me on Twitter for quick updates.

WAWS - WebJob to upload FREB files to Azure Storage using the WebJobs SDK

After writing my earlier post on creating a simple WebJob to upload the Failed Request Tracing logs automatically to a Windows Azure Blob Storage account, I was discussing it with the awesome development team behind the WebJobs SDK, Amit Apple and Mike Stall. The outcome: my sample got modified to use the WebJobs SDK, which eases a lot of these tasks. And there is more to it – the Azure Jobs Dashboard for your Windows Azure Web Site gives you a cool view of your WebJobs messages getting processed.

With the WebJobs SDK, there is an automatic way of calling certain functions. You can check Scott’s blog, where he has a function that just monitors his Azure Blob Storage account for a new blob to be created, processes that image, and pushes it back to his Azure Blob Storage account. The code he has written is minimal – just one function, wrapped inside an application with the WebJobs SDK. If you look at his function, he used the attributes [BlobInput] and [BlobOutput]. However, in my case – pushing files from the file system to Azure Blob Storage – I would need something like [FileInput], which isn’t available; but the WebJobs SDK seems to give you custom ways to hook in functions and still get help with the Azure Blob Storage interaction. I’ve modified my function that uploads the file as below. I also call this Upload function from another function.

        // The WebJobs SDK binds "output" to a new blob named {name}
        // under the "freblogs" container.
        public static void Upload(string name, string path, // local file
                                  [BlobOutput("freblogs/{name}")] Stream output,
                                  bool deleteAfterUpload)
        {
            using (var fileStream = System.IO.File.OpenRead(path))
            {
                fileStream.CopyTo(output);
            }

            if (deleteAfterUpload)
            {
                File.Delete(path);
            }
        }

        // _host is a JobHost field, and deleteAfterUpload a bool field,
        // both initialized elsewhere in the class.
        public void UploadFileToBlob(string name, string path)
        {
            var method = typeof(AzureStorageHelper).GetMethod("Upload");
            _host.Call(method, new { name = name, path = path, deleteAfterUpload = deleteAfterUpload });
        }

Notice that in UploadFileToBlob(), which I’m calling from my FileSystemWatcher callback, I’m not writing a single line of code to upload the blob to Azure Blob Storage. I just call another function via _host (of type JobHost), and pass the parameters name, path, and the boolean; the WebJobs SDK automatically fills in the Stream output, which is created as a blob under the “freblogs” container with the same name that I pass in via “name”. You just need to configure a connection string named “AzureJobsData” for the application in its app.config file. Pretty awesome, isn’t it? This WebJobs SDK is in alpha, I’m told, so I’m really waiting to see the new features in the final version. The team has certainly set a high bar for themselves.
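For completeness, here is roughly how the host side is wired up in my sample – a minimal sketch assuming the alpha SDK’s JobHost type, with the FileSystemWatcher plumbing from my earlier post elided:

private static JobHost _host;

static void Main()
{
    // JobHost picks up the "AzureJobsData" (and "AzureJobsRuntime")
    // connection strings from app.config on its own.
    _host = new JobHost();

    // ... set up the FileSystemWatcher whose Created callback calls
    // UploadFileToBlob(name, path), as in the earlier post ...

    Console.ReadLine(); // keep the WebJob alive while watching for files
}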

If that isn’t enough, there is an awesome site extension for your Azure Websites that shows all these WebJobs operations against Azure Blob Storage. I was trying to understand how this works: the WebJobs SDK creates another container called “azure-jobs-invoke-log” in your Blob storage account and stores its logs inside, which are then fetched by the AzureJobs site extension and shown to you. Here is what my storage account shows: the “freblogs” container that holds all the FREB files, and the “azure-jobs-invoke-log” container that holds all the log messages of the WebJobs SDK.

[Screenshot: the storage account showing the ‘freblogs’ and ‘azure-jobs-invoke-log’ containers]

And, to enable the site extension, you first need to configure a connection string named “AzureJobsRuntime” with the same connection string to the storage account.
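If you haven’t seen one before, a storage connection string has this shape (the account name and key below are placeholders):

DefaultEndpointsProtocol=https;AccountName=youraccountname;AccountKey=youraccountkey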

[Screenshot: the ‘AzureJobsRuntime’ connection string configured in the portal]

After saving this connection string, go to https://<yoursitename>.scm.azurewebsites.net/AzureJobs. This page shows you details about the WebJobs configured against the Blob Storage account you have set up, in full detail. You can click on each invocation to see its details. Remember, in my case it is a custom function that I’m invoking, so it shows all the details about the parameters that were passed in, and the result of that call. It also has a link to the file that we just uploaded – you can download that file by clicking on the output hyperlink in the second screenshot below.

[Screenshot: the AzureJobs dashboard listing the function invocations]

[Screenshot: invocation details, with the parameters passed in and the output hyperlink]

Pretty awesome! Don’t wait. Add more power and background processing to your Azure Websites using the new WebJobs SDK.

Windows Azure Websites - WebJob to upload FREB logs to Azure Blob Storage

Currently in Windows Azure Web Sites, there is an option to store your website logs in Azure Blob Storage; however, the FREB logs – failed request tracing logs – can only be stored on the file system. You then grab them via your favorite FTP tool and do the analysis. One of my co-workers asked whether we can store the Failed Request Tracing logs in Azure Blob Storage, and Richard Marr gave this interesting idea of using WebJobs to move the files there. I tried this quickly, and have a beta version of such a WebJob ready for you if you want to try it. I might revisit the code sometime later when I find time, but feel free to use the code that I’ve posted in this GitHub repo.

If you notice, I’ve not used the WebJobs SDK as such; it’s a normal C# program that monitors the folders and uploads files created in them to Azure Blob Storage. It creates a container called “freblogs” if it doesn’t exist, and stores the files there. You can modify the code to cater to your needs.

Entire code of my small C# application:

using System;
using System.IO;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
 
namespace PushToStorage
{
    class AzureStorageHelper
    {
        CloudStorageAccount storageAccount;
        CloudBlobClient blobClient;
        CloudBlobContainer container;
        CloudBlockBlob blockBlob;
        bool deleteAfterUpload = false;
        public AzureStorageHelper()
        {
            // Retrieve storage account from connection string.
            storageAccount = CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
 
            // Create the blob client.
            blobClient = storageAccount.CreateCloudBlobClient();
 
            // Retrieve a reference to a container. 
            container = blobClient.GetContainerReference("freblogs");
 
            // Create the container if it doesn't already exist.
            container.CreateIfNotExists();
            string tmp = ConfigurationManager.AppSettings["DeleteAfterUpload"];
            if (tmp != null)
            {
                deleteAfterUpload = Int32.Parse(tmp) == 1;
            }
 
        }
 
        public void UploadFileToBlob(string name, string path)
        {
            try
            {
                Console.WriteLine("Starting uploading " + name);
                using (var fileStream = System.IO.File.OpenRead(path))
                {
                    blockBlob = container.GetBlockBlobReference(name);
                    blockBlob.UploadFromStream(fileStream);
                    Console.WriteLine(name + " successfully uploaded!");
                    if (deleteAfterUpload)
                    {
                        fileStream.Close();
                        File.Delete(path);
                        Console.WriteLine(path + " deleted!");
                    }
                }
            }
            catch (Exception ee)
            {
                Console.WriteLine(ee.Message);
            }
        }
 
        // Returns true once the file can be opened exclusively and has content,
        // i.e. FREB has finished writing it.
        public bool IsFileReady(String sFilename)
        {
            try
            {
                using (FileStream fileStream = File.Open(sFilename, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return fileStream.Length > 0;
                }
            }
            catch (Exception)
            {
                return false;
            }
        }
    }
    class Program
    {
        static AzureStorageHelper azStorageHelper;
        static void Main(string[] args)
        {
            Console.WriteLine("Initializing AzureStorageHelper!");
            azStorageHelper = new AzureStorageHelper();
 
            string path = ConfigurationManager.AppSettings["directory"];
 
            string[] directories = Directory.GetDirectories(path, "*W3SVC*");
 
            FileSystemWatcher[] fsw = new FileSystemWatcher[directories.Length];
            Console.WriteLine(path + " " + fsw.Length);
            for (int i = 0; i < directories.Length; i++)
            {
                fsw[i] = new FileSystemWatcher(directories[i]);
                fsw[i].NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite | NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.Size;
                fsw[i].Created += fsw_Created;
                fsw[i].IncludeSubdirectories = true;
                fsw[i].EnableRaisingEvents = true;
                Console.WriteLine(String.Format("{0} Started watching directory {1} for files!", DateTime.Now.ToString(), directories[i]));
            }
 
            Console.ReadLine();
            Console.WriteLine(DateTime.Now.ToString() + " Stopping!");
        }
 
        static void fsw_Created(object sender, FileSystemEventArgs e)
        {
            // Wait until FREB has finished writing the file before uploading it.
            while (!azStorageHelper.IsFileReady(e.FullPath))
                System.Threading.Thread.Sleep(1000);

            Console.WriteLine(" Created " + e.Name + " " + e.FullPath);
            azStorageHelper.UploadFileToBlob(e.Name, e.FullPath);
        }
    }
}

As you can see, the code reads the Azure Blob Storage connection string from the app.config, as well as two other settings – the folder to watch, and the option to delete the file after uploading. They are self-explanatory, so you can use this to move any files from your folders to Azure Blob Storage. You need to install the Windows Azure Storage NuGet package. Once you have finished the application as a standalone, create a compressed zip file consisting of your .exe, .exe.config, and the Windows Azure Storage reference DLLs (since these are not part of a WAWS instance by default), and follow the steps in this article to create a WebJob. You can choose to create a different type of job as per your need.
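For reference, here is the shape of the .exe.config the code above expects – the values are placeholders, so fill in your own storage account, and note that d:\home\LogFiles is where WAWS keeps the W3SVC* FREB folders:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <!-- Read by AzureStorageHelper's constructor -->
    <add name="StorageConnectionString"
         connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
  </connectionStrings>
  <appSettings>
    <!-- Folder whose W3SVC* subfolders are watched for new FREB files -->
    <add key="directory" value="d:\home\LogFiles" />
    <!-- 1 = delete the local file after a successful upload -->
    <add key="DeleteAfterUpload" value="1" />
  </appSettings>
</configuration>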

The articles below helped me write this small tool:

How to use the Windows Azure Blob Storage Service in .NET
http://www.windowsazure.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs-20/#upload-blob

Using the WebJobs feature of Windows Azure Web Sites

http://curah.microsoft.com/52143/using-the-webjobs-feature-of-windows-azure-web-sites

Hope this helps!

Did you know swapping between a staging environment and the production environment is a push of a button in Windows Azure Websites?

Yes, you are reading it right. It’s just a matter of one click to promote your staging website to be the production site. Gone are the hassles of publishing a new deployment to the live website – half-baked pages and assemblies, and needless to say much pain, downtime, and probability of errors.

Windows Azure Web Sites is here to ease a lot of the tasks you typically do with your websites. Now you can create a new staging environment for your existing website in a matter of seconds, start testing the new bits, and push a button to make them live.

[Screenshot: the Swap button for the staging and production slots in the portal]

Read more about this feature in this blog post from ScottGu.

Windows Azure Web Sites (WAWS) - Collecting dumps of the worker process (w3wp.exe) automatically whenever a request takes a long time

Websites being slow is perhaps the most common problem every website administrator and developer runs into. If they are extremely unlucky, they see this problem only in their production environment. Many troubleshooting techniques and best practices are available for this scenario; I will try to cover them in a different post as part of my ASP.NET troubleshooting series some other time. Meanwhile, you can look at this post of mine, where I have something that might help you.

For now, let’s focus on Windows Azure Web Sites. As you know, this is a closed (well, not completely) hosting environment, but there are still a few things you can do about this problem – for example, you can collect FREB traces for a long-running request, and see where it is stuck. FREB shows ASP.NET ETW events as well, but only the page lifecycle events. For example, it will tell you that the problem is in Page_Load, but not what inside Page_Load is slow. To find out more, you either have to profile your application, or collect a memory dump of the process serving the request, and see what the request is doing for such a long time.

I’ll walk through the steps to enable automatic collection of a memory dump whenever request processing exceeds ‘x’ seconds. This uses the same FREB customAction that I’ve detailed in this old post of mine. In WAWS, the customActionsEnabled attribute for the website is set to “true” by default, so you just have to put in the web.config below. In this example, I’m going to use Windows Sysinternals procdump.exe to take the dump of our process (w3wp.exe). Here are the steps:

Enable ‘Failed Request Tracing’ from the Portal

First, you need to turn on FREB from your management portal. This article has brief steps on how to view those logs from Visual Studio, and even how to configure this from there. From the portal, for your website, under the Configure tab -> site diagnostics, set the below to On.

[Screenshot: Failed Request Tracing set to On under Configure -> site diagnostics]

Transfer Procdump.exe to your deployment folder using FTP

Second, you need to put procdump.exe in your website deployment folder. Download it to your local machine from here. You can create a new folder and place it in there, and let that folder also be the path where the dumps will be stored. In my example, I’ve created a folder called ‘Diagnostics’ under the root, and placed procdump.exe in there. A screenshot of my FileZilla:

[Screenshot: FileZilla showing procdump.exe in the Diagnostics folder]

Configure the web.config with configuration to collect dump

Lastly, you need to place the configuration below in the web.config file so that procdump.exe is spawned with certain parameters whenever a request exceeds 15 seconds, in this case:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.webServer>
  <tracing>
  <traceFailedRequests>
    <remove path="*" />
    <add path="*" customActionExe="d:\home\Diagnostics\procdump.exe" customActionParams="-accepteula w3wp d:\home\Diagnostics\w3wp_PID_%1%_" customActionTriggerLimit="5">
     <traceAreas>
       <add provider="ASP" verbosity="Verbose" />
       <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
       <add provider="ISAPI Extension" verbosity="Verbose" />
       <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,
                                         Cache,RequestNotifications,Module,FastCGI"
                                  verbosity="Verbose" />
     </traceAreas>
     <failureDefinitions timeTaken="00:00:15" />
    </add>
  </traceFailedRequests>
  </tracing>
</system.webServer>
</configuration>

 

The above configuration will take a mini-dump of the w3wp.exe serving your WAWS site, and put it in the folder d:\home\Diagnostics, with the dump name containing its PID. If you want a full dump, you can add the -ma parameter, for example customActionParams="-accepteula -ma w3wp d:\home\Diagnostics\w3wp_PID_%1%_".

You can use any other additional switches that you typically use with ProcDump. For a slow-running page scenario, I might collect dumps at regular intervals – 3 dumps at 5-second intervals – so that we can check what the request is doing across those points in time. For that, set customActionParams to “-accepteula -s 5 -n 3 w3wp d:\home\Diagnostics\w3wp_PID_%1%_”.

Hope this helps!

Quick ways to edit your files hosted in Windows Azure Web Sites (WAWS), other than re-deploying

Editing a small piece of code, or a change in a configuration file, is perhaps the most common thing for a developer to do while testing a site, or even when the site is live with production traffic. With some hosting providers, you most often end up re-deploying the whole package, or transferring that file over FTP or another deployment method. In this blog, I’m going to cover a few other methods that will help you edit the files of your website hosted in Windows Azure Web Sites.

[Monaco] Visual Studio Online

If you have a very quick edit to source code or configuration, editing it over this shiny new Visual Studio Online link for WAWS is perhaps the easiest of all. I have already written about this feature some time back in this blog. Here are the quick steps:

  1. Enable the ‘Edit in Visual Studio Online’ option for the website. You can find this option in the Portal –> Websites –> Your Website –> Configure tab. Set it to ON, and click Save.
  2. Click on the ‘Edit in Visual Studio Online’ option in the Dashboard. Once you have done step 1 and saved successfully, you should see a new option listed in the Dashboard, ‘Edit in Visual Studio Online’. Click on it.
  3. Enter your deployment user credentials when asked.
  4. Choose the file you want to edit, and start editing your site live. No save button – it’s live as you edit.

Here is a screenshot of my test config file with Monaco:

[Screenshot: editing my test config file in Visual Studio Online (Monaco)]

Kudu Console of your website

If you have your site in WAWS, you should definitely check out the Kudu console for the site. Along with some cool stuff like an interactive console, it also offers you a way to edit files. Here are the quick steps to edit your web.config file using the Kudu console:

  1. Browse to https://<yoursite>.scm.azurewebsites.net, and enter your deployment credentials to access the Kudu console.
  2. You are greeted with a lot of options; for our task, choose the “Debug console” from the top menu.
  3. Now you are presented with a somewhat familiar screen showing a command prompt window, with a file explorer at the top.
  4. Navigate to the folder. In our case, web.config would be inside /site/wwwroot.
  5. You would see something like below:

     [Screenshot: the Kudu debug console showing the contents of /site/wwwroot]

  6. Click on the ‘Edit’ (pencil) icon for the web.config file. Now you will see something like below, where you can edit the content of the file, and hit the ‘Save’ button.

[Screenshot: the Kudu file editor with the Save button]

Hope this helps!