Sublime Text Setup Script

One thing that I find odd about the default Sublime Text install on Windows is that you don't get the option to add subl to the path. On OS X this happens automagically. I wrote a simple script that I use on every new Windows machine to set up Sublime Text the way I like. In addition to adding subl to the path, it also copies my Sublime settings. It's a pretty simple script, but I find it useful.

# Make link to subl and add to path
cmd /c mklink "C:\Program Files\Sublime Text 3\subl.exe" "C:\Program Files\Sublime Text 3\sublime_text.exe"
$wsh = New-Object -ComObject WScript.Shell
$path = $wsh.Environment("System").Item("Path")
$subl = ";C:\Program Files\Sublime Text 3\"
If ($path.Contains($subl)) {
    echo "Sublime Text already in PATH"
} Else {
    $path += $subl
    $wsh.Environment("System").Item("Path") = $path

    echo "Added subl.exe to path"
}

# Save Sublime settings
Copy-Item Preferences.sublime-settings "$Env:USERPROFILE\AppData\Roaming\Sublime Text 3\Packages\User"
Copy-Item XML.sublime-settings "$Env:USERPROFILE\AppData\Roaming\Sublime Text 3\Packages\User"
echo "Added Sublime settings"

Deploying Static Content to Azure Storage with Visual Studio Online Build

A common task for improving the performance of a web application is to serve your static content from a different domain than your app. This external domain could simply be another pointer to your web server, a CDN service, or, in the case here, Windows Azure Blob storage. Blob storage is a fast and super inexpensive place to host static content for your site. To that end, I wrote a very simple PowerShell script that deploys all of the .min.js files in my script folder to blob storage. This script can be used from a Visual Studio Online build definition in order to automate the deployment.

Note, this script assumes that your site (in this case MyWebSite) and the script itself are at the root of your source directory. If you have a different directory structure, you will need to adjust the paths.

Also, notice that I am referencing the Windows Azure PowerShell Cmdlets which are installed on the VSO build server.

$env:PSModulePath = $env:PSModulePath + ";C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell"
Import-Module Azure
$storageAccount = "<your storage account>"
$storageKey = "<your storage account key>"
$containerName = "scripts"
$dir = "MyWebSite\Scripts"
$dir = $Env:TF_BUILD_SOURCESDIRECTORY + "\" + $dir

$context = New-AzureStorageContext -StorageAccountName $storageAccount `
                                   -StorageAccountKey $storageKey

# Set the ContentType and optionally the CacheControl
$properties = @{
  "ContentType" = "application/javascript";
  "CacheControl" = "public, max-age=31536000";
}

$files = Get-ChildItem $dir -Force | Where-Object {$_.FullName.EndsWith(".min.js")}
foreach ($file in $files)
{
  $fqName = $dir + '\' + $file.Name
  Set-AzureStorageBlobContent -Blob $file.Name `
                              -Container $containerName `
                              -File $fqName `
                              -Context $context `
                              -Properties $properties
}

In order to use this script for continuous integration, simply add it to your source repo and edit your Build Definition to use the script as shown below.

Build Definition

That should be it. Now when your build runs your scripts will be deployed to blob storage.

Client IP Address Filtering with Windows Azure Web Sites

UPDATE: This feature is now officially supported using the <ipSecurity> configuration in web.config. More details on the Windows Azure blog.

I was recently working on a prototype for an internal project that I wanted to publish to Windows Azure Web Sites. Because the project is for Microsoft use only, I needed to either set up authentication or filter by IP address. Authentication wasn't really what I wanted because I wanted the prototype to be easily shared within the company. Security wasn't super critical, but it needed to be good enough. My initial thought was to just use the <ipSecurity> configuration in web.config. However, I realized that this module isn't actually installed on Windows Azure Web Sites and that even if it were, it wouldn't work for my needs because sites in WAWS sit behind ARR servers.

The solution I ended up using was to create a simple MVC Action Filter that blocks requests from outside of a specific network range. Unfortunately, some of the utilities for calculating subnets are not part of .NET. However, I was able to find a nice set of extensions that gave me what I needed. Combining these extensions with a simple action filter, I was able to accomplish my task.
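To illustrate what those subnet utilities are doing, here is a minimal sketch of an IPv4 CIDR check. This is illustrative JavaScript with names of my own choosing, not the code from the Gist:

```javascript
// Convert a dotted-quad IPv4 address to an unsigned 32-bit integer.
function ipToInt(ip) {
  return ip.split('.').reduce(function (acc, octet) {
    return (acc << 8) + parseInt(octet, 10);
  }, 0) >>> 0;
}

// Return true if ip falls inside the CIDR range, e.g. "10.0.0.0/8".
function isInSubnet(ip, cidr) {
  var parts = cidr.split('/');
  var network = ipToInt(parts[0]);
  var bits = parseInt(parts[1], 10);
  var mask = bits === 0 ? 0 : (0xFFFFFFFF << (32 - bits)) >>> 0;
  return (ipToInt(ip) & mask) === (network & mask);
}

console.log(isInSubnet('10.1.2.3', '10.0.0.0/8'));    // true
console.log(isInSubnet('192.168.1.5', '10.0.0.0/8')); // false
```

The extensions I used do essentially this (plus validation and IPv6 handling) via .NET's IPAddress type.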

You can find the full source to this action filter in this Gist. It contains both the IPFilterAttribute class as well as the IPAddressExtension.cs file.

Using the filter is easy. Simply add the filter to your controller and specify the IP address or network you want to allow.

[IPFilter("<your ip address or network>")]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

Optionally, you can choose to block localhost access from your site. Localhost is allowed by default.

[IPFilter("", AllowLocalhost = false)]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

There are a couple of things worth noting about how this filter actually works. First, since Windows Azure Web Sites sits behind an ARR server, the IP address the request comes from is not the IP address of the end user. The end user's IP address is actually stored in the "X-Forwarded-For" header. You can see how we get the IP address below.

protected override bool AuthorizeCore(HttpContextBase httpContext)
{
    var clientIP = httpContext.Request.Headers["X-Forwarded-For"];
    var isAuthorized = IsAuthorizedIPAddress(this.AuthorizedIPAddress, clientIP);

    return isAuthorized ||
          (this.AllowLocalhost && (httpContext.Request.Url.Host == "localhost"));
}
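One caveat with X-Forwarded-For: once a request has passed through multiple proxies, the header can hold a comma-separated chain of addresses, sometimes with a port appended. As a sketch, assuming you only want the original client's address, parsing might look like this (illustrative JavaScript, not part of the filter above):

```javascript
// Pull a usable client IP out of an X-Forwarded-For header value.
// The original client is the first entry in the chain.
function clientIpFromHeader(xForwardedFor) {
  if (!xForwardedFor) {
    return null;
  }
  // Take the first hop and strip an optional ":port" suffix (IPv4 only).
  var first = xForwardedFor.split(',')[0].trim();
  var portIndex = first.lastIndexOf(':');
  if (portIndex > -1 && first.indexOf('.') > -1) {
    first = first.substring(0, portIndex);
  }
  return first;
}

console.log(clientIpFromHeader('203.0.113.7:49152, 10.0.0.1'));
// -> 203.0.113.7
```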

Second, this is just a sample I wrote quickly. If you are using this for something that requires real security or solid error handling, please look over the code first.

Windows Azure SDK for Node.js and CLI updated for Node.js 0.10.x

Good news! The Windows Azure Cross Platform CLI and the Windows Azure SDK for Node.js have been updated to support Node.js version 0.10.3 and above. You can now update your installation of Node and the Windows Azure tools to the latest and greatest.

Note: Make sure you are using 0.10.3 or greater. Guang Yang, a PM on the team that writes the SDK, pointed out that there is a bug in earlier versions that can cause issues in the SDK.
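If you want to double-check the version you are running, a quick sketch of a comparison against process.version (the atLeast helper here is mine, not part of the SDK):

```javascript
// Compare a node version string like "v0.10.3" against a minimum
// "major.minor.patch" version.
function atLeast(version, minimum) {
  var a = version.replace(/^v/, '').split('.').map(Number);
  var b = minimum.split('.').map(Number);
  for (var i = 0; i < 3; i++) {
    if (a[i] > b[i]) return true;
    if (a[i] < b[i]) return false;
  }
  return true; // versions are equal
}

console.log(atLeast(process.version, '0.10.3'));
```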

To upgrade Node.js, head over to the Node.js website and click the install button. In order to upgrade the Windows Azure CLI tools, simply use npm as shown below.

npm install azure-cli -g

Note: On OS X you will need to run this command with sudo to install globally.

To install the latest version of the Windows Azure SDK for Node.js in your project you can use npm as well.

npm install azure

An alternative option is to use our one-click installers to install the SDK and tools together. You can find the Node.js installer here.

Using Node.js with Jekyll to Optimize Static Content

I recently migrated my blog to Jekyll hosted on Github Pages. I did this primarily because I was sick of dealing with the web editor in Wordpress. Wordpress is a great blogging platform for many people, but I just wanted something simpler and more flexible. I have written about how to configure Jekyll on Windows in the past. This post is about optimizing the content of a Jekyll page using Node.js.

The reason I am using Node.js is primarily the cross-platform issues with many Ruby gems. I played around with Jekyll Asset Pipeline and it seemed like a great tool for combining and minifying static content, but it seems to have issues running on Windows. I use many different computers, including several PCs (I love my Surface Pro) and a Macbook Air, so cross-platform support is critical. I blog on both devices, so my blog platform must work on each OS. Node.js is an obvious choice as it runs flawlessly on both platforms.

The second reason I chose to use Node.js rather than Ruby gems is because I am hosting this site on Github Pages to use their automatic Jekyll deployment. Unfortunately, Github Pages doesn't support Jekyll plugins so anything that does this level of customization must be done on my local machine anyway. This means it didn't really matter which tools I used.

Now that you know why I am doing this, let's talk about how. First, I decided to use node-minify to handle the actual combining and minifying of files. This node module works well, it is simple, and it works with several minification libraries like UglifyJS, YUI Compressor, and Google Closure Compiler. Since I avoid Java like the plague, I am just using UglifyJS. The optimization isn't quite as good as the other two, but it is enough for me.

Below you can see how node-minify can be used to combine and minify js and css files.

// Using UglifyJS for JS
new compressor.minify({
  type: 'uglifyjs',
  fileIn: ['assets/js/prettify.js', 'assets/js/app.js'],
  fileOut: 'assets/' + config.version + '.js',
  callback: function(err){
    if (err) {
      console.log(err);
    }
  }
});

// Using Sqwish for CSS
new compressor.minify({
  type: 'sqwish',
  fileIn: ['assets/css/bootstrap.min.css', 'assets/css/style.css'],
  fileOut: 'assets/' + config.version + '.css',
  callback: function(err){
    if (err) {
      console.log(err);
    }
  }
});
In addition to node-minify I also wanted something that would help me automate the various tasks of building the site. For this I chose Jake. Jake is a great tool that I have used on many other projects for creating build scripts.

I set up my jakefile.js with a few simple tasks.

task('default', ['minify', 'build']);

task('minify', function() {
  // Clean up old files
  var files = fs.readdirSync('./assets/');
  for (var i = files.length - 1; i >= 0; i--) {
    var file = files[i];
    if (path.extname(file) == '.js' || path.extname(file) == '.css') {
      fs.unlinkSync('./assets/' + file);
    }
  }

  // Combine and minify code here
});

task('build', function() {
  setConfigValue('version', config.version);
});

One thing you will notice is that my static content is being created with a UUID version number. This way, each time I publish my site I get a new version. Eventually, I am going to change this code to only perform the minification if the CSS or JS files change. The issue with using a random name is that I need to somehow tell Jekyll which files to use. For this, I set a configuration value in the _config.yml file that Jekyll uses. You can see that file below. The version number gets updated every time I build my static content.

markdown: rdiscount
pygments: false
permalink: /:year/:month/:day/:title/
paginate: 5
version: 63301a88-ec4f-4f35-b9a7-7533810aec30
name: Nathan Totten's Blog
description: Thoughts and Experiences with Software Development.

Finally, I need to reference this version number in my Jekyll layouts. This is easy and can be seen below.

<!DOCTYPE html>
<html lang="en">
    <meta http-equiv="Content-type" content="text/html; charset=utf-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <link href='' 
      rel='stylesheet' type='text/css'>
    <link href=',700' 
      rel='stylesheet' type='text/css'>
    <link href='/assets/{{ site.version }}.css' 
      rel='stylesheet' type='text/css'>

Now when I publish the site to Github Pages, Jekyll will use the version number to set up the static content path and my site will be served with the smaller, optimized JS and CSS files. You can see the full jakefile.js in my website's Github repository.

Nathan Totten

Nathan Totten is a Customer Success Engineer at Auth0. Previously, he was a Senior Program Manager on the Microsoft Azure team.