Using Jenkins to Detect Broken Links on Your Site

One of the most basic aspects of building a great website and keeping your customers happy is ensuring that your links work. I know this seems obvious, but you would be surprised at how many product and documentation sites don't bother with this, leaving their content littered with links to 404s.

At Auth0 we set up a simple Jenkins job that scans our entire site for broken and malformed links once a day using the very handy LinkChecker tool by Bastian Kleineidam.

The first thing you will need to do is install LinkChecker on your Jenkins machine. The easiest way to do this is to install it globally as shown below. You will need SSH access to the Jenkins machine for this step, and the python-dev package must be installed first. (Note: this step assumes Jenkins is running on Linux; LinkChecker also works on Windows, but the installation steps differ slightly.)

sudo apt-get install python-dev
sudo pip install LinkChecker
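
Once the install finishes, you can verify that the tool is available on the PATH before wiring it into Jenkins:

linkchecker --version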

To set up your build in Jenkins, simply create a "New Item" and select "Freestyle Project". Name the project whatever you like.

Configure the project with a few basic settings. Under build triggers, check "Build Periodically" and set a cron schedule like H 0 * * *, which runs the job once a day during the midnight hour (the H tells Jenkins to pick a hashed minute so that scheduled jobs don't all fire at the same instant).

Jenkins Build Triggers

Next, add an "Execute Shell" build step with the command to run LinkChecker.

linkchecker https://example.com

See the LinkChecker documentation for additional CLI configuration options.
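
As a sketch of what that configuration can look like, the invocation below also checks external links, ignores mailto: links, silences warnings, and bumps the thread count; these flags exist in current releases, but check linkchecker --help for your installed version:

linkchecker --check-extern --ignore-url=^mailto: --no-warnings -t 20 https://example.com

LinkChecker exits with a non-zero status when it finds errors, which is what marks the Jenkins build as failed.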

Set your desired notification preferences, such as email or Slack alerts, for when the build fails.

Save your configuration and run your project. The console output will show any broken links on your site, like this:

Jenkins Console Output

Now you are set to provide a more usable site for your readers or customers.

Sublime Text Setup Script

One thing that I find odd about the default Sublime Text install on Windows is that you don't get the option to add subl to the PATH. On OS X this happens automagically. I wrote a simple script that I use on every new Windows machine to set up Sublime the way I like. In addition to adding subl to the PATH, it also copies my Sublime settings. It's a pretty simple script, but I find it useful.

# Make a symlink to subl.exe and add Sublime Text to the system PATH
cmd /c mklink "C:\Program Files\Sublime Text 3\subl.exe" "C:\Program Files\Sublime Text 3\sublime_text.exe"
$wsh = new-object -com wscript.shell
$path = $wsh.Environment("System").Item("Path")
$subl = ";C:\Program Files\Sublime Text 3\"
If ($path.Contains($subl)) {
    echo "Sublime Text already in PATH"
} Else {
    $path += $subl
    $wsh.Environment("System").Item("Path") = $path

    echo "Added subl.exe to PATH"
}

# Save Sublime Settings
Copy-Item Preferences.sublime-settings "$Env:USERPROFILE\AppData\Roaming\Sublime Text 3\Packages\User"
Copy-Item XML.sublime-settings "$Env:USERPROFILE\AppData\Roaming\Sublime Text 3\Packages\User"
echo "Add Sublime Settings";

Deploying Static Content to Azure Storage with Visual Studio Online Build

A common task for improving the performance of a web application is to serve your static content from a different domain than your app. This external domain could simply be another pointer to your web server, a CDN service, or, as in this case, Windows Azure Blob storage. Blob storage is a fast and super inexpensive place to host static content for your site. To that end, I wrote a very simple PowerShell script that deploys all of the .min.js files in my script folder to blob storage. This script can be used from a Visual Studio Online build definition in order to automate the deployment.

Note that this script assumes your site (in this case MyWebSite) and the script itself are at the root of your source directory. If you have a different directory structure, you will need to adjust the paths.

Also, notice that I am referencing the Windows Azure PowerShell Cmdlets, which are installed on the VSO build server.

$env:PSModulePath = $env:PSModulePath + ";C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell"
Import-Module Azure
$storageAccount = "<your storage account>"
$storageKey = "<your storage account key>"
$containerName = "scripts"
$dir = "MyWebSite\Scripts"
if ($Env:TF_BUILD_SOURCESDIRECTORY)
{
    # The build variable has no trailing slash, so use Join-Path
    $dir = Join-Path $Env:TF_BUILD_SOURCESDIRECTORY $dir
}

$context = New-AzureStorageContext -StorageAccountName $storageAccount `
                                   -StorageAccountKey $storageKey

# Set the ContentType and optionally the CacheControl
$properties = @{ 
  "ContentType" = "application/javascript"; 
  "CacheControl" = "public, max-age=31536000";
} 

$files = Get-ChildItem $dir -force | Where-Object {$_.FullName.EndsWith(".min.js")}
foreach ($file in $files)
{
  $fqName = $dir + '\' + $file.Name
  Set-AzureStorageBlobContent -Blob $file.Name `
                              -Container $containerName `
                              -File $fqName `
                              -Context $context `
                              -Properties $properties `
                              -Force
}
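
If you want to test the script locally before adding it to a build, it also runs from a plain PowerShell session on a machine with the Azure module installed. Since TF_BUILD_SOURCESDIRECTORY isn't set locally, the relative MyWebSite\Scripts path is used, so run it from the root of your repo; assuming you saved it as deploy-scripts.ps1 (the name is up to you):

.\deploy-scripts.ps1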

In order to use this script for continuous integration, simply add it to your source repo and edit your Build Definition to use the script as shown below.

Build Definition

That should be it. Now when your build runs, your scripts will be deployed to blob storage.

Client IP Address Filtering with Windows Azure Web Sites

UPDATE: This feature is now officially supported using the <ipSecurity> configuration in web.config. More details on the Windows Azure blog.
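
For reference, a minimal sketch of that web.config approach looks something like the following; the address and subnet mask here are placeholders for your own network:

<system.webServer>
  <security>
    <ipSecurity allowUnlisted="false">
      <add allowed="true" ipAddress="123.45.0.0" subnetMask="255.255.0.0" />
    </ipSecurity>
  </security>
</system.webServer>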

I was recently working on a prototype for an internal project that I wanted to publish to Windows Azure Web Sites. Because the project is for Microsoft use only, I needed to either set up authentication or filter by IP address. Authentication wasn't really what I wanted because I wanted the prototype to be easily shared within the company. Security wasn't super critical, but it needed to be good enough. My initial thought was to just use the <ipSecurity> configuration in web.config. However, I realized that this module isn't actually installed on Windows Azure Web Sites, and that even if it were, it wouldn't work for my needs because sites in WAWS sit behind ARR servers.

The solution I ended up using was to create a simple MVC action filter that blocks requests from outside a specific network range. Unfortunately, some of the utilities for calculating subnets are not part of .NET. However, I was able to find a nice set of extensions that gave me what I needed, and combining those extensions with a simple action filter did the job.

You can find the full source to this action filter in this Gist. It contains both the IPFilterAttribute class as well as the IPAddressExtension.cs file.

Using the filter is easy. Simply add the filter to your controller and specify the IP address or network you want to allow.

[IPFilter("123.45.0.0/16")]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

Optionally, you can block localhost access to your site; localhost is allowed by default.

[IPFilter("123.45.0.0/16", AllowLocalhost = false)]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

There are a couple of things worth noting about how this filter actually works. First, since Windows Azure Web Sites sits behind an ARR server, the address the request comes from is not the end user's IP address. The end user's IP address is actually stored in the X-Forwarded-For header. You can see how we get the IP address below.

protected override bool AuthorizeCore(HttpContextBase httpContext)
{
    var clientIP = httpContext.Request.Headers["X-Forwarded-For"];
    var isAuthorized = IsAuthorizedIPAddress(this.AuthorizedIPAddress, clientIP);

    return isAuthorized || 
          (this.AllowLocalhost && (httpContext.Request.Url.Host == "localhost"));
}

Second, this is just a sample I wrote quickly. If you are using this for something that requires real security or solid error handling, please look over the code first.

Windows Azure SDK for Node.js and CLI updated for Node.js 0.10.x

Good news! The Windows Azure Cross Platform CLI and the Windows Azure SDK for Node.js have been updated to support Node.js version 0.10.3 and above. You can now update your installation of Node and the Windows Azure tools to the latest and greatest.

Note: Make sure you are using 0.10.3 or greater. Guang Yang, a PM on the team that writes the SDK, pointed out that there is a bug in earlier versions that can cause issues in the SDK.

To upgrade Node.js, head over to nodejs.org and click the install button. To upgrade the Windows Azure CLI tools, simply use npm as shown below.

npm install azure-cli -g

Note: On OS X you will need to run this command with sudo to install globally.
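
On OS X that looks like:

sudo npm install azure-cli -g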

To install the latest version of the Windows Azure SDK for Node.js in your project you can use npm as well.

npm install azure

An alternative option is to use our one-click installers to install the SDK and tools together. You can find the Node.js installer here.

Nathan Totten

Nathan Totten is a Customer Success Engineer at Auth0. Previously, he was a Senior Program Manager on the Microsoft Azure team.