Unit Testing Azure Storage in TFS/Build Server

I’ve been working on a project that saves and reads blobs from Azure. I created a unit test for this functionality. Instead of making calls to Azure proper, I am running the Azure Storage Emulator so that I don’t incur any cost. “It works on my machine!” …but when I check my code into TFS and do a build on the server, the unit test fails because the Storage Emulator is not running. If I remote desktop into the server and start the Storage Emulator then the unit test passes. Starting the emulator manually is a problem because if the server reboots then I have to remember to start that up again and I won’t know about it until the test fails. Other developers might not know to do this either.

To combat this problem I tried starting the emulator with a scheduled task that runs when the server starts. This did not work, and I'm not sure why. Task Scheduler says the task is running, but I don't see the emulator in Task Manager and my unit test fails. I can only assume the Task Scheduler status is wrong and something it doesn't know about is failing. It would be nice if Microsoft shipped the emulator as a Windows Service instead.

I Googled for a solution and came up empty except for one article that mentioned starting the emulator from within code. To detect whether the emulator is running, you try to access a storage container; if you get a StorageException, the emulator isn't running and you know to start it. This seems like a hacky solution, but I tried it and it works. Here is what I ended up with:
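A minimal sketch of that method, assuming a StartStorageEmulator app setting and the WAStorageEmulator.exe install path (adjust both for your own environment; newer SDK versions ship AzureStorageEmulator.exe instead):

using System.Configuration;
using System.Diagnostics;
using Microsoft.WindowsAzure.Storage;

public static class StorageEmulatorHelper
{
    public static void EnsureEmulatorIsRunning()
    {
        // Config switch so the hack can be turned off (e.g. in production).
        if (!bool.Parse(ConfigurationManager.AppSettings["StartStorageEmulator"]))
            return;

        var client = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudBlobClient();

        try
        {
            // Any cheap storage call will do; it throws a StorageException
            // when the emulator isn't running.
            client.GetContainerReference("ping").Exists();
        }
        catch (StorageException)
        {
            // The path is SDK-version specific -- verify it on your build server.
            var startInfo = new ProcessStartInfo
            {
                FileName = @"C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator\WAStorageEmulator.exe",
                Arguments = "start",
                UseShellExecute = false
            };

            using (var process = Process.Start(startInfo))
            {
                process.WaitForExit();
            }
        }
    }
}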

The first line of that method checks a config setting that lets me turn the hack on and off. You won't need it in production because the real Azure Storage service is always available.

One thing to note is that your build controller must run under a user account and not a built-in account, because the storage emulator stores its settings in the user's profile directory. I created a local administrator account on my build server to run the build controller as.

Google Fonts Installer

I am a big fan of Google Fonts. There are some really nice fonts on there. While they are easy to use on the web, there is no easy way to install them on your computer for use in Word or whatever app you're using locally.

I take that back – there is an “easy” way. It’s called SkyFonts. It’s a third party app (not Google) that will install all of the Google Fonts for you… but there are some drawbacks. Their motivation is money. They want you to use their service to discover OTHER fonts and PAY for them. Their app also runs on your computer 24/7. Isn’t that silly? Just install the fonts and move along. So if you’re like me you’ll say NO THANKS to SkyFonts.

So without SkyFonts there is no other option to easily install all of the Google Fonts locally… Until now! I’ve written a small app to download all the fonts and install them. Half of the credit goes to this guy on github called w0ng. My app automates the download of his zip file and does the installing.
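Installing a font on Windows boils down to copying the file into the Fonts folder and registering it so it survives a reboot. A rough sketch of the download-and-install flow (not the app's actual code; the zip URL is a placeholder, and writing to the Fonts folder and HKLM means it has to run as administrator):

using System;
using System.IO;
using System.IO.Compression;           // reference System.IO.Compression.FileSystem
using System.Net;
using System.Runtime.InteropServices;
using Microsoft.Win32;

class FontInstaller
{
    [DllImport("gdi32.dll")]
    static extern int AddFontResource(string lpFilename);

    // Placeholder -- point this at the actual zip of the font repository.
    const string ZipUrl = "https://example.com/google-fonts.zip";

    static void Main()
    {
        string zipPath = Path.Combine(Path.GetTempPath(), "fonts.zip");
        string extractDir = Path.Combine(Path.GetTempPath(), "fonts");

        using (var client = new WebClient())
            client.DownloadFile(ZipUrl, zipPath);

        if (Directory.Exists(extractDir)) Directory.Delete(extractDir, true);
        ZipFile.ExtractToDirectory(zipPath, extractDir);

        string fontsDir = Environment.GetFolderPath(Environment.SpecialFolder.Fonts);
        using (var fontsKey = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts", writable: true))
        {
            foreach (var ttf in Directory.GetFiles(extractDir, "*.ttf", SearchOption.AllDirectories))
            {
                string dest = Path.Combine(fontsDir, Path.GetFileName(ttf));
                File.Copy(ttf, dest, overwrite: true);
                AddFontResource(dest);                      // usable in the current session
                fontsKey.SetValue(
                    Path.GetFileNameWithoutExtension(ttf) + " (TrueType)",
                    Path.GetFileName(ttf));                 // registered for future sessions
            }
        }
    }
}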

.NET Framework 4.5 is required to run the app (comes with most Windows PCs)

DOWNLOAD: google_font_installer_1.0.zip

Error TF54000 when creating a new team project in TFS 2013

I set up a brand new TFS 2013 server and was in the process of creating my first team project when I received this error:

TF54000: Cannot update data because the server clock may have been set incorrectly. Contact your Team Foundation Server Administrator.

I had changed the time zone on the server after installing TFS, so I figured that was the cause. In the TFS database there is a table called tbl_Changeset with a CreationDate column that records when each check-in happened. The TFS installation creates the first changeset automatically, and the date on that first record was now 3 hours ahead of the current time. I couldn't create a team project because changeset 2 would have appeared to occur before changeset 1; the changeset IDs and their CreationDate values need to be in the same chronological order for things to work.

To fix this I ran an update statement to set the first record back a day (it could have been 3 hours but 1 day works just as well). This is the query I used:

UPDATE tbl_Changeset SET CreationDate=CreationDate-1 WHERE ChangeSetId=1

I went back to Visual Studio and was then able to create my team project without issues.

Online markdown editor encrypting files to the cloud

I use text editors a lot. I use them for taking notes, writing code, writing long emails before I send them out, and much more. For a long time I’ve used Notepad++ because it can handle all kinds of text-based files. Quite often the files I create in Notepad++ I also encrypt. I use a plugin called SecurePad for this.

Lately I have been using MarkdownPad to replace a lot of what I do in Notepad++. Markdown (the syntax) is great because it gives you a simple set of conventions for how your text should be formatted. It's used all over the place: Reddit, GitHub, and Stack Overflow to name a few. Because of those conventions, Markdown can easily be transformed into HTML and rendered in an eye-appealing fashion. MarkdownPad splits its window into two panes: the left pane is a text editor and the right pane is the HTML-formatted view. MarkdownPad is very basic, as is Markdown in general, because it was intended to be simple and used only for basic formatting of text. MarkdownPad has no way of encrypting files.

So I got to thinking… Why is MarkdownPad a Windows application? It’s so basic that it could easily be browser based. It would also be nice if it encrypted files. In today’s landscape you can never be too careful when it comes to protecting your personal data, thoughts, and daily habits. I also thought, if you could make a browser based markdown editor where would you open and save files from? …the cloud!

After doing a quick search on Google I couldn’t find anything that remotely resembled the online tool I wanted, so I made it myself.

It’s called Secure Markdown: https://securemarkdown.azurewebsites.net

It’s a work in progress but the basics are there:

  • Text editor (left pane)
  • Markdown syntax is shown in HTML format as you type (right pane)
  • Files are read and written to Dropbox
  • Files are encrypted (in the browser) before being saved to Dropbox

In the future I would like to add:

  • Buttons for markdown-formatting text, like MarkdownPad.
  • Option for saving unencrypted.
  • Overwriting files (currently, saving a file with the same name as an existing file in Dropbox appends a number to the file name)
  • Separate save / save as options.
  • Option for viewing HTML source and downloading the HTML version.
  • Option for saving HTML view as a PDF
  • Move to own domain name if warranted

If you have any suggestions please leave a comment.

Improving WebAPI performance with Jil and Protocol Buffers

One of the reasons I don’t use WebAPI for internal services is the performance hit incurred when serializing and deserializing JSON and XML messages. Even for public APIs this is somewhat of a pain. After digging around I’ve found ways to improve the performance of WebAPI serialization.

In .NET there are lots of serializers, the most popular being Json.NET, which Microsoft chose as the default JSON serializer for WebAPI. Json.NET is great because it is fast, robust, and configurable. There are serializers that are faster than Json.NET, though. Take a look at these benchmarks:

[Serialization and deserialization benchmark charts. Source: theburningmonk.com]

ServiceStack.Text, Jil, and Protobuf-Net are all faster than Json.NET. Protobuf-Net is not a JSON serializer, but I’ll get to that in a minute. ServiceStack.Text slightly edges out Json.NET in terms of speed and, from what I can tell, is just as robust. I frown upon using it, however, because it’s not entirely free: there is a free version, but it will only deserialize 10 object types and serialize 20. I will never use it because of this. Json.NET is free, and the performance gain from ServiceStack isn’t enough to make me pay.

There is one more option we need to look at though: Jil. Jil is optimized for speed over memory. The author of Jil goes into detail on his blog about the optimizations he’s made to get the best performance; it’s interesting reading if you have time. Jil is also used by the Stack Exchange API, which sees plenty of real-world use. I almost forgot to mention: it’s free!

Protocol Buffers, or Protobuf for short, is Google’s own serialization format, which they use heavily for their internal services. Unlike the others, it is a binary serializer rather than a JSON one. I was interested to see the payload size of binary serialization versus JSON, so I did a quick test: Json.NET produced a payload of 728 bytes while the same object serialized by Protobuf was 444 bytes (39% smaller). The act of serialization is also faster, as you can see in the benchmarks above. The combination of the two means you can receive and send WebAPI messages faster than with the default serializer. But how can we use this in WebAPI?
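The object I used for that test isn’t anything special, and a quick size check along these lines shows the same kind of gap (the Order type below is made up for illustration, so your exact numbers will differ):

using System;
using System.IO;
using System.Text;
using Newtonsoft.Json;
using ProtoBuf;

[ProtoContract]
public class Order
{
    [ProtoMember(1)] public int Id { get; set; }
    [ProtoMember(2)] public string Customer { get; set; }
    [ProtoMember(3)] public DateTime PlacedOn { get; set; }
    [ProtoMember(4)] public decimal Total { get; set; }
}

class PayloadSizeTest
{
    static void Main()
    {
        var order = new Order { Id = 1, Customer = "Contoso", PlacedOn = DateTime.UtcNow, Total = 42.50m };

        // JSON payload size in UTF-8 bytes.
        int jsonBytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(order)).Length;

        // Protobuf payload size.
        int protoBytes;
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, order);
            protoBytes = (int)ms.Length;
        }

        Console.WriteLine("Json.NET: {0} bytes, protobuf-net: {1} bytes", jsonBytes, protoBytes);
    }
}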

The great thing about WebAPI is that swapping out serializers and adding support for new ones is a breeze. In my projects I am now swapping out Json.NET for Jil and adding support for protobuf. WebAPI will return the correct result to the caller based on the content-type they wish to receive. Here are some articles on how you can do this too!
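As a rough illustration of the Jil half, a bare-bones MediaTypeFormatter looks something like this (a sketch only; error handling, encoding negotiation, and the protobuf formatter are left out):

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Formatting;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Jil;

public class JilFormatter : MediaTypeFormatter
{
    public JilFormatter()
    {
        SupportedMediaTypes.Add(new MediaTypeHeaderValue("application/json"));
        SupportedEncodings.Add(new UTF8Encoding(encoderShouldEmitUTF8Identifier: false));
    }

    public override bool CanReadType(Type type) { return true; }
    public override bool CanWriteType(Type type) { return true; }

    public override Task<object> ReadFromStreamAsync(Type type, Stream readStream,
        HttpContent content, IFormatterLogger formatterLogger)
    {
        // Deserialize the request body with Jil instead of Json.NET.
        var reader = new StreamReader(readStream);
        return Task.FromResult(JSON.Deserialize(reader, type, Options.ISO8601));
    }

    public override Task WriteToStreamAsync(Type type, object value, Stream writeStream,
        HttpContent content, TransportContext transportContext)
    {
        // Serialize the response with Jil; SerializeDynamic handles the runtime
        // type since the value arrives here typed as object.
        var writer = new StreamWriter(writeStream);
        JSON.SerializeDynamic(value, writer, Options.ISO8601);
        writer.Flush();
        return Task.FromResult(0);
    }
}

Swapping it in takes only a couple of lines in WebApiConfig.Register:

using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Replace the default Json.NET formatter with the Jil-based one; a
        // protobuf formatter can be added alongside it the same way.
        config.Formatters.Remove(config.Formatters.JsonFormatter);
        config.Formatters.Add(new JilFormatter());
    }
}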

Using a custom default font color in Lync 2010

I use Lync 2010 at work and wanted to use orange as my default font color. The problem is that it’s not available through the settings dialog box. These are the options it gives me…

[Screenshot: the Lync 2010 font color options dialog]

No orange!?!

I figured the settings must be stored in the Windows Registry, so I opened the Registry Editor and navigated to the key [HKEY_CURRENT_USER\Software\Microsoft\Communicator]. There is a binary value called “IM CharFormat” that stores the default font information, and you can modify it manually. If you look closely, the color is just an HTML hex color code embedded within the value. Here it is…

[Screenshot: the IM CharFormat binary value, with the hex color code visible]

All you need to do is get the HEX value for your custom color and update this part of the value with it. #F26522 is the orange color I want. You will have to completely close down Lync and then restart it after you save the registry value. Now open up the default font dialog in Lync again. You will see that your custom color is now being used.

[Screenshot: the Lync 2010 color dialog now showing the custom orange]

If you choose one of the other colors then the “Custom Color” option will go away and you’ll have to edit the registry again. If you edit the font type, size, or other options and keep the custom color selected it will stay.

The color picker in Lync 2013 is different and I believe you can select a custom color with it but you can still do this registry hack if you wish. It will also work with earlier versions of Lync/Communicator.
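If you would rather inspect the value from code than squint at the binary editor, a small throwaway program like this dumps it in a few renderings so the hex color code is easy to spot (read-only; the actual edit is still easiest in regedit):

using System;
using System.Text;
using Microsoft.Win32;

class DumpImCharFormat
{
    static void Main()
    {
        using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Microsoft\Communicator"))
        {
            var bytes = (byte[])key.GetValue("IM CharFormat");

            // Raw bytes plus two text renderings; one of them should show the
            // hex color code (e.g. F26522) embedded in the value.
            Console.WriteLine(BitConverter.ToString(bytes));
            Console.WriteLine(Encoding.Unicode.GetString(bytes));
            Console.WriteLine(Encoding.ASCII.GetString(bytes));
        }
    }
}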

Presenting RulePlex at 1Million Cups

Today I presented my startup company, RulePlex, at 1Million Cups. Here is a video of the presentation.

I’ve gotten a few comments that it was too technical and that I should give a demo of the product while presenting. This was my first presentation of the service to a large audience, and I think those suggestions are great. I’ll work them in next time.

Expiring links after a given timeframe

Here is one way to expire a link or web page after a certain amount of time has passed. Instead of tracking whether a link is valid through a database look-up, you verify that the expiration date in the URL generates the same token that is also passed in through the URL. If the user tries to change the date value, the token no longer matches, and users cannot generate a valid token without the secret key you keep on your server. Here is how it’s done.

First we need to create a model for our expiration date and token:

public class ExpiresModel
{
    public DateTime ExpiresOn { get; set; }
    public string Token { get; set; }
}

Next we need a utility for generating and checking tokens:

public class TokenHelper
{
    private const string HashKey = "secret";

    public static string GetToken(ExpiresModel model)
    {
        if(model == null)
            throw new ArgumentNullException("model");

        return Crypto.HashSHA512(String.Format("{0}_{1}", HashKey, model.ExpiresOn.Ticks));
    }

    public static bool IsValidToken(ExpiresModel model)
    {
        if (model == null)
            throw new ArgumentNullException("model");

        return model.Token == GetToken(model);
    }
}

Notice that the TokenHelper class is where the secret key lives. The Crypto class I used can be found here.

For the front end I have one page which creates the link and another to check the status. Here is the controller for that:

public class HomeController : Controller
{
    public ActionResult Index()
    {
        var model = new ExpiresModel();
        model.ExpiresOn = DateTime.Now.AddSeconds(30);
        model.Token = TokenHelper.GetToken(model);
        return View(model);
    }

    public ActionResult Check(long dateData, string token)
    {
        var model = new ExpiresModel();
        model.ExpiresOn = DateTime.FromBinary(dateData);
        model.Token = token;

        if (!TokenHelper.IsValidToken(model))
            ViewBag.Message = "Invalid!";

        else if (model.ExpiresOn >= DateTime.Now)
            ViewBag.Message = "Still good: Expires in " + (model.ExpiresOn - DateTime.Now);

        else
            ViewBag.Message = "Not good: Expired on " + model.ExpiresOn;

        return View();
    }
}

And the Views for those controller methods…
Index:

@model WebApplication4.Models.ExpiresModel
@{
    ViewBag.Title = "Home";
    Layout = "~/Views/Shared/_Layout.cshtml";
}

<h3>Home</h3>

<p>A new link has been generated that expires in 30 seconds. Use the link below to check the status:</p>
<p><a href="@Url.Action("Check", new { dateData = Model.ExpiresOn.ToBinary(), token = Model.Token })">Check Now</a></p>

Check:

@{
    ViewBag.Title = "Check";
    Layout = "~/Views/Shared/_Layout.cshtml";
}

<h3>Check</h3>

<p>@ViewBag.Message</p>

<p>@Html.ActionLink("Gernate another link", "Index", "Home", new { area = "" }, null)</p>

The entire solution can be downloaded here.

Stopwatch Class in JavaScript

I needed the equivalent of .NET’s Stopwatch class in JavaScript today. I did a quick search and could only find actual stopwatch apps (the kind with a ticking display), so I figured it would be faster to write the class myself.
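For context, this is the System.Diagnostics.Stopwatch pattern from .NET that the JavaScript class mirrors: start, stop, read the elapsed time, and restart.

using System;
using System.Diagnostics;
using System.Threading;

class StopwatchExample
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        Thread.Sleep(250);             // stand-in for the work being timed
        sw.Stop();
        Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);

        sw.Restart();                  // reuse the same instance for another measurement
        Thread.Sleep(100);
        sw.Stop();
        Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);
    }
}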

Here is a fiddle that shows the usage

and here is the code for the Stopwatch class

Which .NET JavaScript Engine is the fastest?

UPDATE: Added ClearScript

In RulePlex users are allowed to write rules in JavaScript, make an API call which passes in data, and execute those rules in the cloud. RulePlex is written in .NET. So how do we execute JavaScript in .NET? It turns out there are a bunch of JavaScript engines that can do this, but which one is the fastest?

I took an inventory of the more popular .NET JavaScript engines: JavaScript.Net, IronJS, jint, and Jurassic.

My initial thoughts were that JavaScript.Net would be fast since it is just a wrapper for Google’s V8 engine which is the fastest JavaScript engine currently. I also thought IronJS would be fast since it uses Microsoft’s Dynamic Language Runtime. jint and Jurassic I was skeptical about.

The Tests

I created a project and referenced each engine by using NuGet. I called each engine 5 times to execute a snippet of code and took the average. The snippet of code I executed came from a suite of array tests I found at Dromaeo. You can view the tests in this gist.

I also did another test where I loaded the linq.js library (one of my favorite, lesser known, JavaScript libraries).

The Results

Array test results:

jint 31,378 ms
IronJS 2,499 ms
JavaScript.Net 21 ms
Jurassic 494 ms
ClearScript 261 ms
ClearScript (compiled) 24 ms

Linq.js load results:

jint 35 ms
IronJS 245 ms
JavaScript.Net 14 ms
Jurassic 170 ms
ClearScript 49 ms
ClearScript (compiled) 1 ms

It turns out I was right about JavaScript.Net: it is much faster than any of the others. I did run into a quirk with that library, though: I couldn’t use “Any CPU” as my platform target and had to target x86 instead (you can probably target x64 as well; it just can’t be Any CPU, because the correct C++ libraries need to be targeted).

I was completely wrong in thinking IronJS would perform well. jint was by far the worst due to its incredibly bad array execution time. Jurassic was a pleasant surprise: it performed both the array test and the linq.js load reasonably fast.

If you come across any other .NET JavaScript engines feel free to let me know and I’ll add them to my comparison.

I was made aware of ClearScript which I’ve now added to the results list. This library also runs V8 but doesn’t need the C++ libraries. It looks almost as impressive as JavaScript.Net.

One More Test

I wasn’t entirely happy with the tests I had done, so I added one more. The script I executed does only one small thing: set a variable to true. This shows, more or less, the overhead of each engine. I ran this test 5000 times for each engine and took the average.

One variable results:

jint 0 ms
IronJS 1 ms
JavaScript.Net 10 ms
Jurassic 8 ms
ClearScript 29 ms
ClearScript (compiled) 0 ms

Here is the complete script I used. I swapped out currentScript and changed N as needed.
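In outline the harness is simple: run the script N times per engine with a Stopwatch and average the results. A rough sketch, with Jurassic shown as the example engine (the other engines plug in through the same delegate, and putting engine setup inside the timed block is a simplification):

using System;
using System.Diagnostics;

class Benchmark
{
    // Runs the action `iterations` times and returns the average in milliseconds.
    static double Average(Action run, int iterations)
    {
        var sw = new Stopwatch();
        double total = 0;
        for (int i = 0; i < iterations; i++)
        {
            sw.Restart();
            run();
            sw.Stop();
            total += sw.Elapsed.TotalMilliseconds;
        }
        return total / iterations;
    }

    static void Main()
    {
        string currentScript = "var x = true;";   // swap in the array tests or linq.js source
        int n = 5;                                // 5 for the big scripts, 5000 for the one-variable test

        double jurassic = Average(() =>
        {
            var engine = new Jurassic.ScriptEngine();
            engine.Execute(currentScript);
        }, n);

        Console.WriteLine("Jurassic: {0:N0} ms", jurassic);
    }
}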