Zopfli is a compression algorithm developed by Google. It currently produces the densest known output that can still be decompressed with the standard DEFLATE algorithm. PNG files use DEFLATE internally, so they can be compressed with Zopfli. The one downside is that the algorithm is quite slow. Here is a good article with more details.

I created WinZopfli as a GUI tool for compressing multiple PNG files at once.

* Requires Microsoft .NET Framework 4.6.1

WinZopfli - PNG Compression

Note: The tool will create a thread for each logical processor on your system. This will lag your system until it's done processing PNGs. If you want to change this behavior, open the WinZopfli.exe.config file in a text editor and change the value for NumberOfThreads. Setting the value to 0 reverts to the default behavior.
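
The thread setting lives in the app's .config file. Here is a sketch of the relevant section, assuming an appSettings key; the shipped file may structure the setting differently:

```xml
<!-- WinZopfli.exe.config (sketch) -->
<configuration>
  <appSettings>
    <!-- 0 = one thread per logical processor (default);
         any other value caps the number of worker threads -->
    <add key="NumberOfThreads" value="2" />
  </appSettings>
</configuration>
```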

I wanted to run .NET on Linux since I've seen people talking about it so much recently. I couldn't find a start-to-finish tutorial though, so I am attempting to write one here.

  1. In VMware Workstation (VirtualBox works just the same), create a VM and install the latest version of Ubuntu Desktop.
  2. Once it's installed and you've logged in, update everything that needs updating in the Ubuntu Software Center and reboot.
  3. Enter everything in the sub-lists below as commands in a terminal…
  4. Install Mono:
    1. sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
    2. echo "deb http://download.mono-project.com/repo/debian wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
    3. sudo apt-get update
    4. sudo apt-get install mono-complete
  5. Install libuv:
    1. sudo apt-get install automake libtool curl
    2. curl -sSL https://github.com/libuv/libuv/archive/v1.4.2.tar.gz | sudo tar zxfv - -C /usr/local/src
    3. cd /usr/local/src/libuv-1.4.2
    4. sudo sh autogen.sh
    5. sudo ./configure
    6. sudo make
    7. sudo make install
    8. sudo rm -rf /usr/local/src/libuv-1.4.2 && cd ~/
    9. sudo ldconfig
  6. Install the .NET Version Manager (DNVM):
    1. curl -sSL https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.sh | DNX_BRANCH=dev sh && source ~/.dnx/dnvm/dnvm.sh
  7. Install the .NET Execution Environment (DNX):
    1. dnvm upgrade
  8. More installs for the next part:
    1. sudo apt-get update
    2. sudo apt-get upgrade
    3. sudo apt-get install build-essential openssl libssl-dev curl git
  9. Install NVM:
    1. git clone git://github.com/creationix/nvm.git ~/.nvm
  10. To load NVM whenever a terminal is opened:
    1. echo '[[ -s "$HOME/.nvm/nvm.sh" ]] && source "$HOME/.nvm/nvm.sh"' >> ~/.bash_profile
  11. Start NVM in the current terminal:
    1. . ~/.nvm/nvm.sh
  12. Install Node.js (0.12.6 was the latest version as of this post; replace it with whatever the current version is when you install):
    1. nvm install v0.12.6
    2. nvm alias default 0.12.6
  13. Install Yeoman (Yo) and the scaffolding template for ASP.NET projects:
    1. npm install -g yo generator-aspnet
  14. Generate an empty ASP.NET project. A wizard will come up and ask you what type of project you want to create. I created a Simple website and named it “MyFirstDotNetAppOnLinux”:
    1. yo aspnet
  15. Switch to the new directory/template that was created:
    1. cd MyFirstDotNetAppOnLinux
    2. dnu restore
  16. Start Web Server:
    1. dnx . kestrel
  17. Open Firefox and go to http://localhost:5000
  18. To kill the web server, press Ctrl+Z to suspend it, then enter kill %1

Here are the articles I referenced when putting together this start-to-finish guide:

Don’t forget to install Visual Studio Code so you can edit your project in Ubuntu!

There have been a couple of times recently where I wanted to implement double-checked locking so that I could pull data from cache and fall back on a database lookup. This is a simple technique with a single thread, but I am doing it in a multi-threaded application (a RESTful API). If I placed a lock on a single object, it would block all other threads. Because requests include a key (think an int Id property, a Guid, or a unique string name), I would rather lock on the key so that other threads can keep being processed unless they pertain to the same key. That way I only do one database lookup per key. I didn't find anything baked into .NET that allows this. I also wanted it to look as much like the typical lock(object){} syntax as possible so that it could be easily understood by other developers. Here is the solution I came up with:
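
A minimal sketch of the idea, assuming a static dictionary of reference-counted lock entries (the real KeyLocker may differ in its details):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Per-key lock: the constructor blocks until no other thread holds the same
// key; Dispose releases it. Reference counting keeps the dictionary from
// growing without bound.
public sealed class KeyLocker : IDisposable
{
    private sealed class Entry { public int RefCount; }

    private static readonly Dictionary<string, Entry> Locks = new Dictionary<string, Entry>();
    private readonly string key;
    private readonly Entry entry;

    public KeyLocker(string key)
    {
        this.key = key;
        lock (Locks)
        {
            if (!Locks.TryGetValue(key, out entry))
            {
                entry = new Entry();
                Locks[key] = entry;
            }
            entry.RefCount++; // count holders and waiters for this key
        }
        Monitor.Enter(entry); // blocks only threads locking the same key
    }

    public void Dispose()
    {
        Monitor.Exit(entry);
        lock (Locks)
        {
            if (--entry.RefCount == 0)
                Locks.Remove(key); // last one out cleans up the entry
        }
    }
}
```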

You can combine this with a using statement to achieve the desired feel of the lock syntax:

using (new KeyLocker("mykey"))
{
    //only one thread per key will execute code in this block
}

Here is an example:
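
This is roughly how it fits the cache-or-database scenario described above; Cache, Database, and Customer here are hypothetical stand-ins for your own code:

```csharp
public Customer GetCustomer(int id)
{
    string key = "customer:" + id;
    var customer = Cache.Get<Customer>(key);
    if (customer == null)
    {
        using (new KeyLocker(key))
        {
            // check again: another thread may have filled the cache
            // while we were waiting for the key lock
            customer = Cache.Get<Customer>(key);
            if (customer == null)
            {
                customer = Database.GetCustomer(id); // one lookup per key
                Cache.Set(key, customer);
            }
        }
    }
    return customer;
}
```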

Back in August of last year I did some tests to determine which .NET JavaScript engine was the fastest. I wanted to get a better picture of the overall performance of each so I went back and grabbed all of the tests from Dromaeo to run. Below are the engines I compared and how fast they ran each test.



* All times are in milliseconds

| Test | Jint | IronJS | JS.NET | Jurassic | ClearScript | NiL.JS |
| --- | ---: | ---: | ---: | ---: | ---: | ---: |
| dromaeo-3d-cube | 744 | 649 | 34 | 287 | 163 | 164 |
| dromaeo-core-eval | 138 | 79 | 19 | 28 | 48 | 12 |
| dromaeo-object-array | 12958 | 1306 | 20 | 205 | 76 | 1646 |
| dromaeo-object-regexp | 14494 | 1998 | 225 | 1754 | 264 | 2511 |
| dromaeo-object-string | 9712 | ERROR | 42 | 999 | 161 | 1228 |
| dromaeo-string-base64 | 1368 | 287 | 16 | 253 | 48 | 150 |
| v8-crypto | 30578 | ERROR | 29 | 1465 | 58 | 1666 |
| v8-deltablue | 1051 | 415 | 28 | 212 | 50 | 168 |
| v8-earley-boyer | 19396 | 24271 | 42 | TIMEOUT | 80 | 1898 |
| v8-raytrace | 4368 | 8564 | 34 | 1489 | 68 | 609 |
| v8-richards | 654 | 167 | 15 | 98 | 42 | 93 |
| sunspider-3d-morph | 508 | 35 | 13 | 33 | 44 | 32 |
| sunspider-3d-raytrace | 946 | 437 | 21 | 168 | 49 | 98 |
| sunspider-access-binary-trees | 743 | 68 | 13 | 94 | 35 | 89 |
| sunspider-access-fannkuch | 1685 | 59 | 14 | 77 | 37 | 72 |
| sunspider-access-nbody | 757 | 139 | 14 | 78 | 38 | 72 |
| sunspider-access-nsieve | 2611 | 56 | 13 | 164 | 36 | 193 |
| sunspider-bitops-3bit-bits-in-byte | 1353 | 28 | 12 | 26 | 35 | 83 |
| sunspider-bitops-bits-in-byte | 1253 | 30 | 13 | 16 | 36 | 78 |
| sunspider-bitops-bitwise-and | 362 | 17 | 12 | 9 | 40 | 14 |
| sunspider-bitops-nsieve-bits | 1586 | 49 | 13 | 80 | 36 | 66 |
| sunspider-controlflow-recursive | 3116 | 73 | 13 | 66 | 36 | 177 |
| sunspider-crypto-aes | 4505 | 351 | 18 | 347 | 45 | 226 |
| sunspider-crypto-md5 | 684 | 233 | 16 | 106 | 40 | 43 |
| sunspider-crypto-sha1 | 638 | 72 | 14 | 59 | 43 | 36 |
| sunspider-date-format-tofte | 430 | 48 | 14 | 183 | 39 | 31 |
| sunspider-date-format-xparb | 923 | 98 | 15 | 49 | 40 | 22 |
| sunspider-math-cordic | 526 | 38 | 12 | 28 | 36 | 34 |
| sunspider-math-partial-sums | 143 | 33 | 13 | 20 | 47 | 12 |
| sunspider-math-spectral-norm | 710 | 41 | 13 | 46 | 40 | 44 |
| sunspider-regexp-dna | 372 | 416 | 19 | 385 | 48 | ERROR |
| sunspider-string-fasta | 758 | 88 | 15 | 100 | 43 | 70 |
| sunspider-string-tagcloud | 438 | 7822 | 20 | 135 | 47 | 97 |
| sunspider-string-unpack-code | 642 | 311 | 27 | 138 | 55 | 100 |
| sunspider-string-validate-input | 972 | 66 | 16 | 75 | 42 | 102 |
| d3.min | 143 | ERROR | 34 | 766 | 68 | ERROR |
| handlebars-v3.0.3 | 86 | 560 | 27 | 204 | 52 | 58 |
| knockout-3.3.0 | 38 | 1070 | 30 | 326 | 49 | TIMEOUT |
| lodash.min | 161 | 776 | 25 | 362 | 55 | 38 |
| qunit-1.18.0 | 50 | 203 | 19 | 116 | 44 | 86 |
| underscore-min | 22 | 344 | 16 | 93 | 42 | 20 |

If I didn't have a timeout, NiL.JS would never have finished loading knockout. Jurassic's timeout on v8-earley-boyer is okay; it just runs really slowly.

I’ve been thinking of adding more tests which show the performance of .NET types being used in JavaScript and JavaScript variables being retrieved by .NET after the script has run. Stay tuned.

The source code for these tests is on GitHub.

I am a little old school in that I've used Winamp since the '90s. At the beginning of last year it was bought from AOL by Radionomy. I thought that meant they would finally update it, but it's been over a year and I haven't heard any news of that happening. So I decided to switch to AIMP as my audio player instead. It looks a lot like Winamp, but it's much more usable.

Anyways, back in the day when I used to use AIM (I still use AIM but on Pidgin now, occasionally) there was a Winamp plugin that would update your AIM profile with what you were listening to in Winamp. There are more modern plugins that do basically the same thing but post to Twitter instead.

I use Lync at work. I thought it would be neat to create something that would update Lync's "Personal Note" with what I am listening to in AIMP. Lync, for some reason, doesn't allow plugins (I read that somewhere), so I gave up on that quickly. That's when I looked into how to create AIMP plugins… again, I didn't find much. AIMP is developed by some Russians and is written in C, which I don't know well enough to be writing plugins in. So I created my own app instead. It starts when I log into Windows and runs in the background, taking about 2MB of memory and 0% CPU. All it does is watch for a file to change and then call the Lync API to update my note. Watch a file? Yes… there is a plugin you need to install in AIMP called "Current Track info to file v3.1". It writes the currently playing track info to a file.

Here is what you need to do in order to get this working on your Windows machine.

  1. Install AIMP 3 if you haven’t already.
  2. Install the “Current Track info to file v3.1” plugin for AIMP
  3. In the Current Track info plugin settings, the path should point to your user account's "My Documents" folder and the file should be named "CurrentTrackInfo.txt" (e.g. C:\Users\rfrisby\Documents\CurrentTrackInfo.txt)
  4. For the plugin’s template use this:
    %IF(%R,%R - %T,%Replace(%F,.mp3,))
  5. The setting for "Remember list of files" should be 1. None of the other options should be checked.
  6. Download the LyncAimpUpdater zip file and extract it to your hard drive
  7. In Windows Explorer go to the startup folder for your user account. (ex. C:\Users\rfrisby\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup) In this folder right click and choose New > Shortcut.
  8. Where it says “Type the location of this item” paste this line:
    C:\Windows\System32\cmd.exe /c start /min C:\Apps\LyncAimpUpdater\LyncAimpUpdater.exe ^& exit

    Then change the path to LyncAimpUpdater.exe so that it points to where you extracted the zip on your hard drive.

That's it. You can open the shortcut to run the app right away; otherwise it will start the next time you log into your computer.

.NET 4.5 is required to run the app.

This also works for “Skype for Business”.

protobuf-net cannot serialize everything you throw at it. It’s picky about what it does because it wants to be fast. If it were to accommodate every type of object it would have to sacrifice speed. The author did however create a hook so that things it can’t serialize can be turned into something it can serialize. This is done with what it calls Surrogates. As you can tell from the name, we tell protobuf-net that for a given type (that it can’t serialize) we want to use a surrogate type (that it can serialize).

Take this class for example:
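
A sketch of the kind of class in question (the Name property is an assumption for illustration; the important part is the Value property typed as object):

```csharp
using ProtoBuf;

[ProtoContract]
public class MyNameValueInfo
{
    [ProtoMember(1)]
    public string Name { get; set; }

    [ProtoMember(2)]
    public object Value { get; set; } // protobuf-net can't handle this
}
```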

MyNameValueInfo can't be serialized because protobuf-net doesn't know how to serialize the Value property (of type object). It will throw an exception: "No Serializer defined for type: System.Object"

To get around this we need to provide a surrogate for MyNameValueInfo that protobuf-net can serialize. First register the surrogate type (only needs to be done once):

RuntimeTypeModel.Default.Add(typeof(MyNameValueInfo), false).SetSurrogate(typeof(MyNameValueInfoSurrogate));

Then implement MyNameValueInfoSurrogate so that it can be transformed from/to MyNameValueInfo and is serializable by protobuf-net:
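
Here is one possible shape for the surrogate. The implicit conversion operators are what protobuf-net uses to transform between the two types; encoding Value as its assembly-qualified type name plus a string representation is an assumption for illustration, and it is what ties the serialized bytes to .NET on the receiving end:

```csharp
using System;
using ProtoBuf;

[ProtoContract]
public class MyNameValueInfoSurrogate
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public string ValueType { get; set; }
    [ProtoMember(3)] public string ValueText { get; set; }

    // MyNameValueInfo -> surrogate (used when serializing)
    public static implicit operator MyNameValueInfoSurrogate(MyNameValueInfo source)
    {
        if (source == null) return null;
        return new MyNameValueInfoSurrogate
        {
            Name = source.Name,
            ValueType = source.Value == null ? null : source.Value.GetType().AssemblyQualifiedName,
            ValueText = Convert.ToString(source.Value)
        };
    }

    // surrogate -> MyNameValueInfo (used when deserializing)
    public static implicit operator MyNameValueInfo(MyNameValueInfoSurrogate surrogate)
    {
        if (surrogate == null) return null;
        return new MyNameValueInfo
        {
            Name = surrogate.Name,
            Value = surrogate.ValueType == null ? null
                : Convert.ChangeType(surrogate.ValueText, Type.GetType(surrogate.ValueType))
        };
    }
}
```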

Doing binary serialization like this will include Type information in the serialized byte array. This is only useful if the receiving system is also in .NET. For a more universal approach you could use JSON serialization.

Bootstrap is a handy tool and I use it a lot. I decided to use it with a WordPress plugin I am developing, but when I included bootstrap's css file in my plugin page it blew up the WordPress admin panel's design. Thus began my journey of hacks to get it working. Here is how it's done…

This is what my plugin folder looks like:

  • Stylesheets (and a LESS file, as you will soon find out) live in the "css" folder.
  • Bootstrap fonts in the “fonts” folder.
  • Javascript in the “scripts” folder.

In your plugin file/installer/whatever you probably have a line which loads the bootstrap css into the html head element…

wp_enqueue_style('admin_css_bootstrap', plugins_url('/myplugin/css/bootstrap.min.css'), false, '1.0.0', 'all');

Get rid of that. You can load the bootstrap javascript file this way but for the stylesheet we begin the hacks…

Create a script called bootstrap-hack.js and load it with your plugin.

wp_enqueue_script('admin_js_bootstrap_hack', plugins_url('/myplugin/scripts/bootstrap-hack.js'), false, '1.0.0', false);

The content of that file is this:
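
A sketch of what bootstrap-hack.js looks like; the plugin folder name ("myplugin") and the override stylesheet name are assumptions, so adjust the paths to match your plugin:

```javascript
// bootstrap-hack.js — load bootstrap as LESS so it can be wrapped in a class
function addLink(href, rel) {
  var link = document.createElement('link');
  link.rel = rel;
  link.type = 'text/css';
  link.href = href;
  document.getElementsByTagName('head')[0].appendChild(link);
}

function addScript(src) {
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = src;
  document.getElementsByTagName('head')[0].appendChild(script);
}

if (typeof document !== 'undefined') {
  // 1. add the .less file (less.js looks for rel="stylesheet/less")
  addLink('/wp-content/plugins/myplugin/css/bootstrap-wrapper.less', 'stylesheet/less');
  // 2. load less.js to transform the .less file in the browser
  addScript('/wp-content/plugins/myplugin/scripts/less.js');
  // 3. load any overriding stylesheets after bootstrap so they win the cascade
  addLink('/wp-content/plugins/myplugin/css/myplugin.css', 'stylesheet');
}
```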

As you can see, first we dynamically add a .less (LESS CSS) file. Next we load the LESS JavaScript to transform the .less file. Then we load any stylesheets that may override bootstrap styles. This is the main part of the hack, but there is a little more to it, as I will explain.

The content of bootstrap-wrapper.less is this:
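
The wrapper file is only a few lines. The `(less)` import option tells less.js to treat bootstrap's plain CSS as LESS so it can be nested:

```less
// bootstrap-wrapper.less — prefix every bootstrap rule with .bootstrap-wrapper
.bootstrap-wrapper {
  @import (less) "bootstrap.min.css";
}
```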

What this does is load the bootstrap css file as LESS and then output it with all of the styles wrapped in the ".bootstrap-wrapper" class. This means you have to add a div that wraps your content so that the bootstrap styles apply to it. It will look something like this:
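
A sketch of the wrapping markup; only elements inside the wrapper div pick up bootstrap's styles:

```html
<div class="bootstrap-wrapper">
    <div class="container">
        <!-- your plugin page content, using bootstrap classes, goes here -->
    </div>
</div>
```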

Now back to bootstrap-hack.js… it loads the less.js file, so download it and include it in your scripts folder.

Make sure any stylesheets that override bootstrap styles are loaded after bootstrap's CSS, in the same way. You don't have to load your overrides using LESS; we only did that for bootstrap because we needed to wrap its styles in another class so they won't conflict with WordPress' styles. And don't forget that your overrides must now be prefixed with .bootstrap-wrapper as well.

I was working on RulePlex this week and came across a couple of things I wanted to share. The first is a change in the way rules are "compiled". You can't really compile JavaScript per se, but this is the process of how things came to work the way they do…

In the first iteration, JavaScript rules were executed individually. If there were 100 rules in a Policy I would execute 100 little JavaScript snippets for an incoming request. The snippets were “compiled” when the call to the API was made. I soon realized that this might be okay if a Policy had a few rules but for large Policies it was slow – even if I executed all of the snippets in parallel.

For the next iteration I took all of the rules and compiled them into one big script. In order to do this I had to wrap the rules with some advanced JavaScript techniques. Because this big script contains the results of every rule, I had to append something unique to each result's variable name: the rule's Id. This makes the script look horrifying, but I am okay with it for now (it's not hurting performance). Executing one big script increased performance tremendously. Here is an example of what a Policy with one rule that simply returns true looks like compiled:
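
A hypothetical sketch of the shape of that compiled script; the rule Id suffix and the input/results plumbing are illustrative assumptions, not RulePlex's actual output:

```javascript
// the data posted to the API is made available to every rule
var input = { amount: 100 };

// each rule body is wrapped in a function so its local variables can't
// collide with other rules; its result variable is made unique by
// appending the rule's Id
var result_5f1c2d = (function (input) {
  return true; // the rule body: simply returns true
})(input);

// results are collected by rule Id and returned to the caller
var results = { '5f1c2d': result_5f1c2d };
```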

At the same time I had extended this technique to the C# rule engine. I took it a step further though and actually compiled C# rules into a DLL. I took the binary code for the DLL and stored it along with the Policy. I did the compilation whenever a rule in the Policy changed – not when the API was called like I had been doing with the JavaScript engine. When the API was called, I got the DLL’s binary data and loaded it into memory to be executed against the incoming data.

I mimicked the binary compilation and loading of the C# rules in the JavaScript engine as well. The thing is, I never really liked doing it in the JavaScript engine, because I had to convert the compiled script (text) to binary so I could store it in the same database field, and then back from binary to text when it was time to execute. In C# that made sense, but not in JavaScript. Now that the C# engine is gone, I had a chance to go back and change this.

Presently, when rules are changed (or added, deleted, etc.), RulePlex compiles its big script as part of saving the rule. It saves the script to the Policy as text. When the API is called, the script is retrieved and executed.

I haven't thought about tweaking this process any more, but I may in the future. Instead I have been thinking about how this affects the workflow from a business perspective, and the more I think about it the more I like the changes I've made. If I ever change how I "compile" the big script, it won't affect policies that are currently working a certain way. Suppose there's a bug in the compiled script that you've accounted for in your rules, knowingly or unknowingly. If the script were compiled at API-request time, its behavior could change day by day without any action on your part, because I may have fixed or introduced a bug that changes the results. Now the application you've integrated with RulePlex is broken!

The ideal workflow is to have two copies of the same Policy, maybe even three, or N. One copy would be designated as a Production copy, while the others are for Dev/Staging/whatever. When the engine changes, you want to test those changes in a non-Production environment first. When you've verified that the changes do not affect your application, that non-Production copy can be promoted to Production. This applies to the workflow of building out a Policy as well, not just back-end changes to the engine. The concept of environments will be included in the next version of RulePlex.