Neat Reader: My Google Reader replacement

I already had the idea of writing my own feed reader on my list of things to do, so when news came out that Google were phasing Reader out, it naturally became my next hobby project.

I’ve only spent a few hours on it here and there over several months, but I’ve personally been using it since Google Reader died.

Only in the last few weeks has it been fit for human consumption (or even gotten a proper name).

Neat Reader has changed a lot already in the last few weeks, and there’s lots more to come.

Asynchronously upload a file using jQuery to a Web API controller

Here’s the code for the HTML5 form:

        <form id="upload">
            <label for="opmlFile">OPML file:</label>
            <input type="file" id="opmlFile" />
            <button type="submit">Upload</button>
        </form>

We use jQuery to intercept the form’s submit event, so the upload is handled asynchronously and posted to our Web API controller:

        // Hook into the form's submit event.
        $('#upload').submit(function () {

            // To keep things simple in this example, we'll
            // use the FormData XMLHttpRequest Level 2 object (which
            // requires modern browsers e.g. IE10+, Firefox 4+, Chrome 7+, Opera 12+ etc).
            var formData = new FormData();

            // We'll grab our file upload form element (there's only one, hence [0]).
            var opmlFile = $('#opmlFile')[0];

            // In this example we'll just grab the one file (and hope there's at least one).
            formData.append("opmlFile", opmlFile.files[0]);

            // Now we can send our upload!
            $.ajax({
                url: 'api/upload', // We'll send to our Web API UploadController
                data: formData, // Pass through our fancy form data

                // To prevent jQuery from trying to do clever things with our post which
                // will break our upload, we'll set the following to false
                cache: false,
                contentType: false,
                processData: false,

                // We're doing a post, obviously.
                type: 'POST',

                success: function () {
                    // Success!
                }
            });

            // Returning false will prevent the event from
            // bubbling and re-posting the form (synchronously).
            return false;
        });
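Since FormData is only available in newer browsers, you may want to feature-detect it before wiring up the asynchronous path. Here’s a minimal sketch (the helper name is mine, not part of the original example):

```javascript
// Returns true if the environment provides the XHR Level 2 FormData object.
// We pass in the global object (window in a browser) to keep the check explicit.
function supportsFormData(global) {
    return typeof global.FormData === 'function';
}

// In a browser you'd call supportsFormData(window), and fall back to a
// plain synchronous form post when it returns false.
```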

OK, and now in our controller we can deal with the uploaded file(s):

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web;
    using System.Web.Http;

    public class UploadController : ApiController
    {
        public async Task Post()
        {
            if (!Request.Content.IsMimeMultipartContent())
            {
                throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "This request is not properly formatted"));
            }

            // We'll store the uploaded files in an Uploads folder under the web app's App_Data special folder
            var streamProvider = new MultipartFormDataStreamProvider(HttpContext.Current.Server.MapPath("~/App_Data/Uploads/"));

            // Once the files have been written out, we can then process them.
            await Request.Content.ReadAsMultipartAsync(streamProvider).ContinueWith(t =>
            {
                if (t.IsFaulted || t.IsCanceled)
                {
                    throw new HttpResponseException(HttpStatusCode.InternalServerError);
                }

                // Here we can iterate over each file that got uploaded.
                foreach (var fileData in t.Result.FileData)
                {
                    // Some good things to do are to check the MIME type before we do the processing, e.g. for XML:
                    if (fileData.Headers.ContentType.MediaType.Equals("text/xml", StringComparison.InvariantCultureIgnoreCase))
                    {
                        // And this is how we can read the contents (note you would probably want to do this asynchronously,
                        // but let's keep things simple for now).
                        string contents = File.ReadAllText(fileData.LocalFileName);
                    }
                }
            });
        }
    }
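The same MIME-type check can also be done client-side before the upload even starts. A quick sketch in JavaScript (the helper name is mine; browsers report the type via the File object's `type` property, and note some exports use MIME types other than `text/xml`):

```javascript
// Returns true if the file's reported MIME type looks like XML
// (e.g. "text/xml" or "application/xml").
function isXmlFile(file) {
    return /(^|\/)xml$/i.test(file.type || '');
}
```

You could call this on `opmlFile.files[0]` before appending it to the FormData, and warn the user instead of posting an unusable file.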

Uninstalling VMware Workstation when Hyper-V is installed

Well, it was annoying enough that VMware Workstation refuses to launch on my machine when Hyper-V is installed. I’m skeptical that it genuinely couldn’t run as long as Hyper-V isn’t running at the same time, and this check just reeks of anti-competitiveness.

Unfortunately for VMware, that works against them rather than encouraging me to uninstall Hyper-V. So I’m ditching them for now (mainly because Visual Studio’s integration with Hyper-V for things like Windows 8 and Windows Phone 8 development is good; not to mention, Hyper-V is built in and doesn’t cost extra).

I went to uninstall VMware Workstation today but got this error:


Wow! Talk about a pain in the ass.

Fortunately, a quick search turned up Jussi Palo’s simple solution of altering the installer’s Lua script to skip this check:

Remove VMware Workstation or Player when Hyper-V is installed

And now I can uninstall.

Error C2039: ‘SetDefaultDllDirectories’ when targeting the Visual Studio 2012 Windows XP C++ Runtime

We’re switching our legacy C++ projects from the Visual C++ 2010 Runtime to the Visual C++ 2012 Runtime, now that Microsoft lets you target Windows XP for C++ (available as of Visual Studio 2012 Update 1).

So that involves switching the Platform Toolset from v100:


To v110_xp:


Well, upon compilation I saw these errors for one particular project:


The key error being Error C2039: ‘SetDefaultDllDirectories’ : is not a member of ‘`global namespace’’, from line 638 of atlcore.h.

Well if we jump into that code, we see this:

#ifndef _USING_V110_SDK71_
	// the LOAD_LIBRARY_SEARCH_SYSTEM32 flag for LoadLibraryExW is only supported if the DLL-preload fixes are installed, so
	// use LoadLibraryExW only if SetDefaultDllDirectories is available (only on Win8, or with KB2533623 on Vista and Win7)...
	IFDYNAMICGETCACHEDFUNCTION(L"kernel32.dll", SetDefaultDllDirectories, pfSetDefaultDllDirectories)
		return(::LoadLibraryExW(pszLibrary, NULL, LOAD_LIBRARY_SEARCH_SYSTEM32));

It looks like that define should exist, as we’re targeting “V110_SDK71” (aka v110_xp).

Well, with a little digging, that define is getting created by the C++ MSBuild files in C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V110\Platforms\Win32\PlatformToolsets\v110_xp\Microsoft.Cpp.Win32.v110_xp.props:

      <!-- Add /D_USING_V110_SDK71_ when targeting XP -->

But it was getting blown away in my project file:

  <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
    <ClCompile>
      <RuntimeLibrary>MultiThreadedDLL</RuntimeLibrary>
      <InlineFunctionExpansion>OnlyExplicitInline</InlineFunctionExpansion>
      <StringPooling>true</StringPooling>
      <FunctionLevelLinking>true</FunctionLevelLinking>
      <Optimization>MinSpace</Optimization>
      <SuppressStartupBanner>true</SuppressStartupBanner>
      <WarningLevel>Level3</WarningLevel>
      <PreprocessorDefinitions></PreprocessorDefinitions>
      <AssemblerListingLocation>$(IntDir)</AssemblerListingLocation>

So the fix is to include any existing pre-processor definitions (i.e. the Microsoft one) before defining our own (don’t forget to do this for all configurations and platforms in your project file):

      <WarningLevel>Level3</WarningLevel>
      <PreprocessorDefinitions>%(PreprocessorDefinitions)</PreprocessorDefinitions>
      <AssemblerListingLocation>$(IntDir)</AssemblerListingLocation>

Otherwise, you can simply remove the PreprocessorDefinition element itself (if you have no defines of your own), or choose to inherit from the parent or project defaults from the project properties (which will essentially do the same thing):


And now we recompile fine.

Windows Installer custom actions, UAC and elevation

We had a problem this week where a merge module from a third party was causing our installer to fail under UAC.

The installation was being elevated, but the custom actions (which were doing things such as file system and registry operations) were still failing due to security exceptions.

The root cause was that the merge module contained custom actions that were still not being run under the elevated account.

The reason behind this is a common one when scheduling deferred actions: the custom action must have the Impersonate flag set to false.

You can do this in WiX code by setting Impersonate to No:

<CustomAction Id="RegisterAspNet4" BinaryKey="WixCA" DllEntry="CAQuietExec"
              Execute="deferred" Return="check" Impersonate="no"/>

The documentation in WiX on the Impersonate attribute reads (emphasis mine):

This attribute specifies whether the Windows Installer, which executes as LocalSystem, should impersonate the user context of the installing user when executing this custom action. Typically the value should be ‘yes’, except when the custom action needs elevated privileges to apply changes to the machine.

Being a third party merge module, we had to fix this ourselves by opening up Orca and setting this flag manually.

You can do this by looking at the CustomAction table, finding your custom action in the list, and checking the Type field.

The Type field is a set of bit flags, and will show up as a decimal number, e.g. 1025.

If you open up Windows Calculator, switch to Programmer (View > Programmer or Alt+3), choose Decimal and enter the number, you can check the bit fields:


The first bit is set to signify a DLL custom action stored in the Binary table (type 1).

The 11th bit is set to signify deferred, in-script execution (i.e. instead of running immediately, the installer schedules when to run our custom action).

The 12th bit (the one I’ve highlighted) is the NoImpersonate flag; this is the bit we need to flip on to disable impersonation for this custom action, so that it runs with elevated privileges.

Being the twelfth bit, that is a value of 2048, so we can just add that to our value, giving us 3073:


You can see the No Impersonate bit is now flipped.
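If you’d rather not use the calculator, the same bit arithmetic is easy to script. A quick sketch (the variable names are mine; the decimal values come from the Windows Installer msidbCustomActionType* constants):

```javascript
// Windows Installer custom action type flags (decimal values).
var DLL_FROM_BINARY = 1;     // msidbCustomActionTypeDll: DLL custom action
var IN_SCRIPT = 1024;        // msidbCustomActionTypeInScript: deferred execution
var NO_IMPERSONATE = 2048;   // msidbCustomActionTypeNoImpersonate: run as LocalSystem

var originalType = 1025;                       // DLL + deferred
console.log(originalType.toString(2));         // inspect the bit fields

var fixedType = originalType | NO_IMPERSONATE; // flip the NoImpersonate bit on
console.log(fixedType); // 3073
```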

You can now update the value in Orca and save the merge module (or create a transform).


UAC in MSI Notes: The NoImpersonate Bit Mistake 

Custom Action In-Script Execution Options (Windows)


Hi, it’s me!

This website will be the new dumping ground for my experiences while developing software, and the home for any of my own personal software that I develop and want to make available to everyone.