Indexing for objects in your code – Not only a data storage thing


As you probably already know if you landed on this page, indexing boils down to two things:

  • It comes from a simple need, whether dev-related or not.
  • It helps you get to the data you want FASTER.

“I want to find all people whose first name starts with S”: a sorted folder structure solves exactly that.

When it comes to data storage, things are the same:

  • file systems index the position of files on the actual hard drive or flash drive,
  • databases allow you to create multiple indexes on certain columns, so lookups and sorts run faster.

So why make the same point for code? Well, it comes down to the exact same problem: performance.

Most of the time, people use LINQ to parse data and extract subsets of it for processing, in a loop for example:

foreach (var countryCode in _countryCodes)
{
    var countriesPerCode = _entitiesLists.Where(e => e.Country == countryCode).ToList();
    var count = countriesPerCode.Count;
    // Code supposedly doing something with the entities.
}

It can be fine when only a few of these loops run, but why is it so different when it runs on, let's say, a million rows?

The same problem occurs with a database: if the SQL engine running the query doesn't know anything about how to find the rows that match your WHERE clause, it has to scan all of them.

Our loop above puts LINQ in the same position: how would LINQ know which items match your lambda without trying it on every item in your list? Every iteration triggers a full scan.

To solve that issue we are going to use an often-overlooked LINQ type: Lookup, built with the ToLookup extension method.

The goal is simple: we build an index out of our data by grouping it on a given key. We pay the grouping cost only once, and from then on, fetching the data subset for each loop iteration is practically instant.
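To make that concrete, here is a minimal sketch of the change (the Entity type and the sample data are made up for illustration; the repro app's real types differ):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Entity
{
    public string Country { get; set; }
    public string Name { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var entities = new List<Entity>
        {
            new Entity { Country = "FR", Name = "Alice" },
            new Entity { Country = "DE", Name = "Bob" },
            new Entity { Country = "FR", Name = "Chloe" },
        };
        var countryCodes = new[] { "FR", "DE", "US" };

        // Build the index once: a single pass over the data.
        var byCountry = entities.ToLookup(e => e.Country);

        foreach (var countryCode in countryCodes)
        {
            // Constant-time bucket fetch; a missing key yields an empty sequence.
            var countriesPerCode = byCountry[countryCode].ToList();
            Console.WriteLine($"{countryCode}: {countriesPerCode.Count}");
        }
    }
}
```

Building the lookup walks the list once; every subsequent `byCountry[code]` access is a bucket fetch instead of a full `Where` scan over all entities.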

Here are the performance differences you can get, taken from our test console app's output:

Data: building
Data: done
Test1: start
Test1: processed in 535 milliseconds
Test2: start
Test2: lookup done in 140 milliseconds
Test2: processed in 141 milliseconds

To summarize our little article (TL,DR):

  • Building the lookup first takes some time, but doing it once offers around a x4 performance gain on the processing itself
  • Make use of indexing capabilities when your dataset grows above a certain number of items (that is to say, most of the time!)
  • This is a really simple sample that does not reflect everything that can happen around your code; I have seen processing time go from 45 minutes down to 5 (a x9 speedup)
  • The repro code can be found on GitHub here.


Multi tenancy with Azure – Guide

Dealing properly with client data is quite important, especially with GDPR coming 🙂

Still, building apps for multiple clients is and has always been a complicated task for multiple valid reasons:

  • Monetization: consumer vs company data segmentation
  • Resources Cost: data split strategy vs costs
  • Non-technical: legal or data protection for country or unions (e.g. European Union)

Let's go through the options that actually work, with pros and cons for each of those concerns.

Continue reading Multi tenancy with Azure – Guide

Entity Framework – Code First Migration – Solving merge errors

Using Entity Framework with git and having multiple branches updating a model can be quite challenging.

Here is a concrete example:

  • develop branch is on code first migration v1
  • feature X updates the model to add fields to an entity, adding migration v2
  • feature Y also updates the model to add fields to an entity, adding migration v3
  • feature X is merged into develop
  • feature Y is merged afterwards, coming on top of the previous one


First, both feature migrations are independent and both built on top of v1:

EF will be able to process them, but what will happen is this:

The changes noted here as incoming will bring migration v2's changes back when running “Add-Migration” again, even though they are already present.

It basically means EF does not see v2, because v3 was generated while v2 was not yet present in the branch at the time the migration was added.

This has to do with data found in each migration's RESX file, which contains Base64-encoded binary info keeping track of the link between migrations.

Bottom line: to avoid this happening, pick one of these strategies:

  1. Rebuild migration v3 on top of v2 in the develop branch: after getting v2 from develop, delete v3 and re-add it
  2. Build a dedicated model-update feature branch that everyone merges into their own branch once it is updated with their model changes
  3. Merge feature X into feature Y for the db changes only, then add your fields in feature Y so that the new fields rely on migration v2

Happy merging!


Connecting to CosmosDB with Microsoft Azure Storage Explorer now

You’ve probably noticed the Cosmos DB announcement a couple of weeks ago, and this is a great step towards getting secondary indexes for the Table Storage-like data you are using in Azure today.

I rely quite often on Microsoft Azure Storage Explorer to access my tables' data, but the Cosmos DB part is not done yet, so how do you do that?


Cosmos DB now has a Table API that behaves exactly the same as a Storage Table, so just:

  • open your “Local and Attached” top root navigation node in Explorer
  • right-click “Storage Accounts”, and select “Connect to Azure Storage”
  • Select “Use a connection string or a shared access signature URI” and follow the rest of the process to add your Cosmos DB table and use it as a Storage table!

This is a work-around to play with your CosmosDB data in a simple way, without having to wait.

Still, Cosmos DB does not behave exactly the same way as traditional Table Storage, especially on import/export of large volumes of data: where Table Storage throttles query performance, Cosmos DB just cuts the connection straight away.

Happy indexing!

Using Hangfire, or how to run your code as jobs

I encountered Hangfire a while ago, and tried it out about a year ago, but did not have the time or the need to properly put its job capabilities to work.

If you are reading this, it's because you are either looking to understand what Hangfire is, or how it can address some of your needs.

Hangfire is about running portions of your code as jobs, away from your main (server/web) process. It also adds the capability to run this code on a recurring basis (very convenient for putting simple update/cleaning/reminder/mailing jobs in place).

The most important thing to get when you want to run a Hangfire job is that your code has to be capable of giving itself a proper context:

  • no HttpContext.Current or similar objects: only what you pass to your object at method-call time matters (this is what gets serialized as JSON on the Hangfire back-end).
  • no complex object graph: if the class/service you want to instantiate has many dependencies (other object inits or similar), make sure everything is in proper order from the call you initiate with Hangfire, OR let your object initialize itself properly.
  • Bottom line, be context friendly! If you have keys or ids identifying the data you want to manipulate, pass those values on for serialization: they are simple to serialize and easier to maintain.
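As a quick sketch of what “context friendly” looks like (IInvoiceService and its method are hypothetical names for this example; BackgroundJob.Enqueue is the actual Hangfire call):

```csharp
using Hangfire;

// Hypothetical service: it receives plain ids and loads its own data,
// so the enqueued call serializes to nothing more than those values.
public interface IInvoiceService
{
    void SendReminder(int customerId, int invoiceId);
}

public static class ReminderScheduler
{
    public static void Schedule(int customerId, int invoiceId)
    {
        // Pass simple, serializable values only - no entities, no HttpContext.
        BackgroundJob.Enqueue<IInvoiceService>(s => s.SendReminder(customerId, invoiceId));
    }
}
```

The job method resolves IInvoiceService and reloads whatever it needs from the ids, so it never depends on state from the enqueuing process.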

When digging into implementing Hangfire, you’ll see by yourself going over the documentation that almost all you need has been thought through.

As for writing code that uses Hangfire, here are a few hints:

  • You can add just the Hangfire.Core NuGet package to a project that only enqueues jobs (less is better)
  • When using an IoC container, make sure you use the proper Enqueue overload; if you don't, Hangfire will simply store the actual type (not the interface) that was used at job-enqueuing time, which might work at first, but won't switch to your new type if you change the interface implementation in your IoC container:
BackgroundJob.Enqueue<IMyInterface>(x => x.MyInterfaceMethod(param1, param2, param3));
  • If you plan to run the Hangfire server part in an ASP.NET app, don't forget to keep it running all the time! Hangfire does not auto-start the web app just because it's there 🙂
  • As you can use multiple queues when building with Hangfire, don't forget to assign your Hangfire processing servers to the right queues.

Thanks for reading, happy coding!

Azure Active Directory Library & TokenCache persistence – upgrade issue

When using the Microsoft.IdentityModel.Clients.ActiveDirectory NuGet package to deal with the ADAL token cache, you can actually serialize the state of your cache to reuse it at a later point:

public class RefreshTokenCache : TokenCache
{
    private void AfterAccessNotification(TokenCacheNotificationArgs args)
    {
        if (this.HasStateChanged)
        {
            var data = Convert.ToBase64String(this.Serialize());
            // Persist "data" to your store here.
        }
    }
}

When migrating from the 2.x to the 3.x versions of the library, I encountered issues trying to deserialize my token cache back to its initial state:

var tc = new RefreshTokenCache();

The problem was that serialized data obtained with v2.x was not deserialized properly by the 3.x versions.

In that case, I force the token to be requested again from the end-user, so that I can serialize it back in the 3.x format.
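Here is roughly what that fallback looks like (LoadPersistedState is a hypothetical helper standing in for your own storage; Deserialize and Clear are actual TokenCache members):

```csharp
var tc = new RefreshTokenCache();
try
{
    // Try restoring the state persisted by the 2.x library.
    tc.Deserialize(Convert.FromBase64String(LoadPersistedState()));
}
catch (Exception)
{
    // The v2.x blob is not readable by 3.x: start clean so the next
    // AcquireToken call prompts the user, then persist in the 3.x format.
    tc.Clear();
}
```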

Happy coding!

Kicking System.Web out of your code – part 3: utilities

The last part of my blog posts about getting rid of System.Web.

We are now going to look at some of the small glue parts of System.Web.

1. HtmlEncode & UrlEncode

This one is pretty easy: System.Net actually comes with equivalents of the System.Web methods here.
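For instance, System.Net.WebUtility (available since .NET 4.5) covers both cases:

```csharp
using System;
using System.Net;

public static class Program
{
    public static void Main()
    {
        // Replacement for HttpUtility.HtmlEncode:
        Console.WriteLine(WebUtility.HtmlEncode("<b>Fish & chips</b>"));
        // → &lt;b&gt;Fish &amp; chips&lt;/b&gt;

        // Replacement for HttpUtility.UrlEncode (percent-encodes for query strings):
        Console.WriteLine(WebUtility.UrlEncode("fish & chips"));
    }
}
```

No extra package is needed; this drops the System.Web reference entirely for encoding concerns.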

2. Querystring value access

The same easiness applies here; mine are based on Rick Strahl's extensions found here.

3. MachineKey

This one gets a little more tricky, but looking around can bring you a replacement, like the one below:

using System;
using System.Globalization;
using System.IO;
using System.Security.Cryptography;

public class LogrrMachineKey
{
    private static string _decryption = "AES";

    private static byte[] _cryptKey;
    private static string _decryptionKey;

    public static string DecryptionKey
    {
        set
        {
            _decryptionKey = value;
            var key = HexStringToByteArray(_decryptionKey);
            _cryptKey = key;
        }
    }

    public static byte[] Encrypt(byte[] inputBuffer)
    {
        SymmetricAlgorithm algorithm;
        byte[] outputBuffer;

        if (inputBuffer == null)
            throw new ArgumentNullException("inputBuffer");

        algorithm = GetCryptAlgorithm();

        using (var ms = new MemoryStream())
        {
            // Prepend the IV so Decrypt can read it back.
            ms.Write(algorithm.IV, 0, algorithm.IV.Length);

            using (var cs = new CryptoStream(ms, algorithm.CreateEncryptor(), CryptoStreamMode.Write))
            {
                cs.Write(inputBuffer, 0, inputBuffer.Length);
            }
            outputBuffer = ms.ToArray();
        }
        return outputBuffer;
    }

    public static byte[] Decrypt(byte[] inputBuffer)
    {
        SymmetricAlgorithm algorithm;
        byte[] inputVectorBuffer, outputBuffer;

        if (inputBuffer == null)
            throw new ArgumentNullException("inputBuffer");

        algorithm = GetCryptAlgorithm();
        outputBuffer = null;

        try
        {
            // The IV was prepended by Encrypt; read it back first.
            inputVectorBuffer = new byte[algorithm.IV.Length];
            Array.Copy(inputBuffer, inputVectorBuffer, inputVectorBuffer.Length);
            algorithm.IV = inputVectorBuffer;

            using (var ms = new MemoryStream())
            {
                using (var cs = new CryptoStream(ms, algorithm.CreateDecryptor(), CryptoStreamMode.Write))
                {
                    cs.Write(inputBuffer, inputVectorBuffer.Length, inputBuffer.Length - inputVectorBuffer.Length);
                }
                outputBuffer = ms.ToArray();
            }
        }
        catch (FormatException e)
        {
            throw new CryptographicException("The string could not be decoded.", e);
        }
        return outputBuffer;
    }

    private static SymmetricAlgorithm GetCryptAlgorithm()
    {
        SymmetricAlgorithm algorithm;
        string algorithmName;

        algorithmName = _decryption;
        if (algorithmName == "Auto")
            throw new Exception("Explicit algorithm is required");

        switch (algorithmName)
        {
            case "AES":
                algorithm = new RijndaelManaged();
                break;
            case "3DES":
                algorithm = new TripleDESCryptoServiceProvider();
                break;
            case "DES":
                algorithm = new DESCryptoServiceProvider();
                break;
            default:
                throw new Exception($"Algorithm {algorithmName} is not recognized");
        }

        algorithm.Key = _cryptKey;

        return algorithm;
    }

    private static byte[] HexStringToByteArray(string str)
    {
        byte[] buffer;

        if (str == null)
            throw new ArgumentNullException("str");

        if (str.Length % 2 == 1)
            str = '0' + str;

        buffer = new byte[str.Length / 2];

        for (int i = 0; i < buffer.Length; ++i)
            buffer[i] = byte.Parse(str.Substring(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture);
        return buffer;
    }
}
This class is not a one-to-one replacement for the existing MachineKey class found in System.Web, so please make sure you migrate/reset your values first.

Happy coding!

Kicking System.Web out of your code – part 2: HttpBrowserCapabilities object

The second part of my blog posts about getting rid of System.Web.

I will cover the 2 parts of this object that I usually use: browser info & mobile device detection.

1. Web browser information

We will make use of UAParser-csharp (the C# equivalent of UAParser.js), which provides a parsing mechanism for the User-Agent string.

After installing the lib with NuGet, a simple extension makes the day (here returning “Windows 10 using Chrome”):

public static string GetBrowserName(this HttpRequestMessage request)
{
    var ua = request.Headers.UserAgent.ToString();
    var uaParser = Parser.GetDefault();
    var c = uaParser.Parse(ua);
    return $"{c.OS.Family} using {c.UA.Family}";
}

2. Mobile device detection

If you want to know whether you are rendering the web page for a mobile device or not, you can just use this extension method (inspired by this SO answer):

public static bool IsMobileDevice(this HttpRequestMessage request)
{
    string u = request.Headers.UserAgent.ToString();
    var b = new Regex(@"(android|bb\d+|meego).+mobile|avantgo|bada\/|blackberry|blazer|compal|elaine|fennec|hiptop|iemobile|ip(hone|od)|iris|kindle|lge |maemo|midp|mmp|mobile.+firefox|netfront|opera m(ob|in)i|palm( os)?|phone|p(ixi|re)\/|plucker|pocket|psp|series(4|6)0|symbian|treo|up\.(browser|link)|vodafone|wap|windows ce|xda|xiino", RegexOptions.IgnoreCase | RegexOptions.Multiline);
    var v = new Regex(@"1207|6310|6590|3gso|4thp|50[1-6]i|770s|802s|a wa|abac|ac(er|oo|s\-)|ai(ko|rn)|al(av|ca|co)|amoi|an(ex|ny|yw)|aptu|ar(ch|go)|as(te|us)|attw|au(di|\-m|r |s )|avan|be(ck|ll|nq)|bi(lb|rd)|bl(ac|az)|br(e|v)w|bumb|bw\-(n|u)|c55\/|capi|ccwa|cdm\-|cell|chtm|cldc|cmd\-|co(mp|nd)|craw|da(it|ll|ng)|dbte|dc\-s|devi|dica|dmob|do(c|p)o|ds(12|\-d)|el(49|ai)|em(l2|ul)|er(ic|k0)|esl8|ez([4-7]0|os|wa|ze)|fetc|fly(\-|_)|g1 u|g560|gene|gf\-5|g\-mo|go(\.w|od)|gr(ad|un)|haie|hcit|hd\-(m|p|t)|hei\-|hi(pt|ta)|hp( i|ip)|hs\-c|ht(c(\-| |_|a|g|p|s|t)|tp)|hu(aw|tc)|i\-(20|go|ma)|i230|iac( |\-|\/)|ibro|idea|ig01|ikom|im1k|inno|ipaq|iris|ja(t|v)a|jbro|jemu|jigs|kddi|keji|kgt( |\/)|klon|kpt |kwc\-|kyo(c|k)|le(no|xi)|lg( g|\/(k|l|u)|50|54|\-[a-w])|libw|lynx|m1\-w|m3ga|m50\/|ma(te|ui|xo)|mc(01|21|ca)|m\-cr|me(rc|ri)|mi(o8|oa|ts)|mmef|mo(01|02|bi|de|do|t(\-| |o|v)|zz)|mt(50|p1|v )|mwbp|mywa|n10[0-2]|n20[2-3]|n30(0|2)|n50(0|2|5)|n7(0(0|1)|10)|ne((c|m)\-|on|tf|wf|wg|wt)|nok(6|i)|nzph|o2im|op(ti|wv)|oran|owg1|p800|pan(a|d|t)|pdxg|pg(13|\-([1-8]|c))|phil|pire|pl(ay|uc)|pn\-2|po(ck|rt|se)|prox|psio|pt\-g|qa\-a|qc(07|12|21|32|60|\-[2-7]|i\-)|qtek|r380|r600|raks|rim9|ro(ve|zo)|s55\/|sa(ge|ma|mm|ms|ny|va)|sc(01|h\-|oo|p\-)|sdk\/|se(c(\-|0|1)|47|mc|nd|ri)|sgh\-|shar|sie(\-|m)|sk\-0|sl(45|id)|sm(al|ar|b3|it|t5)|so(ft|ny)|sp(01|h\-|v\-|v )|sy(01|mb)|t2(18|50)|t6(00|10|18)|ta(gt|lk)|tcl\-|tdg\-|tel(i|m)|tim\-|t\-mo|to(pl|sh)|ts(70|m\-|m3|m5)|tx\-9|up(\.b|g1|si)|utst|v400|v750|veri|vi(rg|te)|vk(40|5[0-3]|\-v)|vm40|voda|vulc|vx(52|53|60|61|70|80|81|83|85|98)|w3c(\-| )|webc|whit|wi(g |nc|nw)|wmlb|wonu|x700|yas\-|your|zeto|zte\-", RegexOptions.IgnoreCase | RegexOptions.Multiline);
    // Guard: the second regex only inspects the first 4 characters.
    var prefix = u.Length < 4 ? u : u.Substring(0, 4);
    return b.IsMatch(u) || v.IsMatch(prefix);
}

The last step will be removing the MachineKey and utilities dependencies.

Happy coding!

Kicking System.Web out of your code – part 1: ASP.NET MVC

Having done some .NET code for a while now, you have surely seen how many dependencies your code has on it.

With .NET Core coming into sight, but also to optimize your existing ASP.NET code base, you probably want to start lowering your dependencies on it…

The first step (but not the smallest) for me is to kick out ASP.NET MVC.

DISCLAIMER: don’t get me wrong, ASP.NET MVC is a great framework for some use cases, but in my search for boot time optimization and lower memory footprint, this is not exactly what I need.

  1. Replacing ASP.NET MVC with RazorEngine

Not having much linked to it anyway, I replaced it with RazorEngine, a decoupled open-source version of the MVC Razor engine (which you can also use to render things other than pages: emails, documents…).

Here is a piece of code that sets up RazorEngine to render HTML views and lets you use it the way you always used ASP.NET MVC:

HttpRequestMessage extensions first:

public static class HttpRequestMessageExtensions
{
    public static TemplateServiceConfiguration TemplateConfig = new TemplateServiceConfiguration
    {
        TemplateManager = new EmbeddedResourceTemplateManager(typeof(Startup))
    };

    public static IHttpActionResult View<T>(this ApiController controller, string viewName, T model)
    {
        return GetResponseView(controller.Request, viewName, model);
    }

    public static IHttpActionResult GetResponseView<T>(this HttpRequestMessage request, string viewName, T model)
    {
        var service = RazorEngineService.Create(TemplateConfig);

        var result = service.RunCompile($"Views.{viewName}", typeof(T), model);

        return new HtmlActionResult(request, result);
    }
}

Using the extension in a controller:

public IHttpActionResult Index()
{
    var model = new AppModel();
    // Filling the model accordingly.
    return this.View("Index", model);
}

Warning: this does not handle partial views, so this has to be managed additionally.

  2. Replacing ASP.NET MVC with SPAs

The ultimate goal is to have only server-side code running as services (leveraging ASP.NET Web API), and to build your UI with a front-end stack (Angular, React, whichever suits you).

The next step will be removing the HttpBrowserCapabilities dependencies.

Happy coding!