Git Continuous Deployment from VSO to Azure App Service!

Since this week, we can now link our Git repository from Visual Studio Online directly to our Azure App Service!

NB: it was possible to achieve this before of course, but it required going through Kudu directly to create the Web-hooks yourself, which then triggered deployments on the App Service.

The update here is that it is just a couple of clicks away!!!

First, go to your App Service, open the Settings blade, and then select “Continuous Deployment” to get started:

[Screenshot: Continuous Deployment in the Settings blade]

Then, select your VSO project, and then the branch from which you want to deploy:

[Screenshots: selecting the VSO project and branch]

You are all set to have your Nodejs site published straight outta Visual Studio Online!

Once this is done, you can enjoy smooth Continuous Deployment between your Nodejs site and VSO!

NB: It happened to me a couple of times (very few, actually) that the deployment failed; having a look into the Deployment logs gave me this:

npm ERR! enoent ENOENT: no such file or directory, rename 'D:\home\site\wwwroot\node_modules\.staging\wrappy-e5193d35318260dc6219dca3e7bd3562' -> 'D:\home\site\wwwroot\node_modules\bower\node_modules\update-notifier\node_modules\latest-version\node_modules\package-json\node_modules\got\node_modules\duplexify\node_modules\end-of-stream\node_modules\once\node_modules\wrappy'

The bottom line here is that some npm packages were not deployed properly (an issue related to npm itself), so the simplest way to get them installed correctly was to use Kudu to get a console on the App Service and force an npm install again:

[Screenshot: Kudu console on the App Service]
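
Concretely, from the Kudu debug console, that boils down to something like this (the wwwroot path matches the one in the error log above):

cd D:\home\site\wwwroot
npm install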

Why I prefer PaaS over IaaS

A small digression on my techy blog, mostly on a subjective aspect of what I like about Cloud services in general: PaaS.

I am not going to go into a complete explanation of the differences between IaaS and PaaS, so I just chose to reuse a couple of well-known pictures on the subject to highlight my view:

[Images: Azure services overview; the “Pizza as a Service” PaaS analogy]

The point of these 2 pictures is to show that PaaS offers more managed components than IaaS does.

It is true that PaaS, due to the services it offers, costs more than traditional IaaS does…

Price/Cost is a thing for sure, but having your team lose time on infrastructure building/patching/deployment/reboots is, in today’s world, not something you wanna do, for multiple reasons:

  • When your teams do these tedious things, they aren’t producing value; neither for you nor for your customers,
  • Doing these activities is not something that will make your team happy in the long run; better to have it off their plate and let them work on things they like and that will make them learn,
  • Building software these days is more about playing with Legos, and it’s a good thing; building your infrastructure should be the same (of course you still need to know what’s in those boxes, and what their limitations are for what you wanna do with them).

[Image: a wall of Lego bricks]

Happy legoing!

 

Creating an Azure SQL Database using Entity Framework Code First

I have been talking about Azure SQL Database Elastic Pools a few times before, and I am now using them for self-registration scenarios with DB creation on the fly.

I ran into an issue when using this along with Code First to apply my model to a newly created database; the message I kept receiving was “Database does not exists“…

I figured out that when the SQL Database creation command gets executed, it doesn’t mean the database is actually completely created; its status just after the creation command is “Creating”.

This then explains the error I am receiving from the API, and it means I just need to wait a little bit before applying my Code First model 😀

// subscriptionId (string) and managementCertificate (X509Certificate2) are your own values.
var sqlMgmtClient = new SqlManagementClient(
    new CertificateCloudCredentials(subscriptionId, managementCertificate));

// Generating a unique ID for the DB.
var dbId = Guid.NewGuid().ToString("N");

var dbParams = new DatabaseCreateParameters(dbId) { Edition = "Basic" };
var db = await sqlMgmtClient.Databases.CreateAsync(ServerName, dbParams);

// Waiting for the DB to be accessible before updating it.
var wait = true;
while (wait)
{
    var status = await sqlMgmtClient.Databases.GetAsync(ServerName, dbId);
    if (status.Database.State == "Normal") // otherwise it is still "Creating"
        wait = false;
    else
        Thread.Sleep(1000);
}

// Updating DB to latest Code First version.
var connectionString = String.Format(this.dbConnectionTemplate,
    this.serverName, dbId, this.adminName, this.adminPassword);
var migratorConfig = new Configuration
{
    TargetDatabase = new DbConnectionInfo(connectionString, "System.Data.SqlClient")
};
var dbMigrator = new DbMigrator(migratorConfig);
dbMigrator.Update();

Happy coding!

Better Density and Lower Prices for Azure’s SQL Elastic Database Pools

SQL Elastic Database Pools are a great feature for people spinning up databases for self-registration scenarios, as I do on quite a few projects.

With yesterday’s announcement, the Azure team is making it less expensive, with the ability to host more databases under the same Elastic pool.

The biggest change is that Basic & Standard pools jump from 200 to 500 max DBs (a 150% capacity increase!!!).


For more information, please have a look at Scottgu’s blog here 🙂

New SQL Database feature: Elastic Pools

As announced by Scott Guthrie on his blog a few days ago, here is a really interesting feature coming down the road of Azure: Elastic Pools.


What is it?

Basically, it is the capability to bundle up to 200 databases under the same container for a given performance capacity.

What is awesome about this, is that you get a big container of DBs, with a single price.

The scenario addressed by such an offering is to regroup plenty of databases with predefined performance and usage targets. You don’t have to over-provision each DB with more performance capacity than it needs anymore!

For multi-tenancy apps or self-registering platforms, this is a great feature I am looking forward to using soon!

Announcing Windows Server 2016 Containers Preview

This is a great announcement from the Azure team committing to the Docker platform.

Microsoft is now announcing the first preview of Windows Server Containers as part of Windows Server 2016 Technical Preview 3.

Going down this road, MS is committing to providing a native Docker capability in Windows, along with a coming offer on Microsoft Azure to host Docker apps in a simple manner.

Tighter and more interesting collaboration should come out of this, and this is great news for the whole Docker & Azure ecosystems!

More details on the Gu’s blog of course!

Swagger (Swashbuckle) with dotnet: real life implementation – part 2

Following my first article on how to use Swagger with .NET using Swashbuckle, I wanted to cover a last but very important point: dynamic entities.

I am sure that a lot of you are using dynamic objects with document-oriented DBs, which brings the problem of giving them a proper, strongly typed representation in your .NET documentation.

In any of these cases, what I usually do first is to never expose my back-end objects directly through Swagger (which makes real sense with Entity Framework), so I keep full flexibility over which properties get serialized, and how.
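
As an illustration only (the entity and property names below are made up), the idea is to keep the back-end entity internal and expose a dedicated REST model that Swashbuckle documents:

// Hypothetical back-end entity (e.g. an EF class) that never goes over the wire.
public class OrganizationEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string InternalAuditTrail { get; set; } // stays internal
}

// The REST model actually exposed (and documented by Swashbuckle).
public class RestOrganization
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class OrganizationMappings
{
    // The mapping is where you decide what gets serialized, and how.
    public static RestOrganization ToRest(this OrganizationEntity entity)
    {
        return new RestOrganization { Id = entity.Id, Name = entity.Name };
    }
}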

1. Using SwashBuckle hooks

With Swashbuckle we can hook into documentation creation to add or update object properties and set more custom metadata. Have a look here in the current documentation.
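
A minimal sketch of such a hook, assuming Swashbuckle 5.x for Web API (the filter name and description text are mine): a schema filter that tweaks how a given type gets documented, registered from SwaggerConfig.cs.

// Enriches the generated schema for a given model type.
public class MobileEntitySchemaFilter : ISchemaFilter
{
    public void Apply(Schema schema, SchemaRegistry schemaRegistry, Type type)
    {
        if (type == typeof(RestMobileEntity))
        {
            schema.description = "Entity as exposed to mobile clients.";
        }
    }
}

// In SwaggerConfig.cs:
// c.SchemaFilter<MobileEntitySchemaFilter>();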

2. Using string[] and Json.Net objects

What do you do if you have completely dynamic objects that you want to show either as empty objects (on which you can add as many properties as you want) or as an array of strings or properties?


// property shown by Swashbuckle as empty object.
public JObject AppInstProperties { get; set; }

// property shown by Swashbuckle as an array.
public string[] EditableUserInAppProperties { get; set; }

In these cases, you could also replace string[] by YourObject[] 🙂

Happy coding!

Swagger (Swashbuckle) with dotnet: real life implementation – part 1

I have been roaming around the web for a while to try to find real hints to implement proper Swagger documentation on top of Web API, and I didn’t really find what I wanted.

I have since been doing my own investigation along the way, and I found very interesting points on how to (or how not to!) do things with Swagger and dotnet.

1. Setup

To be able to use the method documentation from your assemblies, the first step is to enable the XML documentation output, whether it comes from your main project or from referenced libraries.

To do so, go to your project’s Build settings, and check “XML documentation file” to specify the file name used for the XML output:

[Screenshot: the “XML documentation file” build setting]

Then, in your Swagger project, go to SwaggerConfig.cs in the App_Start folder and use the following line of code to tell Swashbuckle where to get the XML that enriches your method and object definitions:

c.IncludeXmlComments(string.Format(@"{0}\bin\Gibberish.Api.xml", AppDomain.CurrentDomain.BaseDirectory));
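
For context, here is roughly how that line sits in SwaggerConfig.cs (a sketch assuming the Swashbuckle 5.x defaults; the API title is made up):

public class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c =>
            {
                c.SingleApiVersion("v1", "Gibberish.Api");
                // Feed the XML comments produced at build time to Swashbuckle.
                c.IncludeXmlComments(string.Format(@"{0}\bin\Gibberish.Api.xml",
                    AppDomain.CurrentDomain.BaseDirectory));
            })
            .EnableSwaggerUi(c => { });
    }
}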

2. Documenting

To document your entities, refer to this table to see how XML tags are mapped to Swagger properties (coming from the Swagger GitHub home page):

  • Action summary -> Operation.summary
  • Action remarks -> Operation.description
  • Parameter summary -> Parameter.description
  • Type summary -> Schema.description
  • Property summary -> Schema.description (i.e. on a property Schema)

A real example of these, for methods:

/// <summary>
/// Updates the mobile entity
/// </summary>
/// <remarks>
/// entity need to be up to date
/// </remarks>
/// <param name="entity">Parameter description goes here</param>
[HttpPut, Route("update")]
public IHttpActionResult UpdateMobileEntity(RestMobileEntity entity)
{
    return Ok(base.mobileService.Update(entity));
}

And for types:

/// <summary>
/// Mobile entity
/// </summary>
public class RestMobileEntity 
{
    /// <summary>
    /// Entity Id
    /// </summary>
    public int entityId { get; set; }

    /// <summary>
    /// Entity content
    /// </summary>
    public string entityContent { get; set; }
}

That’s a first round for Swagger and dotnet; next, let’s have a look at Swagger with dynamic objects, and at using a custom theme.

Back to basics: Generics in Action<T>!

Even though I have been using generics once in a while, it’s still quite interesting to see all the scenarios you can address with them.

To go back to basics, let’s see an old piece of code:

Hashtable d = new Hashtable();
d.Add("1", new User());
d.Add("2", new Group());

Then later in the code…

// Compiles fine, but d["2"] actually holds a Group: this throws an InvalidCastException.
var user = (User)d["2"];

This compiles just fine, but if you run it… it just crashes.

What Generics bring here is that they force you to type your items (e.g. in the case of collections) to a given object:

var list = new List<User>();

This at least keeps us from casting incompatible types and from mixing object types in the same collection, as a Hashtable allows.
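
For instance, the Hashtable above becomes a Dictionary<string, User>, and the compiler now rejects the mix-up:

var users = new Dictionary<string, User>();
users.Add("1", new User());
// users.Add("2", new Group()); // does not compile: only User instances are accepted
var user = users["1"]; // strongly typed, no cast needed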

But this is just a very small part of what Generics can do.

You have surely seen many methods in LINQ or other parts of the .NET framework that use SomethingOf<T>, which is the point I want to get to.

Here is a small example of what you can do in that matter:

public static T AsOb<T>(this string s) where T : new()
{
    if (String.IsNullOrEmpty(s))
        return new T();
    return JsonConvert.DeserializeObject<T>(s);
}

Which then gets called this way:

var user = strUser.AsOb<User>();

A single line of code, then, to de-serialize any string into any JSON object (quite usual these days).

You can use additional selectors when using Generics on a methods:

  • class: if you want to ensure only reference types (classes) will be used,
  • a specific class: for objects inheriting from that class, letting you use its methods and properties,
  • interfaces: no need to explain 😀
  • finally the “new()” constraint, which ensures the type exposes a public parameterless constructor, so you can instantiate it as in the code above.
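
To make that concrete, here is a small sketch of my own (the IEntity interface and the method are made up) combining several of these constraints:

public interface IEntity
{
    DateTime LastModified { get; set; }
}

public static class EntityFactory
{
    // T must be a class, implement IEntity, and have a public parameterless constructor.
    public static T CreateTracked<T>(ICollection<IEntity> tracker) where T : class, IEntity, new()
    {
        var item = new T();                  // allowed by the new() constraint
        item.LastModified = DateTime.UtcNow; // allowed by the IEntity constraint
        tracker.Add(item);
        return item;
    }
}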

Now let’s do some more interesting things with it

Taking my code above for JSON deserialization, here is something people do alllllllll the time:

var user = JsonConvert.DeserializeObject<User>(myObject.UserAsString);
user.LastModified = DateTime.UtcNow;
myObject.UserAsString = JsonConvert.SerializeObject(user);
// Do something with the updated json (DB update...)

With the small piece of code above, all we really want to change is one line….
What if I just used Action<T>, then???

Action is used here to wrap a method call (usually anonymous) to be performed on a given object.
Wait a minute, you can call a method passed as a parameter? (looks like JS, doesn’t it?)

So I can now redo my method like this:

public static string GetScopedJson<T>(this string s, Action<T> ssAction)
{
    var deserialized = JsonConvert.DeserializeObject<T>(s);
    ssAction(deserialized);
    return JsonConvert.SerializeObject(deserialized);
}

Which then makes my code look like this:

myObject.UserAsString = myObject.UserAsString.GetScopedJson<User>(u =>
{
    u.LastModified = DateTime.UtcNow;
});

This then helps simplify very tedious pieces of code that may have been around for a while 🙂

Another example of code I have been using extensively, leveraging the same principle (for ADO.NET lovers):

public void ExecuteReader(string storedProcName, SqlParameter[] parameters, Action<SqlDataReader> action)
{
    using (SqlConnection connection = new SqlConnection(_conn))
    {
        connection.Open();

        SqlCommand cmd = new SqlCommand(storedProcName);
        cmd.Connection = connection;
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddRange(parameters);

        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            action(reader);
        }
    }
}

This way I can make my calls in ADO.NET pretty damn short:

Organization org = null;
var param = new SqlParameter("@IP", SqlDbType.NVarChar);
param.Value = orgKey;

DatabaseHelper.ExecuteReader(QueryConstants.GetOrganizationByOrgKey,
    new SqlParameter[] { param }, reader =>
    {
        reader.Read();
        org = DatabaseHelper.MapDataToEntity<Organization>(reader);
    }
);

Back to basics: Difference between Func and Action

Generics are a really awesome feature of .NET; but what makes them even better is everything that has been built on top of them.

My preferred example here is Func and Action.

These two are a bit like the Tuple object: a general-purpose way to express a common structure.

In the case of Action<T> and Func<T>, here is what makes them special (see the two-liner after this list):

  • Action<T> is a way to pass a method with parameters (often anonymous) to another method.
  • Func<T> does the same, but also allows a return type, which means your anonymous method can return a computed value back to the method you are calling.
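
A minimal side-by-side, reusing the User type from the previous post:

Action<User> touch = u => u.LastModified = DateTime.UtcNow;       // does something, returns nothing
Func<User, string> toJson = u => JsonConvert.SerializeObject(u);  // does something and returns a value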

A fuller example now, using Azure Table Storage and plain generic constraints:

public static T GetResult<T>(this TableResult result) where T : class
{
    if (result.Result != null)
        return result.Result as T;
    return null;
}

public T GetTenant<T>(string tableName, string pKey, string rKey) where T : TableEntity, new()
{
    var tableClient = _storageAccount.CreateCloudTableClientWithRetries();
    var table = tableClient.GetTableReference(tableName);

    // Create a retrieve operation for the given partition/row keys.
    var retrieveOperation = TableOperation.Retrieve<T>(pKey, rKey);
    var result = table.Execute(retrieveOperation);
    return result.GetResult<T>();
}

And an example of a Func<T> (here a Func<SqlParameter[]>, combined with an Action<SqlDataReader>):

public void ExecuteReader(string storedProcName, Func<SqlParameter[]> parameters, Action<SqlDataReader> action)
{
    using (SqlConnection connection = new SqlConnection(_conn))
    {
        connection.Open();
 
        SqlCommand cmd = new SqlCommand(storedProcName);
        cmd.Connection = connection;
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddRange(parameters());
 
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            action(reader);
        }
    }
}
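
And a usage sketch mirroring the Action-based call from the previous post (same DatabaseHelper and QueryConstants helpers); the SqlParameter array is only built when ExecuteReader invokes the Func:

Organization org = null;

DatabaseHelper.ExecuteReader(QueryConstants.GetOrganizationByOrgKey,
    () => new[] { new SqlParameter("@IP", SqlDbType.NVarChar) { Value = orgKey } },
    reader =>
    {
        reader.Read();
        org = DatabaseHelper.MapDataToEntity<Organization>(reader);
    });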

Happy coding!