Mixed Routes with WebApi, OData and Owin

We are currently looking to kick off a project that involves a lot of displaying information with search, sort and filter functionality. As part of this we are looking to use OData to give us a lot of "bang for our buck" with regards to the query syntax and its great out-of-the-box functions for things like paging.

One of the requirements, though, is that we can still use the out-of-the-box WebApi controllers, mixing and matching based on the needs of the data being returned (some of it just doesn't make sense to send via OData). So along with a lot of the learning around OData, I needed to figure out a way of getting it all to play nicely with our Owin-based WebApi project.

The solution I came up with can be found in the gist below or here.

The main elements were derived from watching a great Pluralsight video by Kevin Dockx.

Points of note are the Startup.cs file and, in particular, the ordering of the routing declarations, with config.MapHttpAttributeRoutes(); needing to come before the OData configuration calls.

The controllers then use routing attributes, as I feel this gives a more readable approach and makes it easier to track down which URI actually points to which action.
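For illustration, a plain WebApi controller using attribute routing might look like this (the controller and route names here are hypothetical, not the actual project code):

```csharp
// GET api/profile resolves via the RoutePrefix on the class
// combined with the Route attribute on the action.
[RoutePrefix("api/profile")]
public class ProfileController : ApiController
{
    [Route(""), HttpGet]
    public IHttpActionResult Get()
    {
        // User.Identity is populated by the authentication middleware.
        return Ok(new { Name = User.Identity.Name });
    }
}
```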

The API controller uses the RoutePrefix attribute rather than a convention in the routing parameters; again, I just find this easier and cleaner when trying to understand what the URI scheme for the controller is. The OData controller also uses attributes, but the prefix is set in the Startup.cs class:

                config.MapHttpAttributeRoutes();

                config.MapODataServiceRoute(
                    routeName: "ODataRoute",
                    routePrefix: "api/odata",
                    model: builder.GetEdmModel());

Note the "api/odata" value. The controller then takes on the matching of the action to the EntitySet, so the following:

[EnableQuery, ODataRoute("Sales")]
public IHttpActionResult Get()
{
    return Ok(_context.PrismSaleSummaryView);
}

Gives us a mapping to the "Sales" EntitySet, which is defined in the OData configuration section of the Startup.cs file.

ODataModelBuilder builder = new ODataConventionModelBuilder();
builder.EntitySet<PrismSaleSummaryView>("Sales"); // entity type name assumed from the context property above

So, in summary, the configuration described above allows us to call the API in two distinct ways.

1: An OData call to the sales endpoint returning a page of 10 records, e.g. GET api/odata/Sales?$top=10


2: A WebApi call to get the current logged-in user's profile







These helped and are good reads for getting to grips with this.



Array Find in Angular 2 with TypeScript

This post is more of a quick note as much as anything. A little side project I'm currently working on required a little juggling in JavaScript. The problem was I wanted a user to be able to add items to an array. The items were indexed by size. If they then added additional items of the same size I wanted a count to be incremented.

After a few different approaches, including some multi-dimensional arrays, I settled on a simple "find" in JavaScript. The basic principle is that the function finds the array item and then increments its count. A Plunker with the end result can be seen below.
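The core of the approach can be sketched like this (the Item shape and addItem name are my own for illustration; the Plunker has the full version):

```typescript
// Each entry tracks a size and how many items of that size have been added.
interface Item {
  size: string;
  count: number;
}

// If an entry with the same size already exists, increment its count;
// otherwise push a fresh entry with a count of 1.
function addItem(items: Item[], size: string): void {
  const existing = items.find(item => item.size === size);
  if (existing) {
    existing.count++;
  } else {
    items.push({ size, count: 1 });
  }
}
```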

Please bear in mind this was for a specific project and browser compatibility was not a consideration. find is not supported by all browsers and requires a polyfill. This is described in the Mozilla document below.
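If older browsers do matter, a helper along the same lines can stand in for find. This is a simplified sketch of the idea, not the full MDN polyfill:

```typescript
// Simplified stand-in for Array.prototype.find: returns the first element
// matching the predicate, or undefined if none match.
function findFirst<T>(items: T[], predicate: (item: T) => boolean): T | undefined {
  for (let i = 0; i < items.length; i++) {
    if (predicate(items[i])) {
      return items[i];
    }
  }
  return undefined;
}
```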




Seq, Azure and Serilog

seq banner

As part of our application development we are becoming more and more reliant on logging to help us figure out what's really going on with our applications. A really nice framework for doing this is Serilog, an amazing .NET logger for producing structured logs. I won't go into Serilog as part of this post, but if you are interested in finding out more I highly recommend the Pluralsight course http://app.pluralsight.com/courses/modern-structured-logging-serilog-seq

Serilog works even better with Seq. Seq is a fantastic front-end portal that sits over structured logging and makes it really easy to filter out noise and get insights into what's been going on under the hood. It's even integrated with the new Microsoft.Extensions.Logging.
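Wiring an application up to a Seq server is then a couple of lines of Serilog configuration. A minimal sketch (the server URL is a placeholder for the DNS name we give the Azure machine later in this post):

```csharp
// Requires the Serilog and Serilog.Sinks.Seq NuGet packages.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.Seq("http://my-seq-server.northeurope.cloudapp.azure.com")
    .CreateLogger();

// Properties like {OrderId} are captured as structured data, not just text.
Log.Information("Processed order {OrderId} in {Elapsed} ms", 1234, 56);
```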

The main point of this post is to highlight how I got our logging up and running with some spare MSDN credits we had knocking about. The tutorial on the Seq site was great but a little outdated.

You can find the original here: http://docs.getseq.net/docs/azure-installation

The following will be more of a step-by-step guide. In production you may want to make some of the security a little harder. This is a development log server and won't hold any sensitive information, but with a fire-hosed approach to logging you could easily leak sensitive data, so a production logging server would need to be secured. I'm sure people with more battle sense than myself will point out better approaches to this.

OK, so, first steps. Log in to your Azure portal: https://portal.azure.com. A great part of the Microsoft Developer Network benefits are Azure credits; these are what I'll be using for our server here.

Seq Server - Step 1

In the search bar start to type "windows server" and you should see the 2012 R2 Datacenter option appear. Select that.

Seq Server - Step 2

Select that edition from the list of returned results and you will be presented with another "blade". (Blades are a key part of the new Azure portal and took me a while to get used to, but they grow on you.)

In the new blade leave the drop-down selected as "Resource Manager", then hit the Create button.

Seq Server - Step 3

You will now be presented with a wizard that guides you through the steps to configure the new virtual machine instance.

Seq Server - Step 4

Set up the name, username and password, and also the subscription and resource group. For this example I've simply left that as the Default-Storage for North Europe.

Next you will be presented with a size option. In this example I selected A1, as it's a pretty cheap option and for logging you don't necessarily need SSD and multicore. We have a machine set up on the D1 plan with SSD and it's a little faster, but not drastically so. The choice is yours.

Seq Server - Step 5

In the next set of settings I've left a lot of the defaults, because this machine is purely for dev/test logging and doesn't need to be part of any other virtual networks or work with any existing infrastructure. The settings that were changed are as follows:

On the Public IP Address blade I changed the address to be static and gave it a name that will be used later: seq-logger1, which is simple enough for our needs.

Seq Server - Step 6

On the Network Security Group I created a new group and opened port 80 to allow HTTP traffic to the site.  Make sure you don’t conflict with the priorities of the rules or you will get an error.

Seq Server - Step 7

You will finally come to a summary page that shows you the details of all the settings you have configured. Once you click OK it will take a little while to deploy.

Once the machine is deployed you will be able to see it in the virtual machines list.  Click on it and use the connect icon to download a remote desktop session connector.

Seq Server - Step 8

We now want to set a DNS name so we can bind to it with Seq. Go to the Public IP address section and click on the Configuration blade to open the settings. Enter a name in the DNS section and make a note of it.

Seq Server - Step 12


Let's get on the box now.

Open the downloaded RDP session and enter the credentials you used when creating the machine. Make sure you set the machine name or IP of the machine as the domain of the user, otherwise you could have issues with domain conflicts, user-not-found errors or passwords failing.

Seq Server - Step 10

You will receive a notification about certificates and remote access; this is fine. Click "Don't ask me again" and then Yes.

Seq Server - Step 11

Once logged in we will need to turn off the IE Enhanced Security Configuration so we can go out and get Seq. This can be done by selecting Local Server in the Server Manager console that appears on first load. From there just select Off for Administrators.

Seq Server - Step 13

We also need to open up the machine's firewall. We have exposed the machine's port via Azure, but we still need to go to Windows Firewall > Add inbound rule and add port 80, and maybe 443 if you are planning to use a secure connection.

Seq Server - Step 19

Next load up Internet Explorer and go to http://getseq.net, then download the latest MSI, which was 3.1.17 at the time of writing.

Seq Server - Step 14

From here run the installer, agreeing to the terms, and then follow through with the default options, taking note of the location of the files.

Seq Server - Step 15

On the final screen untick the Install Seq service option; we don't want to do this just yet. If you wish to use the default port of 5341 then feel free, but you will need to open up the appropriate firewall rules and endpoints. As this server is purely for Seq, we are going to stick with listening on port 80.

Seq Server - Step 16

Go to the location of Seq in Windows Explorer and create a new folder called data.

Seq Server - Step 17

Then click on the Seq.Administration tool, which will run the wizard for setting up the service. This could also be done from the command line for those who like that sort of thing. You can read more about it here: http://docs.getseq.net/v3/docs/storage-paths-and-urls

Seq Server - Step 18

Once the install is complete you will be able to see a green Browse Seq button (as long as all went well). Clicking this will bring up a browser and take you to the events screen.

Seq Server - Step 20

A few things to note are the obvious security concerns of mass-logging application data, particularly as we haven't used SSL. The other aspect is Seq's security model, which means you would have to pay for a licence to be able to lock Seq down on a per-user basis. Seq is an amazing product and I highly recommend doing this, but be careful: whilst you are on the trial version, anybody with access to the machine can view Seq.

In a future post I will show you how to wire up MVC 6 so that you get full debugging to a Seq server.

Any constructive comments, points or criticisms would be greatly appreciated.


Building a Dashboard

Modern web portals as Line Of Business Apps

Currently we are embarking on some interesting development projects which seem ideal for a series of blog posts. The posts will highlight paths we are going down, decisions we have made and why, and any sidesteps we have had to take.

Hopefully the series will prove interesting for others and also act as a stage for discussions about the decisions we have made and the strategies we have used to achieve our goals.

At work we are currently working on a simple little portal site. The site is a work in progress that will form the backbone of a more sophisticated portal as we progress with the system. The ultimate goal is to harness the power of microservices and 3rd-party services and wrap them in a modern web portal, giving users insight across all of the applications without having to leave the browser, with ultimately one place to go for all business information.

Currently the source code resides in an internal source code repository (Mercurial, for those that are interested) but hopefully at some point we will have a publicly exposed version for others to look at. The initial plan is to put some of the more interesting code out as gists so others can see and touch it, then we will hopefully release projects under an open source licence once we can.

Phase 1

As we are pushed for both time and resources, the decision was made to go with an Agile-style release-early, release-often schedule. This has meant that a number of architectural compromises have been made, at least for the initial release.


The project will ultimately see single sign-on delivered via Identity Server, with access controlled across numerous microservices using JWTs and the OpenID/OAuth methods. Due to the learning curve, and the time needed to get all the servers up and running to handle OAuth in a Windows authentication environment, we have decided to leave this out of the first release. We also toyed with using ADAL, but this didn't fit with the company's internal policies. So the decision was made to handle the initial client and API on a single server, with WebApi utilising the inbuilt IIS Windows authentication functionality.

This means authentication is as simple as decorating our controllers with the Authorize attribute. We know this will cause problems down the line, but to speed up the release and get things moving we will live with it for now.
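As a sketch of what that means in practice (the controller name here is hypothetical):

```csharp
// With IIS Windows authentication enabled, [Authorize] is all that's needed;
// IIS performs the authentication handshake before the request reaches the action.
[Authorize]
public class DashboardController : ApiController
{
    public IHttpActionResult Get()
    {
        // Returns the authenticated Windows account name, e.g. DOMAIN\user.
        return Ok(User.Identity.Name);
    }
}
```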

Angular 2

One of the key drivers was to try and hit the early-adopter end of the curve, so we aren't out of date before the application is released. However, due to the learning curve, as well as the availability of some of our chosen core libraries, we have stuck with Angular 1. We are keeping an eye on Angular 2 and will look to make the move to NG2 and TypeScript as soon as we feel it's appropriate.

Front End Framework

Choosing a front-end framework that would let us move fast was one of the key considerations. Bootstrap seems the default framework for responsive websites. We've already used it for a couple of internal projects and are more than comfortable with it, but it's always felt a little lacking in functionality; having to grab libraries from all over GitHub to piece together a fully functioning portal was a bit painful. So we looked at the Angular Material library, which seems to have much more built-in functionality, and along with the strong guidelines of Material design we thought it was a definite front runner. The other library we considered was Microsoft's latest offering, Fabric UI. This would have fitted in nicely with our other internal apps such as CRM, 365 and the Office suite. The main concerns were that it's still relatively immature compared to Bootstrap and Material, and the Angular library based on it was very limited when we initially looked into things. We toyed with the idea of becoming contributors and helping build the library, but with tight timescales and limited resources we had to pass (for the time being at least).

So, based on the number of controls, the functionality, and the quality of support and documentation, we chose Angular Material.


Tooling will be a mix of Visual Studio 2015, more for the server side of things, and VS Code for the front end. Visual Studio is still the out-and-out best IDE for developing Microsoft applications, but for lightweight front-end development I'm rapidly falling in love with VS Code. It is a productivity dream, and once you get the right snippets from the marketplace it really shines for rapid development.

Gulp will be used to automate some of the front-end tasks such as Sass compilation, dependency injection for Bower libraries and, ultimately, minification, transpilation etc. The core of the scripts will be simplified versions from the John Papa Pluralsight course, some great examples of which can be found on GitHub. We have also taken heavily from the Angular Clean Code course and John's rather brilliant Angular style guide.

Server Tech

.NET Core is a massive change in development, so we were hesitant at first to jump down the rabbit hole and run with something that is still in transition and has a fair number of unknowns. But after the recent RC2 announcement, looking at the roadmap, and the fact that it's just lovely to develop with in Visual Studio with the new JSON config files and wwwroot folder, we decided to go ahead and bite the bullet.

I think this is enough rambling and info for now. I'll try and keep the posts short and specific as we come up against problems, address issues and progress with the project.