Tiered Compilation with .NET Core 2.1

With all the interesting performance work the team at Microsoft has been undertaking, I would like to highlight a new preview feature they have released for .NET Core 2.1: tiered compilation.

Here is a brief summary of what they have talked about; you can read more in their main article linked below.

Compilation with the .NET Framework

  • Historically, compilation has involved tradeoffs: techniques such as pre-jitting optimize code for steady-state performance, but the higher-quality compilation is slow and affects initial startup time.
  • Alternative methods, such as an econo-JIT approach, start fast but the quality of the generated code suffers.
  • .NET has traditionally taken a balanced approach in between, doing a reasonable job for both startup and steady-state performance.

Tiered compilation allows .NET to keep multiple compilations of the same code and hot swap between them, so the runtime can pick the technique that is best for startup and the one that is best for steady-state performance (a quick timing sketch that illustrates the warm-up effect follows the list below).

The benefits of this are:

  • Faster application startup time
    • Tiered compilation asks the JIT to compile quickly at first, and to optimize later only if needed.
  • Faster steady-state performance
    • Tiered compilation asks the JIT to create optimized code on a background thread, which then replaces the initially compiled (or pre-compiled) version.
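To get a feel for why startup and steady-state differ, a rough way to observe JIT warm-up is to time the first call to a method against subsequent calls. This is a minimal sketch of my own rather than anything from the linked article; the workload and iteration counts are arbitrary.

using System;
using System.Diagnostics;

class WarmupDemo
{
    // A small amount of work so the JIT has something to compile.
    static double Work(int n)
    {
        double total = 0;
        for (int i = 1; i <= n; i++)
            total += Math.Sqrt(i);
        return total;
    }

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        Work(1_000);                      // first call includes JIT compilation time
        Console.WriteLine($"First call: {sw.Elapsed.TotalMilliseconds:F3} ms");

        sw.Restart();
        for (int i = 0; i < 1_000; i++)   // later calls reuse the already-compiled code
            Work(1_000);
        Console.WriteLine($"Warm calls: {sw.Elapsed.TotalMilliseconds / 1_000:F3} ms average");
    }
}

With tiered compilation enabled, that first call gets cheap, quickly generated code, and a fully optimized version is swapped in by a background thread once the method is seen to be hot.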

 

To try this:

  • If you build the application yourself using the .NET Core 2.1 SDK – add the MSBuild property <TieredCompilation>true</TieredCompilation> to the default property group in your project file. For example:
<Project Sdk="Microsoft.NET.Sdk">
    <PropertyGroup>
      <OutputType>Exe</OutputType>
      <TargetFramework>netcoreapp2.1</TargetFramework>
      <TieredCompilation>true</TieredCompilation>
    </PropertyGroup>
</Project>
  • If you run an application that has already been built, edit runtimeconfig.json to add System.Runtime.TieredCompilation=true to the configProperties. For example:
  {
    "runtimeOptions": {
      "configProperties": {
        "System.Runtime.TieredCompilation": true
      }
    },
    "framework": {
      ...
    }
  }
  • If you run an application and don’t want to modify any files, set the environment variable
COMPlus_TieredCompilation=1
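
As a quick sanity check (my own addition, not something from the referenced post), you can read the variable back inside the application to confirm the process actually received it:

using System;

class TieredCompilationCheck
{
    static void Main()
    {
        // COMPlus_TieredCompilation=1 must be set before the process starts
        // for the runtime to honour it.
        string value = Environment.GetEnvironmentVariable("COMPlus_TieredCompilation");
        Console.WriteLine(value == "1"
            ? "COMPlus_TieredCompilation is set to 1."
            : "COMPlus_TieredCompilation is not set (or not set to 1).");
    }
}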

 

Reference: https://blogs.msdn.microsoft.com/dotnet/2018/08/02/tiered-compilation-preview-in-net-core-2-1/

Is SEO still relevant to acquisitions?

With the ever-evolving algorithmic dynamics of Google, and how it tries to interpret and classify the websites available on the web, recent years have seen a shift in the overall tactics once deployed by search engine optimization specialists.


Once-common tactics, such as aggressively building up landing pages and keyword-stuffing pages in an attempt to boost rankings, are now seen as outdated. This became more evident with Google's Penguin release: such tactics have caused many once-established, high-ranking sites to lose rankings on various keywords and suffer the associated search engine penalties.


Other tactics that have fallen foul of the search engines include participating in large link-building campaigns, where links, paid or otherwise, are aggressively sought in bulk, either by buying placements on other websites or by spamming digital link directories.
The aim now is to avoid such scattergun link-building campaigns and to shift towards building fewer but higher-quality links that are relevant and carry more suitable and diverse anchor text.

 

The obsession with that number 1 spot
Given previous understanding of user-driven behavior and how new users are acquired for a website, there was always an obsession with obtaining the number 1 spot. It is still worth aiming as high as possible, and at the very least remaining on the first page for any given search term. However, the understanding of how users behave and interact with search results has changed a lot.

Most search engines, and Google especially, earn their revenue from the amount of ads they are able to sell, and so ads have become more prevalent at the top of search results. This has in turn shifted user behavior: people are more aware of the advertisements at the top of the results and have adapted, understanding that they may have to scroll further down the result set to locate the results that are most relevant to them.


The main tactic for increasing click-through rate is better use of each page's title and description, which are displayed in the search result. These have been shown to have a greater impact on whether links are clicked and selected when they are tailored to each page with meaningful titles and descriptions.
Creating a wide range of landing pages and keywords that are well targeted to long-tail searches is a good move for capturing rankings for more narrowly defined niche search traffic. Focusing on higher-quality content pages with deep dives into topics helps to bring in the many small-percentage keywords, which cumulatively cover a greater range of search results. Gone are the days of creating lots of landing pages with thin content, as Google in particular now removes such pages and penalizes sites that use this kind of tactic.


Any change made to a website requires long-term planning and commitment, as improvements in search rankings generally take a long time to materialize. The usual timescale for such improvements is around 2 to 6 months, though this may be faster depending on your site's value and the terms being sought after.

Using the Priority Queue Pattern with our Microsoft Azure Solution

In a recent programming conundrum, I was trying to find a way to promote certain customer queries to the top of the support list ahead of those of a general nature. A limitation of Azure Service Bus is the inability to add priority headers to messages being placed into the queue, so that was a non-starter.

Queues work on a FIFO (first in, first out) basis and allow developers to decouple the components that add items to the queue from those that process them.

The queue was in general great for processing items asynchronously, however with recent updates to our SLA requirements we were encountering issues where non-urgent requests were creating a bottleneck in our system.

The solution we decided to implement was the Priority Queue pattern on Microsoft Azure, using a few message queues and multiple consumer instances against those queues.

 

The plan was to identify the priority of the data on the producer side and, based on that, use a message router to send each message to the corresponding queue. In our example we used three priority levels (High, Medium and Low).
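
To illustrate the routing side, here is a minimal sketch using the Azure.Messaging.ServiceBus client library. The queue names, the priority enum and the connection-string handling are assumptions of mine for the example, not the actual names from our system.

using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

enum Priority { High, Medium, Low }   // hypothetical priority levels

class PriorityRouter
{
    // Hypothetical queue names - one Service Bus queue per priority level.
    static string QueueFor(Priority p) => p switch
    {
        Priority.High   => "support-high",
        Priority.Medium => "support-medium",
        _               => "support-low"
    };

    static async Task Main()
    {
        // Assumed: connection string supplied via an environment variable.
        string connectionString = Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION");

        await using var client = new ServiceBusClient(connectionString);

        // Route a message to the queue that matches its priority.
        Priority priority = Priority.High;
        ServiceBusSender sender = client.CreateSender(QueueFor(priority));
        await sender.SendMessageAsync(new ServiceBusMessage("Customer query payload"));
    }
}

The important part is simply that each priority level maps to its own queue; everything else about sending is standard Service Bus usage.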

In essence each queue functions as normal, with producers adding messages and consumers peeking and deleting them as they arrive. The difference, and where the Priority Queue pattern comes into play, is the number of consumer instances subscribed to each queue. On the high-priority queue we had 5 instances competing to consume messages, on the medium we had 3, and on the low we had 1. The result is that the high-priority queue can handle many more requests, and handle them faster, than the other queues, and can therefore provide a far better SLA time and meet expectations.
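
To make the consumer-count idea concrete, here is a small in-process sketch (not our production code, and using in-memory queues rather than Service Bus) where each priority level gets a different number of competing workers draining its queue:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

class CompetingConsumersDemo
{
    // Start 'workerCount' competing consumers against a single queue.
    static Task[] StartConsumers(string label, BlockingCollection<string> queue, int workerCount)
    {
        var workers = new Task[workerCount];
        for (int i = 0; i < workerCount; i++)
        {
            int id = i;
            workers[i] = Task.Run(async () =>
            {
                foreach (var message in queue.GetConsumingEnumerable())
                {
                    await Task.Delay(100);   // simulate processing time
                    Console.WriteLine($"{label} worker {id} processed: {message}");
                }
            });
        }
        return workers;
    }

    static async Task Main()
    {
        var high   = new BlockingCollection<string>();
        var medium = new BlockingCollection<string>();
        var low    = new BlockingCollection<string>();

        // More workers on the high-priority queue means it drains much faster.
        var tasks = new List<Task>();
        tasks.AddRange(StartConsumers("High",   high,   5));
        tasks.AddRange(StartConsumers("Medium", medium, 3));
        tasks.AddRange(StartConsumers("Low",    low,    1));

        for (int i = 0; i < 10; i++)
        {
            high.Add($"high #{i}");
            medium.Add($"medium #{i}");
            low.Add($"low #{i}");
        }

        high.CompleteAdding();
        medium.CompleteAdding();
        low.CompleteAdding();

        await Task.WhenAll(tasks);
    }
}

Because five workers drain the high queue while only one drains the low queue, the high-priority items finish far sooner for the same per-item cost, which is the effect we relied on to hit the tighter SLA.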

 

[Diagram: priority queue pattern using a separate message queue for each priority level]

 

For more information you can read https://docs.microsoft.com/en-us/azure/architecture/patterns/priority-queue

You can also find the implementation example on GitHub.