[Code Smell] Primitive obsession

One of the most important things to look out for when refactoring effectively is primitive obsession.

What is it?

  • Using primitive types instead of small objects to represent data.
  • Using constants to encode information, such as role = 1.
  • Overusing primitive types can make extending functionality more complicated and introduce more tech debt.

 

Why is this a problem?

Primitive obsession often comes about from programming laziness and shortcutting. If I want to store the title of a book in my class, why not just use a string? It's much easier than creating a new object, right?

As you add more properties, though, the noise builds up and the class becomes larger and harder to read.
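
As a rough illustration (the Book class and its properties are made up for this example), a class built entirely from primitives might look like this:

public class Book
{
    // Every piece of data is a raw primitive, so nothing stops a caller
    // passing an empty title, a malformed ISBN or a negative price.
    public string Title { get; set; }
    public string Isbn { get; set; }
    public decimal Price { get; set; }
    public string CurrencyCode { get; set; }
}

Price and CurrencyCode already hint at a single concept (money) that is being smeared across two unrelated primitives.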

 

How to treat this smell?

When lots of primitive types have crept into a class, one way to resolve this is to group them logically into their own classes. Examples of grouping include functionality, behaviour or domain.

The code becomes more flexible than when it relies on lots of primitive types, and it is easier to understand and organise. It also makes duplicated code easier to find, because related values are grouped into the same area of the code base.
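
Here is a minimal sketch of what that grouping might look like (the Isbn and Money types are purely illustrative, not a prescribed design); each small class owns its related primitives and their validation:

using System;

// Groups the ISBN string with its own validation rules.
public sealed class Isbn
{
    public string Value { get; }

    public Isbn(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
            throw new ArgumentException("ISBN cannot be empty.", nameof(value));
        Value = value.Replace("-", "");
    }
}

// Amount and currency always travel together, so they live in one small type.
public sealed class Money
{
    public decimal Amount { get; }
    public string CurrencyCode { get; }

    public Money(decimal amount, string currencyCode)
    {
        if (amount < 0)
            throw new ArgumentOutOfRangeException(nameof(amount));
        Amount = amount;
        CurrencyCode = currencyCode;
    }
}

// The book now reads in terms of domain concepts rather than loose strings and decimals.
public sealed class Book
{
    public string Title { get; }
    public Isbn Isbn { get; }
    public Money Price { get; }

    public Book(string title, Isbn isbn, Money price)
    {
        Title = title;
        Isbn = isbn;
        Price = price;
    }
}

Any duplicated ISBN or money handling now has an obvious home, which is what makes the duplication easier to spot.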

Tiered Compilation with .NET Core 2.1

With all the interesting performance work the team at Microsoft has been undertaking, I would like to highlight a new preview feature they have released: tiered compilation.

Here is a brief summary of what they have talked about; you can read more in their article, linked below.

Compilation with the .NET Framework

  • Historically, compilation has involved trade-offs: pre-jitting optimises code for steady-state performance, but it is slow and hurts initial start-up time.
  • An alternative such as an econoJIT approach starts fast, but the code quality suffers.
  • .NET takes a balanced approach that does a reasonable job for both start-up and steady-state performance.

Tiered compilation allows .NET to have multiple compilations of the same code that can be hot swapped, so it can pick the best technique for start-up and the best for steady-state performance.

The benefits of this are:

  • Faster application start-up time
    • Tiered compilation asks the JIT to compile quickly and optimise later if needed.
  • Faster steady-state performance
    • Tiered compilation asks the JIT to create optimised code on a background thread, which replaces the pre-compiled version.

 

To try this:

  • If you build the application yourself using the .NET Core 2.1 SDK, add the MSBuild property <TieredCompilation>true</TieredCompilation> to the default property group in your project file. For example:
<Project Sdk="Microsoft.NET.Sdk">
    <PropertyGroup>
      <OutputType>Exe</OutputType>
      <TargetFramework>netcoreapp2.1</TargetFramework>
      <TieredCompilation>true</TieredCompilation>
    </PropertyGroup>
</Project>
  • If you run an application that has already been built, edit runtimeconfig.json to add System.Runtime.TieredCompilation=true to the configProperties. For example:
  {
    "runtimeOptions": {
      "configProperties": {
        "System.Runtime.TieredCompilation": true
      }
    },
    "framework": {
      ...
    }
  }
  • If you run an application and don’t want to modify any files, set the environment variable
COMPlus_TieredCompilation=1
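
If you want to double-check that the variable is actually visible to your process, a quick sketch like the following will do (it only covers the environment-variable route above, not the MSBuild property or runtimeconfig.json options):

using System;

class TieredCompilationCheck
{
    static void Main()
    {
        // Reads the same environment variable mentioned above and reports
        // whether it is set for the current process.
        var value = Environment.GetEnvironmentVariable("COMPlus_TieredCompilation") ?? "(not set)";
        Console.WriteLine("COMPlus_TieredCompilation = " + value);
    }
}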

 

Reference: https://blogs.msdn.microsoft.com/dotnet/2018/08/02/tiered-compilation-preview-in-net-core-2-1/

Is SEO still relevant to acquisitions?

With the ever-evolving algorithms Google uses to interpret and classify websites on the web, recent years have seen a shift in the overall tactics once deployed by search engine optimization specialists.


Once-common tactics, such as aggressively building up landing pages and keyword-stuffing them in an attempt to boost rankings, are now seen as outdated. This became more evident with Google's recent Penguin release, after which such tactics caused many once-established, high-ranking sites to lose rankings on various keywords and suffer the associated search engine penalties.


Other tactics that have fallen foul of the search engines include participating in large link-building campaigns, where links, paid or otherwise, are aggressively sought in bulk, either by buying placements on other websites or by spamming digital link directories.
The aim now is to avoid such scattergun link-building campaigns and to shift towards building fewer but higher-quality links that are relevant and carry more suitable and diverse anchor text.

 

The obsession with that number 1 spot
Given the previous understanding of user behaviour and how to acquire new users for your website, there was always an obsession with obtaining the number 1 spot. It is still sensible to aim as high as possible, and at the very least to remain on the first page for any given search term. However, the understanding of how users interact with search results has changed a lot.

With most search engines, and especially Google, revenue is based on the amount of ads the company is able to sell, and so ads have become more prevalent at the top of search results. Users, in turn, are more aware of the advertisements occupying the top of the results and have adapted their behaviour, understanding that they may have to scroll further down the result set to find the results that are most relevant to them.


The main tactic for increasing click-through rate is better use of the page titles and descriptions displayed in the search results. These have been shown to have a greater impact on whether links are clicked when they are tailored to each page with meaningful titles and descriptions.
Creating a wide range of landing pages and keywords that are well targeted to long-tail searches is a good move for capturing rankings for more defined niche search traffic. Focusing on better-quality content pages with deep dives into topics helps bring in the many small-volume keywords, which cumulatively provide a much greater range of search results. Gone are the days of creating lots of landing pages with thin content; Google in particular is now removing such pages and penalising sites that use this kind of tactic.


Any change made to a website requires long-term planning and commitment, as it generally takes a long time for improvements in search rankings to materialise. The usual timescale for such improvements is two to six months, although this may be faster depending on your site's value and the terms being sought.