Microsoft Code Analysis – An introduction

A tool I’ve been keen to dive into for quite some time is Microsoft’s managed Code Analysis (CA), which comes integrated in Visual Studio. Its main purpose is to improve code quality by pointing out (by default) a list of rule violations according to the .NET Framework Design Guidelines. It might surprise you that, from what I can tell on MSDN, the tool dates back to VS2005; I’ll be taking a look at the VS2015 version.

The CA tool’s purpose is to point out potential problems in your design (such as using properties rather than fields, using generics where appropriate, or excessive method parameters), coding style (such as naming conventions and removing the redundant ‘this’ qualifier on class members), security-related issues (such as SQL injection) and even compiler warnings. This is great because all issues flow through the same pipeline, which is useful for reporting the results after a build. It’s a bit like StyleCop and FxCop rolled into one, although unlike those tools, CA does offer solutions in some cases.

Overview

You might not be aware that CA is already available in your solution and just needs to be enabled. If you view the properties of one of the projects in your solution you’ll see a tab called “Code Analysis”. By default all of Microsoft’s rules are enabled and, at the time of this writing, there are 655. Going through this list might take a little time. Luckily there’s a list of default rule sets to choose from, each with a different focus.

ca-projectproperties

So what are rules, categories and rule sets? At the lowest level, what you’ll be looking at is a Rule, which is (at this point) specific to C# and targets specific issues and conditions in code. Rules can only belong to one category (e.g. Security or Style) and each has a unique identifier. A rule set is a list of categories (or groupings), with each category containing a list of rules. Microsoft provides 10 out-of-the-box rule sets, each with some preselected rules from each category enabled by default. You can also create your own (which I’ll cover a bit further down).

Because I’m just starting out, I’ll choose an existing rule set with a preconfigured list of rules in each category. In the screenshot below I’ve gone for the Basic Correctness rule set.

ca-sensdefaults

And this rule set enables the appropriate rules which will run for this project.

ca-sensdefaultsrules

A Solution wide rule set

The example above demonstrates having a per-project rule set. Perhaps your business logic could use the “Design” category of rules applied more heavily than the presentation layer, where the code semantics are different. In other cases it makes sense to have a solution-wide rule set with rules more focused on naming conventions and maintainability metrics.

By right-clicking on your solution, selecting properties and then “Code Analysis Settings”, you can enable rule sets at the solution level.

ca-solutionproperties

Custom Rule set

The default rule sets provided are a good start; however, not many of us work in silos, so you’ll want to discuss a set of rules which make sense for your team and the product you’re developing. You cannot modify the built-in rule sets but you can create your own. Doing this isn’t the most intuitive process.

Right click on any project and choose Properties. Then click the Code Analysis tab and, from the drop down list “Run this rule set”, select “Choose multiple rule sets”.

ca-createnewruleset-1

You’ll then be presented with a list. From this list you can choose one or more rule sets to base your own rule set on, and if you choose more than one, it will combine the category selection of rules from each rule set into one. In this example I’ve chosen three rule sets. Once I click Save As, I can choose a location for the new ruleset file. You can either save this at the solution or project level, whichever makes more sense for its usage.

ca-createnewruleset-2

Once the rules are saved, you’ll go back to the project properties and from there, if you click Open, you can customise the rule set once more.
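For reference, the saved file is plain XML, so you can also edit it by hand or review it in source control. A minimal sketch of what a .ruleset file can contain (the name, description and the particular rules chosen here are illustrative, not taken from the rule sets above):

```xml
<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Team Rules" Description="Rules agreed by our team" ToolsVersion="14.0">
  <!-- Pull in an existing rule set as a baseline -->
  <Include Path="basiccorrectnessrules.ruleset" Action="Default" />
  <Rules AnalyzerId="Microsoft.Analyzers.ManagedCodeAnalysis"
         RuleNamespace="Microsoft.Rules.Managed">
    <!-- Escalate one rule to an error and switch another off -->
    <Rule Id="CA2202" Action="Error" />
    <Rule Id="CA1014" Action="None" />
  </Rules>
</RuleSet>
```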

Executing Rule sets

There are two ways rules can be evaluated, both at build time: within Visual Studio, or at the command line, which makes more sense for your continuous integration server.

Within Visual Studio

Depending on the number of rules in each rule set, having them run each time a solution is compiled, especially a large one, does slow down your development cycle, particularly if you’re running unit tests frequently. In fact, I enabled this on 3 of the 34 projects in the open source project nopCommerce and it added roughly 5 seconds to the build time. Your timings may vary, so don’t let that turn you off.

You can find this option in project properties. If you have lots of projects, you’ll unfortunately need to enable it manually for each one.

ca-runonbuild

After build, any rule violations will show up as warnings.

ca-ruleviolationlisting

If you want these warnings to be treated as errors, you can change this in project properties under Project Properties > Build > “Treat warnings as errors”.
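If you prefer editing the project file directly, a sketch of the equivalent MSBuild fragment is below. The CodeAnalysisTreatWarningsAsErrors property scopes the escalation to Code Analysis warnings only, rather than all compiler warnings:

```xml
<PropertyGroup>
  <!-- Run Code Analysis on every build -->
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <!-- Fail the build on CA rule violations without touching ordinary compiler warnings -->
  <CodeAnalysisTreatWarningsAsErrors>true</CodeAnalysisTreatWarningsAsErrors>
</PropertyGroup>
```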

At build time

Almost always you’ll want these rules executed in your build pipeline. Perhaps not on every build: you could limit the analysis to every second commit, or perhaps a job that runs once a day or week and produces a report. Spewing out a report on each commit will almost guarantee that the information will be ignored by developers.

Speaking of reports, wouldn’t it be nice if the results from Code Analysis were written to a file which could then be transformed by XSLT? Indeed. Adding the parameter CodeAnalysisLogFile=true will write the contents of the analysis for each project to the project directory. I have not determined how to create a consolidated output.
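As a sketch, a CI invocation combining the properties above might look like the following (the solution name is a placeholder, and you should verify the exact switches against your MSBuild version):

```shell
# Run Code Analysis as part of the build and emit per-project log files
msbuild MySolution.sln /p:RunCodeAnalysis=true /p:CodeAnalysisLogFile=true
```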

Ignoring rules aka Suppress Messages

There will be times when a rule doesn’t apply to a section of code or even an assembly, yet you would still like a solution-wide rule set to maintain a consistent codebase. In these cases you can suppress (ignore) a rule at a particular member, class or assembly level.

Suppress messages for Members and Assemblies

The example here shows a method which might be calling Dispose() on the textWriter object multiple times. The warning would be correct if the XmlWriter.Create() method was not being used. In this case I’ve provided a hint not only to the static analyser but also to the next developer. I’ve also included the rule’s name, DoNotDisposeObjectsMultipleTimes, for clarity, although I could have left it out and simply used the code CA2202.
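A sketch of what such a method might look like (the class, method and file names here are illustrative; the attribute, category and rule ID are as described above):

```csharp
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Xml;

public class ReportWriter
{
    // CA2202 would normally warn here: disposing the XmlWriter also disposes
    // the underlying TextWriter, so the outer using appears to dispose it twice.
    [SuppressMessage("Microsoft.Usage", "CA2202:DoNotDisposeObjectsMultipleTimes",
        Justification = "XmlWriter.Create wraps textWriter; the double Dispose is harmless here")]
    public void Write(string path)
    {
        using (TextWriter textWriter = new StreamWriter(path))
        using (XmlWriter xmlWriter = XmlWriter.Create(textWriter))
        {
            xmlWriter.WriteElementString("status", "ok");
        }
    }
}
```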

One thing which caught me out was the first two parameters, category and checkId. If they do not match up, the warning will continue to be generated. As its name implies, category is the name of the category where the rule is defined, and checkId is the unique rule ID or code. Originally I thought only the identifier would be important, but it turns out you need both. You can find these values in your Error List window, as seen below.

ca-ruleid

That was before I found out about the right-click way of doing things. If you’re using Visual Studio rather than a text editor, you can simply right-click on a message and ignore it for that member or the entire assembly.

ca-ruleignorerightclick

Selecting “In Suppression File” will create a GlobalSuppressions.cs file in the root of your project structure containing all of the suppression rules for that assembly.
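As an illustrative sketch, the generated file holds assembly-level attributes along these lines (the particular rule and justification shown are assumptions, not taken from the project above):

```csharp
// GlobalSuppressions.cs – assembly-wide suppressions live in this file
using System.Diagnostics.CodeAnalysis;

[assembly: SuppressMessage("Microsoft.Design", "CA1014:MarkAssembliesWithClsCompliant",
    Justification = "This assembly is only consumed from C# code")]
```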

Suppress messages with build configurations

If you decompile (or look at the source code of) the SuppressMessageAttribute, you’ll see the Conditional("CODE_ANALYSIS") attribute defined on it. What does that mean? Well, if you define the CODE_ANALYSIS conditional compilation symbol in only some build configurations, such as Debug, you can toggle whether the suppressions are compiled in. You can read more on the ConditionalAttribute here.

On for Debug
Off for Release

I’m sure there are other creative ways to use this.
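As a small sketch of how ConditionalAttribute behaves (the method and message here are illustrative; only the CODE_ANALYSIS symbol comes from the attribute’s source):

```csharp
#define CODE_ANALYSIS   // remove this line and the call below is compiled away

using System;
using System.Diagnostics;

public static class Demo
{
    // When CODE_ANALYSIS is not defined, the compiler strips every call to
    // this method – the same mechanism by which SuppressMessageAttribute
    // usages vanish from configurations that don't define the symbol.
    [Conditional("CODE_ANALYSIS")]
    public static void Note(string message) => Console.WriteLine(message);

    public static void Main() => Note("analysis build");
}
```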

Final words

There’s always a danger of CA reports becoming noise in a developer’s workflow. The last thing we need to distract us is meaningless information or being presented with thousands of warnings. However, CA provides valuable information which unit testing alone cannot, and in some cases it will find problems which might have passed code review.

Introducing a new tool into your build pipeline requires far more investment than simply modifying your Jenkins job to execute CA at build time. It requires team buy-in, some training on how to suppress messages, working out an appropriate time to run analysis and carefully crafting reports so they matter.

What happens when a report is produced, and who looks after investigating the issues? Agreeing on processes for dealing with CA messages, and getting team buy-in, will be the biggest tasks ahead. Remember that when evaluating CA.

It’s been interesting and fun diving into the Code Analysis tool. It has its place in the static analysis tool chain and has a nice set of features, combining some of the functionality of StyleCop and FxCop while highlighting potential problems.

It comes bundled with all editions of Visual Studio 2015 (including community).

Further Reading

Official Code Analysis documentation from Microsoft

T-SQL Static Analysis if you install SQL Server Data Tools