Creating analysis rules with SQL Enlight – Overview

One of the most exciting features in SQL Enlight is the ability to create your own static analysis rules. This powerful feature opens up an almost unlimited number of possibilities for anyone willing to try it.

This post is the first in a series of blog posts that will guide you through the creation of custom static code analysis rules with SQL Enlight. Each of the subsequent articles will cover the implementation of several interesting analysis rules and will point out some not-so-obvious features and abilities of the SQL Enlight static code analysis engine.

How does SQL Enlight analysis work?

SQL Enlight analysis workflow.

  1. Parse the T-SQL code, validate the syntax, and generate SQML batches.
  2. Load the connection context and generate the context XML.
  3. Generate an XSLT template from the analysis rule expressions.
  4. For each parsed T-SQL batch, apply an XSLT transformation on the SQML of the current batch using all active batch-scoped analysis rules. The analysis context and rule parameters are provided to the analysis rule as XSLT variables.
  5. Apply an XSLT transformation on an empty document using all active Context Only analysis rules. The rules’ parameters and the analysis context are provided to the analysis rule as XSLT variables.
  6. Return the rule violations.

Types of analysis rules

Context rules

The Context Only rules can be used to analyze only the current database schema. This kind of rule does not depend on the analyzed T-SQL code, but only on the current connection context, which is available inside the rule expression through the $context (the root Context node), $server (the current Server node) and $database (the current Database node) variables.

An important characteristic of the Context Only analysis rules is that they are evaluated only once, at the beginning of the analysis process.

Candidates for Context Only rules are all the analysis rules which do not refer to the SQML but only use the current analysis context.
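To give a rough idea of the shape of such a rule, here is a sketch of a hypothetical Context Only rule expression that flags user functions whose names do not start with an assumed ‘ufn’ prefix. The naming convention is invented for illustration; the Functions/Function node layout and the output-message parameters follow the examples elsewhere in this post, so treat this as a sketch rather than a verified rule:

```xml
<!-- Hypothetical Context Only rule: flag functions not following an assumed 'ufn' naming prefix -->
<xsl:for-each select="$database/Functions/Function[not(starts-with(@Name, 'ufn'))]">
  <xsl:call-template name="output-message">
    <xsl:with-param name="msg" select="$v-rulename" />
    <xsl:with-param name="desc" select="concat($v-rulename, ' : ', $v-ruledescription)" />
    <xsl:with-param name="near" select="@Name" />
    <xsl:with-param name="type" select="$v-ruleseverity" />
  </xsl:call-template>
</xsl:for-each>
```

Because such a rule relies only on the analysis context and never touches the SQML, it is evaluated just once per analysis run.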

Batch rules

The Batch analysis rules also have access to the analysis context, but are focused on the analyzed T-SQL code and its SQML representation. The batch rules are applied separately to each of the analyzed T-SQL batches.

For specific types of checks it might be convenient to have separate implementations of both rule types in order to cover both database schema and T-SQL code violations. An example of such a rule is SA0043, which has one implementation (SA0043A) that checks the current database schema for usage of obsolete data types, and another implementation that checks the T-SQL code for usage of those types.

Explicit rules

The explicit rules are not a separate rule type, but simply rules added to the Explicit Rules group. This group is meant to hold all the rules which are designed to be executed separately, because the results they produce are generally (but not necessarily) more informational than indicative of defects or possible problems.

Dynamic Rules

Dynamic rules are rules which use direct queries against the underlying database to retrieve analysis data and generate results. Rules of this kind mainly process and output the server state information returned by the SQL Server dynamic management views and functions. This information can be used to monitor the health of a server instance, diagnose problems, and tune performance.
We will pay more attention to each of the rule types in the next posts.

Analysis Rules Expression

Analysis rule expression is the actual implementation of the analysis rule. The rule expression can be defined using the standard XSLT syntax, but with some small restrictions to the allowed XSLT elements.

The restrictions are meant to disallow output results which are not valid according to the SQL Enlight analysis engine. Almost all elements of the standard XSLT syntax are allowed, except elements which can directly manipulate the result data (like xsl:call-template, xsl:value-of, xsl:element, xsl:attribute, etc.). Elements of this kind are allowed only as child elements of the xsl:variable, xsl:with-param and xsl:param elements (the call to the output-message template described below being the notable exception). A simple analysis rule matching expression can be defined like this:

<xsl:for-each select="//o:comparison[k:null or g:expression/k:null]">
  <xsl:call-template name="output-message">
    <xsl:with-param name="line" select="@se:sline" />
    <xsl:with-param name="column" select="@se:scol" />
    <xsl:with-param name="msg" select="$v-rulename" />
    <xsl:with-param name="desc" select="concat($v-rulename, ' : ', $v-ruledescription)" />
    <xsl:with-param name="near" select="@name" />
    <xsl:with-param name="type" select="$v-ruleseverity" />
  </xsl:call-template>
</xsl:for-each>

The XPath expression “//o:comparison[k:null or g:expression/k:null]” matches all comparison operators having NULL as one of their operands.
The call to the ‘output-message’ template is the only valid template call (‘output-error’ is still supported, but it is obsolete and not recommended). The call to the ‘output-message’ named template internally passes the rule violation information produced by the rule back to the SQL Enlight engine.

Rule variables

The variables $v-rulename, $v-ruledescription and $v-ruleseverity are initialized when the rule is generated and hold, respectively, the rule name, description and severity.

The following variables are also available inside the analysis rule:

  • $context – holds reference to the analysis context node set
  • $server – references the server node in the analysis context
  • $server-name – current server name
  • $database – references the node of the current database on the current server
  • $database-name – the name of the current context database
  • $is-server-case-sensitive – gets the value ‘yes’ if the server collation is case sensitive
  • $is-database-case-sensitive – ‘yes’ if the default collation of the current database is case sensitive
  • $parameters – holds node-set of Param nodes for accessing rule parameter values
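As a small, hypothetical illustration of how these variables can be combined inside a rule expression (the output-message parameters follow the batch rule example above; the check itself is invented for illustration):

```xml
<!-- Hypothetical fragment: report when the current database uses a case-sensitive collation -->
<xsl:if test="$is-database-case-sensitive = 'yes'">
  <xsl:call-template name="output-message">
    <xsl:with-param name="msg" select="$v-rulename" />
    <xsl:with-param name="desc" select="concat('Database ', $database-name, ' on server ', $server-name, ' uses a case-sensitive collation')" />
    <xsl:with-param name="type" select="$v-ruleseverity" />
  </xsl:call-template>
</xsl:if>
```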

Analysis Rule Parameters

Create parameters for the analysis rule using the Parameters tab.

A single analysis rule parameter can have zero or more predefined values from which the user can choose before running the Static Code Analysis command.

A parameter value can be referenced inside the analysis rule using the $parameters variable:

<!-- 'ParameterName' is a placeholder for the actual parameter name -->
<xsl:variable name="ParameterValueVariable" select="$parameters/Param[@Name='ParameterName']/@Value" />

The above expression assigns the parameter value to a local XSLT variable; ‘ParameterName’ here stands for the name of the parameter whose value is looked up in the $parameters node-set.

The Always require parameters to be set analysis rule option can be used to make the Analysis Parameters dialog always appear when the rule is applied. If this option is not checked, the default values of the parameters will be used.

Analysis Context

The Analysis Context represents the current server and database schema information that SQL Enlight loads before starting the analysis. The analysis context is an XML document that holds information about the current SQL Server instance and the current database. The current database is determined initially from the SQL connection, and later by the database switches (‘USE [DatabaseName];’ statements) inside the analyzed script.
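Judging by the XPath expressions used in the queries below, the analysis context document has roughly the following shape (a simplified sketch; the actual document contains many more collections and attributes):

```xml
<Context>
  <Server>
    <Databases>
      <Database Name="AdventureWorks">
        <Functions>
          <Function Name="ufnGetAccountingStartDate" />
        </Functions>
        <!-- further schema collections (tables, procedures, ...) omitted -->
      </Database>
    </Databases>
  </Server>
</Context>
```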

How to query the analysis context

Example XPath expression for retrieving the XML node of the function ‘ufnGetAccountingStartDate’ from the database ‘AdventureWorks’:

<xsl:variable name="function" select="$context/Server/Databases/Database[@Name='AdventureWorks']/Functions/Function[@Name='ufnGetAccountingStartDate']" />

The same expression, starting from the $server variable:

<xsl:variable name="function" select="$server/Databases/Database[@Name='AdventureWorks']/Functions/Function[@Name='ufnGetAccountingStartDate']" />

The same XPath expression, but assuming that the ‘AdventureWorks’ database is the current database:

<xsl:variable name="function" select="$database/Functions/Function[@Name='ufnGetAccountingStartDate']" />
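Once retrieved into a variable, such a node set can be tested like any other XPath expression; for example, a rule could react to the function being absent from the current database (a hypothetical fragment):

```xml
<!-- Hypothetical check: does the function exist in the current database? -->
<xsl:if test="not($function)">
  <!-- the function was not found; a violation can be reported via output-message here -->
</xsl:if>
```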

Generating test analysis context

Sometimes it can be handy to test your analysis rules with an analysis context different from the default one.

You can create your own analysis context XML for testing your analysis rules either by editing the AdventureWorks one or by using the Enlight command-line tool with the ‘analysiscontext’ command.

For instance, the command below will generate an analysis context using the AdventureWorks2008R2 database:

Generate Analysis Context with SQL Enlight command line tool.
Enlight.exe analysiscontext /server:. /database:AdventureWorks2008R2 /o:e:\AdventureWorks2008R2.xml

Change the test context

To test your analysis rules against a specific database schema, you can take the generated context file and set it as the default analysis context source under SQL Enlight -> Options -> Settings.

XSLT Extension

SQL Enlight analysis rules support several XPath extension functions and modules that provide features missing in XPath 1.0.

  • EXSLT Extensions (http://www.exslt.org/)
  • Microsoft XPath Extension Functions
  • Standard XPath Functions
  • Ubitsoft XPath Extension Functions

For more information and a reference for the supported XPath extensions, refer to the SQL Enlight documentation.

To be continued …

In the next posts, we are going to walk through the implementation of the different types of analysis rules and their specifics.

2 replies
  1. Chip Bergquist says:

    I have downloaded SQL Enlight and begun to investigate it. I am very interested in the static analysis rules and making some new ones for my developers. MS Visual Studio has several static analysis rules for C# and making sure that we do not violate any of those recommendations has significantly improved the quality of our C# code. My hope is this will help us do the same for our SQL code. I look forward to the continuation of this tutorial so that I can make new static analysis rules. Please feel free to contact me if you have any questions about what we are doing and new rules we hope to add.

  2. siddu says:

    hi i have a requirement need to develop the specific rule for our requirement with regular expression


    dbo.up_addCarBoduy like this how to implement the reqular expression in SQL Enlight

    i am writing in parameter tab under Analysis–>Rules->edit–>Rule–>parameter

    under this in value column i put :regexp:dbo.up_[a-z]+ but not working out

    could you pls hint on this

    thanks in advance.

