Query store – next generation tool for every DBA

Along with the release of SQL Server 2016 CTP 3.0 now comes the preview of a brand new feature for on-premises databases – the Query Store. This feature enables performance monitoring and troubleshooting through a log of executed queries.

This blogpost will cover the aspects of this new feature including:

  • Introduction
  • How to activate it
  • Configuration options
  • What information is found in the Query Store
  • How to use the feature
  • What’s in it for me


The new Query Store feature gives everyone with responsibility for SQL Server performance and troubleshooting insight into the actual queries and their query plans. It replaces the old way of setting up tracing, logging and event handling with a standard, out-of-the-box feature.

It enables you to find causes of performance differences due to changes in query plans. It also captures historic data – queries, plans and runtime statistics – and keeps these for later review, divided into configured time slots.

All in all, this feature enables you to monitor, capture and analyze performance issues in the server with a few standard settings.

How to activate it

The feature can be enabled in two ways – from SSMS with mouse clicks or with T-SQL statements.

Enable Query Store from Management Studio

From the object explorer window, right click the database and select Properties.

Here click the Query Store page and change the ‘Enable’ to TRUE:

query store

Enable Query Store from T-SQL statement

In a new query window, the following statement enables the Query Store on the database ‘QueryStoreDB’:

ALTER DATABASE QueryStoreDB
SET QUERY_STORE = ON;

Configuration options

The Query Store has a series of configuration options. All of them can be set from the SQL Server Management Studio with clicks or with T-SQL Statements.

OPERATION_MODE – This can be READ_WRITE or READ_ONLY and states whether the Query Store collects new data (READ_WRITE) or collects nothing and just holds the current data (READ_ONLY).

CLEANUP_POLICY – Specifies, through STALE_QUERY_THRESHOLD_DAYS, the number of days the Query Store retains data.

DATA_FLUSH_INTERVAL_SECONDS – Gives the interval at which data written to the Query Store is persisted to disk. This transfer happens asynchronously.

MAX_STORAGE_SIZE_MB – This gives the maximum size of the total data in the Query Store. If and when the limit is reached, the OPERATION_MODE is automatically changed to READ_ONLY and no more data is collected.

INTERVAL_LENGTH_MINUTES – Gives the fixed time window over which runtime execution statistics are aggregated.

SIZE_BASED_CLEANUP_MODE – When the data in the Query Store gets close to the limit configured in MAX_STORAGE_SIZE_MB, this option controls the automatic cleanup process.

QUERY_CAPTURE_MODE – Controls whether the Query Store captures all queries or only relevant queries based on execution count and resource usage.

MAX_PLANS_PER_QUERY – The maximum number of execution plans maintained for queries.

From SQL Server Management Studio the window looks like below when the Query Store is enabled. At the bottom of the window, you can also see the current disk usage.
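The state and disk usage shown in SSMS can also be read with T-SQL from the sys.database_query_store_options catalog view – a small sketch:

```sql
-- Check the Query Store state and how much of the configured space is used
SELECT actual_state_desc,
       current_storage_size_mb,
       max_storage_size_mb
FROM sys.database_query_store_options;
```

This view also exposes the rest of the configuration options described above, so it is handy for documenting a server's setup.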

The T-SQL syntax for setting the Query Store options is as follows (the values shown are examples):

ALTER DATABASE <database name>
SET QUERY_STORE = ON
    (
      OPERATION_MODE = READ_WRITE,
      CLEANUP_POLICY = ( STALE_QUERY_THRESHOLD_DAYS = 30 ),
      DATA_FLUSH_INTERVAL_SECONDS = 900,
      MAX_STORAGE_SIZE_MB = 100,
      INTERVAL_LENGTH_MINUTES = 60,
      SIZE_BASED_CLEANUP_MODE = AUTO,
      QUERY_CAPTURE_MODE = ALL,
      MAX_PLANS_PER_QUERY = 200
    );

What information can be found in the Query Store

Specific queries in SQL Server normally have evolving execution plans over time, due to e.g. schema changes, changes in statistics, indexes etc. The plan cache also evicts execution plans under memory pressure. The result is that query performance troubleshooting can be non-trivial and time consuming to resolve.

The Query Store retains multiple execution plans per query. It can therefore be used to enforce a certain execution plan for a specific query. This is called plan forcing (see below for the stored procedure to do this).

Prior to SQL Server 2016 the query hint ‘USE PLAN’ was used for this, but now it is a fairly easy task to enforce a specific execution plan on the query processor.
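Plan forcing is done with the system stored procedure sp_query_store_force_plan, which takes a query id and a plan id found in the Query Store views (the id values below are hypothetical):

```sql
-- Force the query processor to use plan 3 for query 42
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 3;

-- And to remove the forcing again
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 3;
```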

More scenarios for using the Query Store:

  • Find and fix queries that have a regression in performance due to plan changes
  • Overview of how often and in which context a query has been executed, helping the DBA on performance tuning tasks
  • Overview of the historic plan changes for a given query
  • Identify the top n queries (by time, CPU time, IO etc.) in the past x hours
  • Analyze the use of resources (IO, CPU and memory)

The Query Store contains two stores – a plan store and a runtime stats store. The Plan Store persists the execution plan information and the Runtime Stats Store persists the execution statistics information. Information is written to the two stores asynchronously to optimize performance.

The space used to hold the runtime execution information can grow over time, so the data is aggregated over a fixed time window as per setting made in the configuration.

When the Query Store is enabled on a database, a set of system views is ready for querying:

  • sys.database_query_store_options
  • sys.query_context_settings
  • sys.query_store_plan
  • sys.query_store_query
  • sys.query_store_query_text
  • sys.query_store_runtime_stats
  • sys.query_store_runtime_stats_interval
Furthermore, a series of system stored procedures can be called:

  • sp_query_store_flush_db
  • sp_query_store_force_plan
  • sp_query_store_unforce_plan
  • sp_query_store_remove_plan
  • sp_query_store_remove_query
  • sp_query_store_reset_exec_stats
How to use Query Store

The Query Store comes with 4 standard reports as shown below:

All the standard reports can be modified in several ways to fit your personal needs. This is done with drop-down selections and point-and-click.

The Regressed Queries report gives an overview of the top 25 most resource consuming queries in the last hour, including the execution plan and a timeline showing when the query ran and how long it took.

The Overall Resource Consumption report shows 4 charts as standard, based on duration, execution count, CPU time and logical reads.

The Top Resource Consuming Queries report uses the same format as Regressed Queries, only non-aggregated and with more details.

The Tracked Queries report shows detailed data for a selected query – here you need to find and note the query id. It can be found, among other ways, with the queries below against the Query Store system views.

The data in the Query Store can be accessed through the system views described above. Examples of usage can be found below.

Top 5 queries with the longest average execution time the last hour

SELECT TOP 5 qt.query_sql_text, q.query_id, 
    qt.query_text_id, p.plan_id, rs.avg_duration
FROM sys.query_store_query_text AS qt 
JOIN sys.query_store_query AS q 
    ON qt.query_text_id = q.query_text_id 
JOIN sys.query_store_plan AS p 
    ON q.query_id = p.query_id 
JOIN sys.query_store_runtime_stats AS rs 
    ON p.plan_id = rs.plan_id
WHERE rs.last_execution_time > DATEADD(hour, -1, GETUTCDATE())
ORDER BY rs.avg_duration DESC;

Last 10 queries executed on the server

SELECT TOP 10 qt.query_sql_text, q.query_id, 
    qt.query_text_id, p.plan_id, rs.last_execution_time
FROM sys.query_store_query_text AS qt 
JOIN sys.query_store_query AS q 
    ON qt.query_text_id = q.query_text_id 
JOIN sys.query_store_plan AS p 
    ON q.query_id = p.query_id 
JOIN sys.query_store_runtime_stats AS rs 
    ON p.plan_id = rs.plan_id
ORDER BY rs.last_execution_time DESC;

Queries with more than one execution plan

SELECT q.query_id, qt.query_text_id, qt.query_sql_text,
    p.plan_id, p.query_plan AS plan_xml
FROM (SELECT COUNT(*) AS cnt, q.query_id 
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
GROUP BY q.query_id
HAVING COUNT(DISTINCT plan_id) > 1) AS qm
JOIN sys.query_store_query AS q
    ON qm.query_id = q.query_id
JOIN sys.query_store_plan AS p
    ON q.query_id = p.query_id
JOIN sys.query_store_query_text AS qt 
    ON qt.query_text_id = q.query_text_id
ORDER BY q.query_id, p.plan_id;

What’s in it for me

Well I hope that the answer to this is pretty obvious to you after you have read this post 🙂

The Query Store enables any person responsible for database performance to monitor, analyze and keep track of queries, execution plans and resource usage through system views or standard reports from the SQL Server Management Studio.


This new Query Store feature is a great add-on for the DBA (or accidental DBA) who needs to keep analytical data in a standard form and have query statistics available for troubleshooting.

This blogpost is based on the latest CTP of SQL Server 2016 (CTP 3.0) which can be downloaded here:


Many-to-many in SSAS Tabular


With the release of SQL Server 2016 CTP 3.0 also comes the ability to test the new many-to-many functionality in SSAS Tabular.

This blogpost will cover the aspects of the many-to-many feature from SQL Server 2016 – including:

  • Prerequisites
  • The old way
  • The new way

This post is based on data from the AdventureWorksDW2012 database.


In order to test the new many-to-many feature from SQL Server 2016 SSAS Tabular you’ll need to download the latest CTP from Microsoft – it can be found here:


Also you’ll need Visual Studio 2015 and the add-in for Business Intelligence:


Choose the SSDT October 2015 Preview in Visual Studio for download.

After a bit of waiting for the installation, you are ready to test the functionality.

The old way

Before showing the new (and in my eyes right) way to do many-to-many in SSAS Tabular, let me first show you how it was done prior to SQL Server 2016 CTP 3.0.

Thanks to the two brilliant guys from SQLBI, Marco Russo (T, L) and Alberto Ferrari (T, L), we’ve had the below approach for quite a while now.

First of all, you need to build a bridge table with the column that links the two tables, and build a model like the one illustrated below.

The m2mKey is a concatenation of SalesOrderNumber and SalesOrderLineNumber, as Tabular still does not have the ability to handle a relationship on two columns at the same time.


Then all measures that need to take DimSalesReason into account had to be rewritten with some DAX coding:

Sum of UnitPrice:=CALCULATE(SUM([UnitPrice]);vBridgeSalesReason)

Then the output will look something like this:


The new way

With the CTP 3.0 release and the SSDT add-on for Visual Studio 2015, this now gets as easy as 1-2-3.

First of all, it is now possible to build a data model directly, without any bridge tables, like this:


Note the highlighted area – here you can see the many-to-many relationship. This is modelled when creating the relationship in the model like this:


Remember to select the Filter Direction to << To Both Tables >>.

And that is it!

The result without doing DAX formulas:



Behold the new live query stats in SQL Server 2016

With the release of SQL Server 2016 also comes a great new feature to get a live query stats view of the current execution plan for an active query.

This blogpost will cover the aspects of this new feature including:

  • Introduction
  • How to activate
  • How to use and read the output
  • Downsides – if any


The live query plan is, in the current release of SQL Server 2016 CTP 2.2, a new feature from Microsoft, which hopefully will be in the final release.

The feature provides real-time insight into the SQL Server engine’s query execution process, shown visually as data flows from one plan operator to the next during execution. The display covers the usual elements of an execution plan, including the number of rows handled, the time spent, the progress of the individual operators and other well-known statistics of a query execution.

One of the good things about this feature is the ability to show and analyze the query even before it has finished. This is useful when debugging complex queries – the operators are shown with their individual performance, giving the DBA or other person responsible for the database a faster view of where to focus the performance optimization.


In SQL Server Management Studio there is a new option when right-clicking the query window – “Include Live Query Statistics”:

live query stats - activation

For some reason, there is no keyboard shortcut to activate that functionality. Maybe this will come in the RTM release of SSMS for SQL Server 2016.

This function can also be activated from the top-menu in SSMS 2016 CTP 2.2:

live query stats - activation 2

Note: this feature also works on SQL Server 2014 SP1, as it relies on DMVs introduced in that service pack.

If the session running the query has enabled either statistics XML (SET STATISTICS XML ON;) or statistics profile (SET STATISTICS PROFILE ON;) then the Live Query Statistics can also be reached from the activity monitor.

The DBA can also enable Live Query Statistics server-wide, on all sessions, with the extended event query_post_execution_showplan – for more info click HERE [Link: https://msdn.microsoft.com/en-us/library/bb630319.aspx]:
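A sketch of such an event session could look like below – note that capturing query_post_execution_showplan for all sessions adds overhead, so use it with care:

```sql
-- Collect actual execution plans from all sessions (server wide)
CREATE EVENT SESSION [LiveQueryStats] ON SERVER
ADD EVENT sqlserver.query_post_execution_showplan;

ALTER EVENT SESSION [LiveQueryStats] ON SERVER
STATE = START;
```

The session name ‘LiveQueryStats’ is just an example.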

Right click the current query in Active Expensive Queries and choose ‘Show Live Execution Plan’:

live query stats - show plan

The observant reader will now ask – why are the names not the same across the SSMS? Well – I don’t know actually.


The database level SHOWPLAN permission is required to populate the Live Query Statistics, and the server level VIEW SERVER STATE permission in order to see the live statistics.

And voila!

live query stats - gifs animation

The query runs against an enlarged table from AdventureWorksDW2012 with more than six million rows.

…and with a slightly more complex query (not an optimal query design at all):

live query stats - gif animation 2

The output

The output from Live Query Statistics can be read like any other execution plan. The operators are the same, and the deeper statistics can be revealed as usual by hovering the cursor over the individual operators:

The data in these statistics does not change as the query runs – you have to move the cursor and hover again to get updated info for the specific operator.

The underlying DMV is sys.dm_exec_query_profiles [LINK: https://msdn.microsoft.com/en-us/library/dn223301.aspx], which can be queried and gives the same results in text form as the graphical animation. But the animation is a lot easier and more efficient to decode than the text-based results.
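A small sketch of querying the DMV while a query is running in another session:

```sql
-- Per-operator progress for currently running queries
SELECT session_id,
       node_id,
       physical_operator_name,
       row_count,            -- rows produced so far
       estimate_row_count    -- rows estimated by the optimizer
FROM sys.dm_exec_query_profiles
ORDER BY session_id, node_id;
```

Comparing row_count to estimate_row_count per operator is a quick way to spot where an estimate has gone wrong.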

The catch

What a great new feature – and it also works on SQL Server 2014 SP1 as mentioned earlier. But there is a catch – as always:

  1. If the query uses columnstore indexes, the live window will not show
  2. If the query uses memory-optimized tables, the live window will not show
  3. Natively compiled stored procedures are not supported

It only works on SQL Server 2014 SP1 and onwards. But who isn’t using one of those in production now 🙂

Conclusion – live query stats

The new Live Query Statistics feature is great for performance tuning of queries and for the DBA who wants to see the live performance of data loading in the database. The feature works like a charm and is, from my perspective, a nice addition.

I hope this post makes a great start for you to work with the Live Query Statistics. This post is written based on the current CTP of SQL Server 2016 (CTP 2.2) which can be downloaded here: https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016

If the feature is updated in later versions of SQL Server 2016, then this post will be updated accordingly.

Install SQL Server in 5 minutes

sql server 2014 logo

You know this, everybody – you need to install a whole new SQL Server on the customer’s Windows Server.
And the customer doesn’t have the time or money to wait for you to click around the GUI and sit and watch the progress bar move every second. You can actually install SQL Server in 5 minutes.

So why not do this in a fully automatic way and at the same time save both effort and expensive time?

Using the command prompt

There is a command line utility that gives you the ability to write a very, very long command and install SQL Server that way. But I personally do not want to sit around and write those commands every time.

Using config files

A far more efficient and reusable way is to make your own configuration file and use it every time you need to install a new SQL Server.

When you click your way through the UI to install SQL Server, the UI actually creates a config file and then uses this file to install, using a very small command line:

setup.exe /configurationfile=config.ini (if ‘config.ini’ is your config-file).

The configuration file is easy to adapt, so you can have your own personal config file tailored to every environment and every customer. And if the customer for some reason needs to reinstall the server, they just need the media and the config file.
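As a sketch, a minimal config file for an unattended installation of just the database engine could look like below – instance name and account names are examples that must be adapted to your environment:

```ini
; config.ini - minimal unattended install of the database engine (example values)
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
QUIET="True"
IACCEPTSQLSERVERLICENSETERMS="True"
SQLSVCACCOUNT="NT Service\MSSQLSERVER"
SQLSYSADMINACCOUNTS="DOMAIN\DBA-Group"
```

Note that IACCEPTSQLSERVERLICENSETERMS is required when installing with QUIET="True".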

The upside

When you use configuration files and the command line, the installation can be done – in my example on my laptop – in approx. 5 minutes.

2014-11-04 21_18_27-Summary.txt - Notepad

How long do you normally wait for the installation to finish?


The downside
I really can’t find any. Once you’ve got the config-file up and running, it’s easy-peasy to install a new SQL Server. You can also reuse the config-file for a newer version of SQL Server – just add or modify the file to fulfill the needs for that version and that environment.

If you have encountered any downsides that I haven’t seen – then please leave a comment and I’ll get back to you.

Further reading

Microsoft has good documentation for this:

Command prompt installation

Configuration file installation

Just got my blog hacked

Statistically it happens at some time for all of us with a website – and today was my ‘lucky day’ when my blog got hacked.

I got a mail from my hosting company telling me that my WordPress blog had been hacked and was disabled until I changed all access codes and made a full re-install of the binaries.

Luckily I’ve got backups of everything – so nothing was lost.

Everything is a-ok now – and I hope it does not happen again.