When you deploy your SSRS reports, you might want to know how they perform. Or maybe your reports perform all right, but there is room to present them to your end users even faster. You might also have deployed a bunch of reports that nobody reads; they're just sitting there… To get insight into these things, the ReportServer database has a few tables that contain log records of your report executions.
First of all, by default the log records are kept for 60 days; older entries are deleted. Since I always want a somewhat larger window of log data, I have changed this setting. You can find it in the ConfigurationInfo table in the ReportServer database. For example, I've used the following query to keep the log entries for a whole year:
UPDATE dbo.ConfigurationInfo
SET [Value] = '365'
WHERE [Name] = 'ExecutionLogDaysKept'
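To verify the change, or to check the current retention before updating it, you can query the same table. A quick sketch:

```sql
-- Check the current retention setting in the ReportServer database
SELECT [Name], [Value]
FROM dbo.ConfigurationInfo
WHERE [Name] = 'ExecutionLogDaysKept'
```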
Right, so if you also want a full year of log data to use this functionality, please come back in 10 months… just kidding: the default 60 days should usually generate enough data for you to test out the functionality described in this article.
As of SQL Server 2008, the log data is stored in the table dbo.ExecutionLogStorage (in earlier versions, the table is called ExecutionLog). Microsoft recommends not querying the tables directly, so three views are provided for querying the log. The original view is called ExecutionLog; SQL Server 2008 introduced the view ExecutionLog2, and 2008 R2 introduced the view ExecutionLog3. The only difference between ExecutionLog2 and ExecutionLog3 is two column names: ReportPath is called ItemPath, and ReportAction is called ItemAction.
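To get a feel for what the log contains, you can simply select the most recent entries from the view. A minimal sketch against the 2008 R2 view (use ExecutionLog2 with the older column names if you're on plain 2008):

```sql
-- Most recent 100 report executions, newest first
SELECT TOP 100
    ItemPath,    -- path of the report (ReportPath in ExecutionLog2)
    UserName,    -- who requested the report
    ItemAction,  -- e.g. Render (ReportAction in ExecutionLog2)
    TimeStart,
    TimeEnd,
    [Status]
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC
```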
On to the next step… getting an idea of how your end users view the reports. This is a very straightforward thing: if your log table has grown proportionally, your work has paid off. Several columns in this table/view give you an idea of what the users have been browsing for and the types of actions that have been performed.
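For example, here is a sketch of a query that counts how often each report has been rendered, which surfaces both the popular reports and the ones nobody looks at:

```sql
-- Number of executions per report, most-viewed first
SELECT
    ItemPath,
    COUNT(*)                 AS Executions,
    COUNT(DISTINCT UserName) AS DistinctUsers,
    MAX(TimeStart)           AS LastRun
FROM dbo.ExecutionLog3
WHERE ItemAction = 'Render'
GROUP BY ItemPath
ORDER BY Executions DESC
```

Reports that sit at the bottom of this list with an old LastRun date are candidates for retirement.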
To measure the performance of your reports, there are five columns that are important to take a look at. These columns are:
- TimeDataRetrieval: This value reflects the number of milliseconds spent interacting with data sources and data extensions. This includes all datasets for the report itself and for any subreports, covering both the time spent opening connections to the data source and the time spent reading the data from the database. Retrieval usually executes in parallel (in some cases this is not possible), so the number displayed here is not the sum across datasets but the time of the dataset that took the longest to retrieve. In cases where the datasets are retrieved sequentially, this number is the sum of all dataset retrieval times.
- TimeProcessing: This value (also in milliseconds) is the time spent processing the data in the reporting engine. This phase includes the processing of the tablixes, for example sorting, filtering, and grouping.
- TimeRendering: This value (I'm sure it's clear by now in which measurement unit…) reflects the time spent rendering the report and applying pagination.
- ByteCount: This is the number of bytes that are generated for the output of this report request.
- RowCount: The number of rows that were retrieved from the data source to generate this report.
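Putting these five columns together, here is a sketch of a query that surfaces the slowest reports and shows where they spend their time (all times in milliseconds):

```sql
-- Average time per phase and data volume, slowest reports first
SELECT
    ItemPath,
    COUNT(*)                                                AS Executions,
    AVG(TimeDataRetrieval)                                  AS AvgDataRetrievalMs,
    AVG(TimeProcessing)                                     AS AvgProcessingMs,
    AVG(TimeRendering)                                      AS AvgRenderingMs,
    AVG(TimeDataRetrieval + TimeProcessing + TimeRendering) AS AvgTotalMs,
    AVG(ByteCount)                                          AS AvgBytes,
    AVG([RowCount])                                         AS AvgRows
FROM dbo.ExecutionLog3
GROUP BY ItemPath
ORDER BY AvgTotalMs DESC
```

Comparing the three Avg…Ms columns per report tells you which of the tips below applies.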
I'll conclude with some tips on how to prevent reports from loading slowly, based on the values in the five columns above:
- When TimeDataRetrieval is high, try to modify the query against your data source. Narrow it down to the data that you actually need, and avoid SQL statements with "SELECT *", which retrieve all the columns of a table.
- When TimeProcessing is high, try to "prepare" the data that your report uses differently. It's sometimes easier to use an ORDER BY in your query, eliminating the need for sorting in your tablixes. Also, apply as many filters as possible in your SQL query to reduce the amount of data that is processed.
- When TimeRendering is high, try to rearrange the items on your report. For example, one list containing one chart that is repeated 10 times is better than 10 separate charts, each filtered on a different column value.
- When ByteCount and/or RowCount is high, try to reduce the amount of data your report uses. When you have a tablix that shows the top 10 best-performing sales agents, it's better to put a TOP 10 condition in your SQL query than to pull the whole set and filter in your tablix.
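As an illustration of that last tip, here is a sketch using a hypothetical sales schema (the SalesAgent and Orders table and column names are assumptions, not part of any real report):

```sql
-- Let the database do the work: only 10 rows leave the server,
-- instead of the whole order history being filtered in the tablix
SELECT TOP 10
    a.AgentName,
    SUM(o.OrderAmount) AS TotalSales
FROM dbo.SalesAgent AS a
JOIN dbo.Orders     AS o ON o.AgentId = a.AgentId
GROUP BY a.AgentName
ORDER BY TotalSales DESC
```

This keeps both RowCount and ByteCount low for the report, since the ranking happens in the database rather than in the reporting engine.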