Recently I was asked to do some performance testing and benchmarking for a SQL Server Analysis Services 2008 (SSAS 2008) cube. As happens in most projects, I did not have much time and needed to find a quick way to do this. In this post, I would like to jot down the approach I took, in the hope that one day it might be useful to somebody. So here it goes.
The cube in question was of average size: about 20GB, with 10 dimensions and 5 measure groups. At any time, at most 15 people would use it, exclusively through Excel 2007. This gave me a good starting point. I wanted to test the performance of the cube for 15 people, with queries like the ones Excel typically generates. I also concluded that a response time of around 300 ms per query was acceptable (of course this is debatable, but I leave it up to you).
Collecting the data
Given that all the queries would come from Excel, I had two options for collecting test queries: using OLAP PivotTable Extensions, which is also mentioned on my Tools and Tips page, or using SQL Server Profiler.
Collecting queries from OLAP PivotTable Extensions is very straightforward. You build your pivot table and copy the query from the extension’s dialog box. However, I found two main drawbacks with this approach. One, it is a one-query-at-a-time approach. Two, the pivot table would be built by me and not by the actual users. I don’t necessarily know which dimensions and measure groups they are most interested in, so the queries may not represent the actual workload. I also did not have the option of asking each end-user to create a pivot table and send me the queries; I thought that was too much to ask.
Using SQL Server Profiler solves both issues. Since the end-users are continuously using the cube, I get a variety of real-life queries, and I can save them in a batch. As it turns out, there is a way to export the exact MDX queries from SQL Server Profiler. So I ran Profiler on the server (yes, I know, you should not run Profiler on a live server, but in this case I took the risk!) at peak time, while the maximum number of users were on the cube, and exported the MDX queries. Profiler exports the queries in a batch, i.e. all the queries end up in one file, so I resorted to manually separating each query into its own file.
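The manual separation step can also be scripted. A minimal sketch, assuming the queries in the exported file are separated by blank lines (adjust the delimiter to match however your export is actually structured):

```python
# Sketch: split a batch file of exported MDX queries into one file per query.
# Assumption: queries in the export are separated by blank lines -- change
# the delimiter if your Profiler export uses something else.
from pathlib import Path


def split_queries(batch_file: str, out_dir: str, delimiter: str = "\n\n") -> int:
    """Write each query in batch_file to its own .mdx file; return the count."""
    text = Path(batch_file).read_text()
    queries = [q.strip() for q in text.split(delimiter) if q.strip()]
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, query in enumerate(queries, start=1):
        (out / f"query_{i:02d}.mdx").write_text(query)
    return len(queries)
```

Each resulting `.mdx` file can then be fed to the load-testing tool as a separate query.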
For the testing/benchmarking exercise, I used AS Performance Workbench. It is the simplest and most intuitive tool I could find. I am pretty sure there are other utilities and methods available, but this one was just what I was looking for: it is simple to use, the generated report is pretty and easy to understand, it can simulate load and, most importantly, it is free.
I decided to create 25 query files. They covered all the dimensions and measures, and I selected the more complex queries on frequently used measures. I also decided to do three runs, with 10, 12 and 15 users respectively, to vary the number of users nicely. I did the first run on a cold cache, i.e. I emptied the cache before running the first test, while for the second and third runs I left the cache as it was. I also made sure that nobody else was using the server in any way while I was testing, to get a more realistic picture of the effect of the queries on CPU and I/O: if any other services were running on the server, they would be using CPU and disk, which would skew the results.
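Emptying the cache can be done with an XMLA ClearCache command, run from an XMLA query window in SSMS. A small sketch that builds the command; the database ID below is a placeholder you would replace with your own:

```python
# Sketch: build the XMLA ClearCache command used before a cold-cache run.
# The database ID passed in is a placeholder -- use your own SSAS database.
XMLA_NS = "http://schemas.microsoft.com/analysisservices/2003/engine"


def clear_cache_command(database_id: str) -> str:
    """Return the XMLA ClearCache command for the given SSAS database."""
    return (
        f'<ClearCache xmlns="{XMLA_NS}">\n'
        f"  <Object>\n"
        f"    <DatabaseID>{database_id}</DatabaseID>\n"
        f"  </Object>\n"
        f"</ClearCache>"
    )
```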
There were two aspects to this exercise – benchmarking and testing.
The benchmarking part was easy. After each run I generated the report and saved it. I also saved all the queries, along with details of the hardware on which the testing was carried out and a brief report. The idea is that after a while, when the data grows, we can run the same set of tests again to see how the performance changes.
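To make a later rerun comparable like-for-like, it helps to save the run details in a structured form next to each report. A minimal sketch; the field names here are my own choice, not anything AS Performance Workbench produces:

```python
# Sketch: save benchmark run details as JSON alongside the saved report,
# so a future rerun on grown data can be compared against the same setup.
# All field names are illustrative.
import json
from datetime import date


def save_run_metadata(path: str, users: int, hardware: dict, notes: str = "") -> None:
    """Write run details (date, user count, hardware, notes) as JSON."""
    meta = {
        "date": date.today().isoformat(),
        "concurrent_users": users,
        "hardware": hardware,
        "notes": notes,
    }
    with open(path, "w") as f:
        json.dump(meta, f, indent=2)
```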
As for testing, I was most interested in three counters:
a. % Processor Time
b. Avg. duration for each query
c. Current Disk Queue Length
A small table like the one below sufficed for each run.
| Counter | Expected Result | Actual Result | Pass/Fail |
| --- | --- | --- | --- |
| % Processor Time | Below 60% | 45.96% | Pass |
| Current Disk Queue Length | Less than 1 | 0 | Pass |
| Avg. Query 1 duration | 300 ms | 175 ms | Pass |
| Avg. Query 2 duration | 300 ms | 390 ms | Fail |
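The Pass/Fail column can also be filled in programmatically. A small sketch using the thresholds I chose above; the counter names and limits are mine, not anything mandated by the tools:

```python
# Sketch: compare measured counter values against the expected thresholds
# from the table above. A value passes if it stays strictly below the limit.
THRESHOLDS = {
    "% Processor Time": 60.0,           # pass if below 60%
    "Current Disk Queue Length": 1.0,   # pass if less than 1
    "Avg. query duration (ms)": 300.0,  # pass if under 300 ms
}


def evaluate(measurements: dict) -> dict:
    """Return 'Pass' or 'Fail' per counter, against THRESHOLDS."""
    return {
        counter: "Pass" if value < THRESHOLDS[counter] else "Fail"
        for counter, value in measurements.items()
    }
```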
I hope this post will be useful to somebody, maybe some day in the future!