One of the most common questions after an SSAS cube is moved into production and users begin accessing the data is, “Who’s using the cube?” It makes sense, right? After spending a lot of time and money getting an analysis cube built, who wouldn’t want to know how it is actually being used? Luckily, there is an easy and slick way to surface all of this information.
First, turn on query logging for SSAS. This can be done in SSMS: connect to the Analysis Services instance, right-click it, and select “Properties”.
Make sure to configure the following properties. Please note that it is not best practice to set QueryLogSampling to “1”, since this can hurt performance on a heavily used system; however, I have set it to 1 here so that every query is logged and the dashboards show a true representation of cube usage.
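As a quick sanity check that logging is working, something like the following Python sketch can read the newest entries from the log table. The query log settings live under Log \ QueryLog (QueryLogConnectionString, QueryLogTableName, QueryLogSampling, and CreateQueryLogTable); the sketch assumes the log is pointed at a SQL Server database with the default table name OlapQueryLog, and the server and database names are placeholders.

```python
# Minimal sketch: confirm the SSAS query log is being written.
# Assumes the log goes to SQL Server with the default table name OlapQueryLog;
# adjust the connection string and table name to match your environment.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=MYSQLSERVER;DATABASE=SSASQueryLog;Trusted_Connection=yes;"  # placeholder names
)

sql = """
SELECT TOP 10 MSOLAP_User, MSOLAP_Database, StartTime, Duration
FROM dbo.OlapQueryLog
ORDER BY StartTime DESC;
"""

# Print the ten most recent logged queries: who ran them, against which
# database, when, and how long they took (milliseconds).
for row in conn.cursor().execute(sql):
    print(row.MSOLAP_User, row.MSOLAP_Database, row.StartTime, row.Duration)
```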
By taking a little time to build a quick ETL solution and a cube off of the query log table, you can get two nice dashboards. The first dashboard (shown below) analyzes system usage by measuring the number of queries sent to the system. This can be used to help plan system maintenance windows and downtime for when the smallest number of users will be impacted; in our case that would be Saturday between 10 PM and 12 AM. Also included, but not shown below, is a simple table with the top and bottom 10 users of the system. This way you can really tell who is actually using the new cube!
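For illustration, here is a rough Python/pandas sketch of the kind of aggregation that feeds this dashboard and the user table, pulled straight from the log rather than from a dedicated usage cube. It is not the actual ETL solution described above; the connection string is a placeholder and the default OlapQueryLog table name is assumed.

```python
import pandas as pd
import pyodbc

# Placeholder connection to the database holding the query log.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=MYSQLSERVER;DATABASE=SSASQueryLog;Trusted_Connection=yes;"
)

df = pd.read_sql("SELECT MSOLAP_User, StartTime FROM dbo.OlapQueryLog", conn)
df["StartTime"] = pd.to_datetime(df["StartTime"])
df["Weekday"] = df["StartTime"].dt.day_name()
df["Hour"] = df["StartTime"].dt.hour

# Queries per weekday/hour; the low points are candidate maintenance windows.
query_counts = (
    df.groupby(["Weekday", "Hour"]).size().rename("QueryCount").reset_index()
)
print(query_counts.sort_values("QueryCount").head())

# Top and bottom 10 users by number of queries issued.
by_user = df["MSOLAP_User"].value_counts()
print(by_user.head(10))
print(by_user.tail(10))
```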
You can also build a dashboard that analyzes the duration of each query. This can be used to see when the system is hit hard, whether query duration is related to the number of queries being sent to SSAS, or whether the server might just need a quick reboot! As you can see, this dashboard has a very similar look and feel to the screenshot shown above.
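A companion sketch, under the same assumptions as above, covers the duration view; the Duration column in the log is recorded in milliseconds.

```python
import pandas as pd
import pyodbc

# Same assumed OlapQueryLog source and placeholder connection as the previous sketch.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=MYSQLSERVER;DATABASE=SSASQueryLog;Trusted_Connection=yes;"
)

dur = pd.read_sql("SELECT StartTime, Duration FROM dbo.OlapQueryLog", conn)
dur["StartTime"] = pd.to_datetime(dur["StartTime"])
dur["Weekday"] = dur["StartTime"].dt.day_name()
dur["Hour"] = dur["StartTime"].dt.hour

# Aggregate duration (milliseconds) by weekday and hour.
duration_by_hour = (
    dur.groupby(["Weekday", "Hour"])["Duration"]
       .agg(AvgDurationMs="mean", TotalDurationMs="sum", QueryCount="count")
       .reset_index()
)

# Comparing QueryCount with AvgDurationMs shows whether slow periods line up
# with busy periods or point to a server-side problem instead.
print(duration_by_hour.sort_values("AvgDurationMs", ascending=False).head())
```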
The analysis that can be performed by this quick solution includes the following dimensions and measures (a sketch of how they could be derived from the log table follows the list):
Measures:
- Number of Queries
- Query Duration (milliseconds)
Dimensions:
- Date Hierarchy
- Hour (kept separate from the date dimension so it can be used for hour-by-hour analysis)
- User
- Cube (the cube to perform analysis on; this can be one or many)
- System (the system / server to perform analysis on; this can be one or many)
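Below is a minimal Python/pandas sketch of the ETL step that could populate these dimensions and measures from the log. It is not the exact solution used for the dashboards above; the connection details are placeholders, and it assumes the default OlapQueryLog columns and that MSOLAP_ObjectPath is a dot-separated path beginning with the server name.

```python
import pandas as pd
import pyodbc

# Placeholder connection; the table and column names below are the defaults
# SSAS creates for its query log (verify against your own OlapQueryLog).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=MYSQLSERVER;DATABASE=SSASQueryLog;Trusted_Connection=yes;"
)

log = pd.read_sql(
    "SELECT MSOLAP_User, MSOLAP_Database, MSOLAP_ObjectPath, StartTime, Duration "
    "FROM dbo.OlapQueryLog",
    conn,
)
log["StartTime"] = pd.to_datetime(log["StartTime"])

fact = pd.DataFrame({
    "Date":       log["StartTime"].dt.date,                        # feeds the date hierarchy
    "Hour":       log["StartTime"].dt.hour,                        # separate hour dimension
    "User":       log["MSOLAP_User"],
    "Cube":       log["MSOLAP_Database"],                          # or parse the cube name out of MSOLAP_ObjectPath
    "System":     log["MSOLAP_ObjectPath"].str.split(".").str[0],  # assumes the path starts with the server name
    "DurationMs": log["Duration"],
})

# Each row is one logged query, so "Number of Queries" is a row count and
# "Query Duration" is a sum or average of DurationMs.
```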