Optimize Your Power BI Dashboard Performance
Power BI is fast emerging as the go-to BI tool for many companies. Backed by Microsoft, it gives organizations a low-cost yet effective solution for their reporting needs. This blog looks at how to ensure that Power BI reports perform well enough to give end users a good experience.
Let us start by understanding what we would categorize as poor performance, or a “slow” report. A report can feel slow for several reasons:
The report takes a lot of time to refresh.
The visuals in the dashboard take too much time to load.
Power BI runs out of resources to compute the visual.
There are two types of refreshes: scheduled refresh and on-demand refresh. A scheduled refresh runs automatically at the times you select, while an on-demand refresh is triggered manually with the Refresh Now option on the dataset.
Some reports, especially finance-related ones, can be so voluminous that a scheduled refresh takes well over half an hour. Depending on when the tables are loaded, report users have to wait that long for the dashboards to update.
Scheduled refresh runs on shared capacity and is often observed to be slower than the Refresh Now option. However, it is not practical to click Refresh Now manually every day. To work around this limitation, we can use the Power BI REST APIs to refresh the datasets.
Using the Power BI REST API with our login credentials, and specifying the workspace and dataset, we can trigger a refresh that is faster than the scheduled refresh.
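A minimal sketch of such a trigger is below, using only the Python standard library. It assumes an Azure AD access token has already been acquired (token acquisition is a separate step, typically via MSAL); the workspace and dataset IDs are placeholders.

```python
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Build the REST endpoint that queues an on-demand refresh for a dataset."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(group_id: str, dataset_id: str, access_token: str) -> int:
    """POST to the refreshes endpoint; the service answers 202 Accepted on success."""
    request = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=b"{}",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example (placeholder IDs and token):
# trigger_refresh("your-workspace-id", "your-dataset-id", "<access-token>")
```

A script like this can be scheduled (for example with Task Scheduler or cron) to refresh the dataset at whatever cadence is needed, without anyone clicking Refresh Now.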
The performance of the visuals in a dashboard depends on various factors. To identify the root cause of a slow visual, we can use Performance Analyzer to see how long each visual takes to load, along with a breakdown of where that time goes.
Performance Analyzer can be accessed from the View ribbon.
It is recommended to open the report on a blank page and start recording in the Performance Analyzer pane. When you then switch to the page you want to measure, Performance Analyzer records the time taken by each visual.
The duration is shown in milliseconds and can be sorted to find the longest-running visual. Each entry can be expanded to see the breakdown of the load time (DAX query, visual display, and other), which gives an idea of how to proceed with optimizing the visual.
There are many ways to reduce this time. In our example, the visual “Average Payment Term by Month” takes 1.448 s to load, and the majority of that time falls under Other, which includes cross-filter interactions from other visuals.
To optimize the “Other” part, you could:
Limit the number of visuals on a page; four quadrants with a maximum of two tables/matrices is a reasonable upper bound.
Remove unnecessary interactions between visuals. If a table is never meant to cross-filter a bar chart, removing that interaction reduces the load time of the bar visual.
Limit the number of slicers, especially detailed ones. Each slicer fires two queries: one to fetch the data and one to fetch selection details.
To avoid firing a query for every individual selection on a slicer, which adds load to the report, enable the Apply button on slicers and filters so that the query runs only once, after all selections are made.
Most of the time, however, the time is spent in the DAX query itself, either because the DAX formulas are written without optimization or because the data set is very heavy.
The following points can help in optimizing a DAX query:
Use variables instead of repeating measures:
Incorrect DAX:
Growth = IF ([Total Sales LY] > 0, ([Total Sales TY] / [Total Sales LY]) - 1, 0)
Here [Total Sales LY] is evaluated twice.
Correct DAX:
Growth =
VAR ly_sales = [Total Sales LY]
RETURN
    IF ( ly_sales > 0, DIVIDE ( [Total Sales TY] - ly_sales, ly_sales, 0 ), 0 )
VAR stores the measure's value once; the variable is then referenced multiple times rather than the measure being recalculated each time, which would be wasteful.
Also, DIVIDE ([Total Sales TY] - ly_sales, ly_sales, 0) is safer than DIVIDE ([Total Sales TY], ly_sales, 0) - 1, because the latter returns -1 when ly_sales is 0. Since the IF condition already guards against ly_sales being 0 here, either form works in this measure.
Use DIVIDE () instead of the / operator:
The DIVIDE () function internally checks whether the denominator is zero; if it is, it returns the value specified in the optional third argument (or BLANK if none is given) instead of an error.
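The behavior of DIVIDE can be sketched in Python, with a hypothetical `divide` helper mirroring the DAX semantics (numerator, denominator, and an optional alternate result; `None` stands in for DAX's BLANK):

```python
def divide(numerator, denominator, alternate_result=None):
    """Mimic DAX DIVIDE(): return alternate_result (default None, i.e. BLANK)
    instead of raising an error when the denominator is zero."""
    if denominator == 0:
        return alternate_result
    return numerator / denominator

print(divide(10, 2))     # 5.0
print(divide(10, 0, 0))  # 0 -- the third argument, not an error
print(divide(10, 0))     # None (DAX would return BLANK)
```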
If the DAX is still too expensive to process, the data model itself can be adjusted to improve performance:
To optimize the dashboard further, we can precompute last-year values in the backend table instead of calculating them in DAX.
If multiple data sources and dimension tables exist, we can join them with a SQL query upstream and provide a single source of truth.
The main reasons for Power BI running out of resources are inefficient DAX queries and heavy data.
DAX query optimization was covered in the section above and will also be covered in the next part of this blog, so let us discuss data size optimization:
It is best to remove unnecessary columns, especially high-cardinality ones. For example, if the Line Order Number column (akin to a primary key) of a sales table is not used anywhere, removing it lets Power BI compress the data much more efficiently, since a column with a distinct value on every row barely compresses at all.
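Why cardinality matters can be illustrated with a toy dictionary encoding in Python. (Power BI's VertiPaq engine is far more sophisticated; this only shows why a unique-per-row column encodes poorly while a repetitive one encodes well. The column names are invented for the example.)

```python
def dictionary_encode(column):
    """Toy dictionary encoding: map each distinct value to an integer code,
    storing the column as (dictionary, list of codes)."""
    dictionary = {}
    codes = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        codes.append(dictionary[value])
    return dictionary, codes

n_rows = 10_000
# Low cardinality: a region column repeats a handful of values.
regions = ["North", "South", "East", "West"] * (n_rows // 4)
# High cardinality: a line-order-number column is unique on every row.
order_ids = [f"SO-{i:06d}" for i in range(n_rows)]

region_dict, _ = dictionary_encode(regions)
order_dict, _ = dictionary_encode(order_ids)
print(len(region_dict))  # 4 distinct values -> tiny dictionary
print(len(order_dict))   # 10,000 distinct values -> dictionary as large as the column
```

The high-cardinality column's dictionary holds every value in the column, so nothing is saved by encoding it; dropping such a column when it is unused shrinks the model far more than dropping a repetitive one.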
Power BI also creates hidden date tables for time intelligence. If you do not need them, this can be avoided by unchecking Auto date/time in the Options dialog (under Data Load).
The key learnings from this blog to improve performance of Power BI reports are summarized below:
Use Power BI Rest APIs to reduce dataset refresh time.
Narrow down the time-consuming dashboard elements and steps using Performance Analyzer.
Optimize DAX queries.
Remove high-cardinality columns that are not used in the dashboard.