Power BI version control with PowerShell


Objective

PowerShell is filled with interesting cmdlets that can be used to connect to the service, obtain useful information, and interact with Power BI reports, workspaces, datasets, tiles, etc., as well as with the Power BI REST APIs. This scripting language can be used to automate cumbersome tasks, and we will be using it to download reports from Power BI and go one step further by creating version control to keep track of them.

Version control helps in tracking the changes made to each file. Changes made to a report sometimes may not align with expectations or requirements, and the change may need to be rolled back. As documenting each and every change is painful, it is easier to maintain version control of reports and refer back to a previous version. Since multiple developers can work on the same file, and since the file published to the production workspace can be considered the final version of that report, the Power BI service itself can be used as the source of the final versions of each report.

As a tenant’s Power BI administrator, it is also useful to take stock of what kind of reports are being developed and published in the organization.

PowerShell cmdlets

In simple terms, built-in PowerShell cmdlets are used to obtain report information from each workspace, and a simple loop performs the export command repeatedly until all reports are downloaded.

PowerShell does not come pre-installed with the modules required to work and interact with Power BI objects, so they must be installed first:

Install-Module -Name MicrosoftPowerBIMgmt

The above command installs all the modules necessary to interact with Power BI objects.

Connect-PowerBIServiceAccount

The above command redirects you to log in to your Power BI account through the familiar login GUI. The account used to log in determines what information is available at execution: admins can access workspace details for the entire organization, including the My Workspace of individual users.
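As a sketch of this difference, the `-Scope` parameter of Get-PowerBIWorkspace controls whether the cmdlet returns only the caller's workspaces or, for admins, every workspace in the tenant:

```powershell
# As a regular user: only the workspaces the signed-in account belongs to
$myWorkspaces = Get-PowerBIWorkspace

# As a tenant admin: all workspaces in the organization
# (-All fetches every result instead of the default first page)
$allWorkspaces = Get-PowerBIWorkspace -Scope Organization -All
```

The rest of this article assumes the default individual scope, which is sufficient for workspaces the account is a member of.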

The bulk of the code

Building on the above commands, a simple loop iterates through the variables that store report information and uses the export command to download each Power BI report. The reports are saved into folders created on the fly, with PowerShell segregating the files based on the downloaded data.

The following code block shows how folders can be created and dropped easily, and, importantly, how to delete older versions that are no longer needed, since .pbix files can be heavy. Assuming the script runs at an interval of 7 days, the folder that is 21 days old with respect to the current run date is dropped during each run.

$PBIWorkspace = Get-PowerBIWorkspace
$date = Get-Date
$TodaysDate = $date.ToString("yyyy_MM_dd")
$OutPutPath = "D:\PowerShell\Reports" + $TodaysDate
$ErrorActionPreference = "SilentlyContinue"
#The interval of days between two consecutive runs of the script
$interval = 7
#The folder created 3 runs (21 days) ago, which is dropped during this run
$drop_folder = "D:\PowerShell\Reports" + $date.AddDays(-$interval*3).ToString("yyyy_MM_dd")
If(Test-Path $drop_folder)
{
    Remove-Item $drop_folder -Recurse -Force
}
Following is the main part of the code, which downloads the files through a for-loop:
#Loop through the workspaces in $PBIWorkspace
ForEach($Workspace in $PBIWorkspace)
{
    $Folder = $OutPutPath + "\" + $Workspace.Name
    #If the folder doesn't exist, it will be created.
    If(!(Test-Path $Folder))
    {
        New-Item -ItemType Directory -Force -Path $Folder
    }
    #Now loop through the reports in this workspace:
    $PBIReports = Get-PowerBIReport -WorkspaceId $Workspace.Id
    ForEach($Report in $PBIReports)
    {
        $OutputFile = $Folder + "\" + $Report.Name + ".pbix"
        #If the file exists, delete it first; otherwise Export-PowerBIReport will fail.
        If(Test-Path $OutputFile)
        {
            Remove-Item $OutputFile
        }
        #The .pbix is now really getting downloaded
        Export-PowerBIReport -WorkspaceId $Workspace.Id -Id $Report.Id -OutFile $OutputFile
    }
}

As each workspace holds multiple reports, it is useful to export the details of all these reports to a text or CSV file. Here, PowerShell can export the variables that store these details to a CSV file at the end of the run. This helps in summarizing the reports present in each workspace.
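Such a summary export could be sketched as follows; the property names used here (`Name`, `Id`, `WebUrl`) are those returned by the MicrosoftPowerBIMgmt cmdlets, while the output file name is an assumption:

```powershell
#Sketch: summarize all reports per workspace into a CSV file.
#Assumes $PBIWorkspace and $OutPutPath from the script above.
$summary = ForEach ($Workspace in $PBIWorkspace) {
    ForEach ($Report in (Get-PowerBIReport -WorkspaceId $Workspace.Id)) {
        [PSCustomObject]@{
            Workspace = $Workspace.Name
            Report    = $Report.Name
            ReportId  = $Report.Id
            WebUrl    = $Report.WebUrl
        }
    }
}
$summary | Export-Csv -Path ($OutPutPath + "\ReportSummary.csv") -NoTypeInformation
```

Running this at the end of each scheduled execution leaves a dated folder of .pbix files together with a CSV index of what was downloaded.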

Note: In some cases, it might be necessary to run PowerShell as an administrator.

Using a task scheduler to automate the script:

The PowerShell script can be automated with the Windows Task Scheduler. When Task Scheduler runs the script, it would stop and wait for user input to log in to the required account. Since the task must run unattended, we can use a silent connection, or “silent login”, so that the credentials are supplied automatically:

#Using a silent connection to the Power BI service account
$username = < insert username between double quotes >
$password = < insert password between double quotes > | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential

Task Scheduler

Windows Task Scheduler can be used to run the script at the required frequency. The following are some considerations when scheduling the task:

  • General: Assuming that the script would be run on a VM, it is important to select “Run whether user is logged on or not” for successful execution of the script.
  • Trigger: Task Scheduler offers plenty of settings for customizing when the script executes. Remember that if the script is designed to drop old files, the trigger frequency must match the interval assumed in the script.

New Trigger

  • Action: The task scheduler should be pointed to the location where the script is saved for it to be run.

 New Action
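The same task can also be registered from PowerShell instead of the GUI, using the built-in ScheduledTasks module. This is a sketch; the script path, task name, run time, and account are assumptions to be adjusted to your environment:

```powershell
#Sketch: register the weekly download task from PowerShell.
#The script path, account, and task name below are placeholders.
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File D:\PowerShell\DownloadReports.ps1"

#Run every 7 days, matching the $interval assumed in the script
$trigger = New-ScheduledTaskTrigger -Daily -DaysInterval 7 -At 3am

#S4U logon lets the task run whether the user is logged on or not
$principal = New-ScheduledTaskPrincipal -UserId "DOMAIN\svc_pbi" -LogonType S4U

Register-ScheduledTask -TaskName "PowerBI Version Control" `
    -Action $action -Trigger $trigger -Principal $principal
```

Registering the task in code keeps the General, Trigger, and Action settings described above reproducible across machines.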

What if I have MFA enabled for my email ID?

As organizations like to have MFA enabled for logging in, “silent login” through PowerShell will not work: an error is returned and the script fails. In such cases, a service principal [5] is used for automation, acting as a substitute for a user's credentials. This is a better method because the user's credentials are not exposed in any piece of code, it is free of cost, and it does not require allocating a new Power BI Pro or PPU account to the service principal.

A service principal can be created by registering an application in Azure AD [6]. This creates an identity that can log in to Power BI and download the required reports. The app requires certain Power BI permissions, and tenant-level admin settings must be changed to allow service principals to export reports. A security group can be created and the service principal added to it, so that these options stay enabled only for approved service principals [3]. Most importantly, the service principal must be a member of the workspace from which the reports are to be downloaded.

Here too, the code performs a “silent login”, but with the service principal's credentials:

#Silent login using the service principal
$username = < insert application (client) ID >
$password = < insert client secret > | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -TenantId < insert tenant ID > -ServicePrincipal -Credential $credential

References:

Written by:

Siddhartha Ravishankar

Power BI Developer
