Case
I want to develop my Power BI dataset separately from my Power BI reports. So we started using Tabular Editor. This also allows us to work on the model with multiple people at the same time because every object is a separate JSON file. But how can we automatically deploy these files using Azure DevOps?
Solution
In this series of posts we will deploy the Tabular Editor project as a Power BI dataset via DevOps. We will deploy it to the development workspace. For the deployment from Dev to Test and from Test to Prod we will use Power BI Deployment pipelines.
- Setup DevOps and security
- Building the project
- Publishing as dataset
In this second post we will create the YAML file that builds the Tabular Editor project stored in the repository. The file we used is named BuildModel.yml and is stored in the repository under CICD\YAML\BuildModel.yml. There are two additional YAML files in that same folder; those will be explained in the next post, but they are already referenced in some of the stages below.
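To give an idea of how everything fits together, the repository layout used in this example could look something like the sketch below. The dataset folder name is just illustrative; the deployment YAML file names come from the stages further down.

CICD
  YAML
    BuildModel.yml
    DeployPBIModelDev.yml
    DeployPBIModelAccPrd.yml
Models
  myPowerBIWorkspace
    MySalesModel
      database.json   (main/project file of the Tabular Editor folder structure)
      ...             (one JSON file per table, role, etc.)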
1) BuildModel.yml
The first YAML file is BuildModel.yml, which builds the JSON files of the Tabular Editor project into a deployable *.bim file (the dataset) and then calls the deployment YAML files to do the actual deployment.
The file starts by referencing the general variable group that contains the parameters used by all stages (see the previous post).

###################################
# General Variables
###################################
variables:
  - group: PBIModelParamsGen
Secondly we need to determine when to trigger this pipeline. This part of course heavily depends on your branch strategy. In this example it triggers on changes in multiple branches, but only for models in a certain Power BI workspace folder. Note that the path is hardcoded and cannot use variables from your variable groups.
###################################
# When to create a pipeline run
###################################
trigger:
  branches:
    include:
      - Development
      - Acceptance
      - main
  paths:
    include:
      - Models/myPowerBIWorkspace/*
The next step is describing the first stage of the pipeline, where we build the model and get it ready for publishing. The important property is pool, where you configure either a self-hosted agent (via name) or a Microsoft-hosted agent (via vmImage). If you are using a self-hosted agent, the clean: all setting helps you clean up the agent before you start.
stages:
###################################
# Create Artifact of datamodels
###################################
- stage: CreateModelArtifact
  displayName: Create PBI Model Artifact
  jobs:
  - job: CreateModel
    displayName: 'Create Model'
    workspace:
      clean: all
    pool:
      name: mySelfHostedAgent
      # vmImage: 'windows-latest' #'ubuntu-latest'
    steps:
The first real task is retrieving all the files from the repository. After this step your datasets/models will be available on the agent itself, which is necessary for the build step.
    ###################################
    # 1 Retrieve Repository
    ###################################
    - checkout: self
      displayName: '1 Retrieve Repository'
      clean: true
The next task is downloading and unzipping Tabular Editor. If you have a self-hosted agent you could also choose to download/install it on your virtual machine. However, installing it via YAML makes it very easy to switch to a newer version just by changing the 'DownloadLinkTabularEditor' variable in the general variable group (see previous post). That variable should be filled with the download URL of the portable/zip file: https://github.com/TabularEditor/TabularEditor/releases/
    ###################################
    # 2 Download Tabular Editor
    ###################################
    - powershell: |
        # Download URL for Tabular Editor portable:

        # Create Download folder
        $DownloadFolder = Join-Path (Get-Location) "TabularEditor"
        New-Item -Type Directory -Path $DownloadFolder -Force
        Write-Host "Create download location: $($DownloadFolder)"

        # Download destination (root of PowerShell script execution path):
        $DownloadFile = Join-Path $DownloadFolder "TabularEditor.zip"
        Write-Host "Save download as: $($DownloadFile)"

        # Download from GitHub:
        Write-Host "Downloading Tabular Editor from: $(DownloadLinkTabularEditor)"
        Invoke-WebRequest -Uri $(DownloadLinkTabularEditor) -OutFile $DownloadFile

        # Unzip Tabular Editor portable, and then delete the zip file:
        Write-Host "Extracting downloaded zip file"
        Expand-Archive -Path $DownloadFile -DestinationPath $DownloadFolder
        Write-Host "Clean up download file"
        Remove-Item $DownloadFile
      displayName: '2 Download Tabular Editor'
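As an illustration, the value of the DownloadLinkTabularEditor variable should point directly to the zip asset of a specific release rather than to the releases page itself. It could look something like the (hypothetical) URL below, where you fill in the version you want to pin:

https://github.com/TabularEditor/TabularEditor/releases/download/<version>/TabularEditor.Portable.zip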
To check whether the checkout step and the download/unzip step were successful, there is an optional step that shows a treeview of the agent. Once everything is working you can remove or comment out this step.
    ###################################
    # 3 Show treeview of agent
    ###################################
    - powershell: |
        tree "$(Pipeline.Workspace)" /F
      displayName: '3 Treeview of Pipeline.Workspace'
Now that we have successfully copied the models to the agent and downloaded Tabular Editor, we can build the JSON files into a BIM file which we can later release as a dataset to a Power BI workspace.
This PowerShell step loops through all database.json files (the main/project file) that are on the agent. It retrieves the subfolder/parent folder names so that they can be used as the dataset name and workspace name, and then uses TabularEditor.exe to build the project. All BIM files are stored in the artifact staging folder.
    ###################################
    # 4 Build data models
    ###################################
    - powershell: |
        # Get repos folder with model files
        $ModelsFolder = Join-Path -Path $(Build.SourcesDirectory) -ChildPath "\Models\$($ReposWorkspaceFolder)\"

        # Loop through local repos folder to get all datamodels
        $AllDatamodels = Get-ChildItem -Path $ModelsFolder -Recurse -Filter database.json
        Write-Host "$($AllDatamodels.Count) model$(If ($AllDatamodels.Count -ne 1) {"s"}) found in $($ModelsFolder)`n"

        foreach ($DataModel in $AllDatamodels)
        {
            $Path = "$($DataModel.Fullname)"
            Write-Host "Building model $($Path)"

            # Extract datasetname and workspacename from folder path of model
            $DatasetName = $Path | Split-Path -Parent | Split-Path -Leaf
            $WorkspaceName = $Path | Split-Path -Parent | Split-Path -Parent | Split-Path -Leaf

            # Create target path for staging folder
            $TargetFolder = Join-Path -Path $(Build.ArtifactStagingDirectory) -ChildPath "\Models\"
            $TargetFolder = Join-Path -Path $TargetFolder -ChildPath $WorkspaceName
            $TargetFolder = Join-Path -Path $TargetFolder -ChildPath $DatasetName
            Write-Host "Saving build model in $($TargetFolder)"

            # Build model into bim file by executing tabular editor
            $(Build.SourcesDirectory)\TabularEditor\TabularEditor.exe "$($DataModel.Fullname)" -B "$($TargetFolder)\Model.bim"
            Wait-Process -Name "TabularEditor"
        }
      displayName: '4 Build data models'
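If you want to verify a model locally before committing, you can run the same Tabular Editor command on your own machine. The paths below are only an example and assume the portable version is unzipped to C:\Tools\TabularEditor:

C:\Tools\TabularEditor\TabularEditor.exe "C:\Repos\Models\myPowerBIWorkspace\MySalesModel\database.json" -B "C:\Temp\MySalesModel\Model.bim"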
This is again the same treeview step to check the result of the build. In this case it shows all files and subfolders of the artifact staging folder. You can remove or comment-out this step once everything works.
    ###################################
    # 5 Show treeview of agent
    ###################################
    - powershell: |
        tree "$(Build.ArtifactStagingDirectory)" /F
      displayName: '5 Treeview of Build.ArtifactStagingDirectory'
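After step 4 the artifact staging folder should contain one Model.bim per dataset, organized by workspace and dataset name, for example (illustrative names):

Models\myPowerBIWorkspace\MySalesModel\Model.bim
Models\myPowerBIWorkspace\MyFinanceModel\Model.bim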
The last step of the build stage is creating an artifact of all (BIM) files in the artifact staging folder. This artifact will automatically be downloaded to the agent in the next stage.
    ###################################
    # 6 Publish artifact
    ###################################
    - task: PublishPipelineArtifact@1
      displayName: '6 Publish PBI models as artifact'
      inputs:
        targetPath: $(Build.ArtifactStagingDirectory)
        artifact: 'PBIModelsArtifact'
        publishLocation: 'pipeline'
Now it's time to do the actual deployment. We have three extra stages (Deploy to Dev, Acc and Prd), but we added a condition to each of them. If a change happened in the Acceptance branch, only the Deploy to Acc stage will be executed and the other deployment stages will be skipped (in the pipeline runs overview the skipped stages will show up as skipped).
In this example the deployment method for Development is a little PowerShell script, while for Acceptance and Production we will use the Power BI Deployment pipeline. Therefore those stages have different parameters. The details of those YAML files will be explained in the next blog post.
###################################
# Deploy Dev environment
###################################
- stage: DeployDev
  displayName: Deploy DEV
  variables:
    - group: PBIModelParamsDev
  pool:
    name: mySelfHostedAgent
    #vmImage: 'windows-latest'
  condition: eq(variables['Build.SourceBranchName'], 'Development')
  jobs:
  - template: DeployPBIModelDev.yml
    parameters:
      env: DEV
      SpApplicationId: $(SpApplicationId)
      SpClientSecret: $(SpClientSecret)
      SpTenantId: $(SpTenantId)
      DownloadLinkTabularEditor: $(DownloadLinkTabularEditor)
###################################
# Deploy Acc environment
###################################
- stage: DeployAcc
  displayName: Deploy ACC
  variables:
    - group: PBIModelParamsAcc
  pool:
    name: mySelfHostedAgent
    #vmImage: 'windows-latest'
  condition: eq(variables['Build.SourceBranchName'], 'Acceptance')
  jobs:
  - template: DeployPBIModelAccPrd.yml
    parameters:
      env: ACC
      DeployStagePowerBI: Test # PBI doesn't use Acceptance
      ServiceConnection: 'PowerBI'
      ReposWorkspaceFolder: $(ReposWorkspaceFolder)
      DeploymentPipelinePBI: $(DeploymentPipelinePBI)

###################################
# Deploy Prd environment
###################################
- stage: DeployPrd
  displayName: Deploy PRD
  variables:
    - group: PBIModelParamsPrd
  pool:
    name: mySelfHostedAgent
    #vmImage: 'windows-latest'
  condition: eq(variables['Build.SourceBranchName'], 'main')
  dependsOn: CreateModelArtifact
  jobs:
  - template: DeployPBIModelAccPrd.yml
    parameters:
      env: PRD
      DeployStagePowerBI: Production
      ServiceConnection: 'PowerBI'
      ReposWorkspaceFolder: $(ReposWorkspaceFolder)
      DeploymentPipelinePBI: $(DeploymentPipelinePBI)

Conclusion
In this post we showed how to build a Tabular Editor project consisting of JSON files into one BIM file, which we can then deploy to the Development workspace in Power BI. You could do the same for the other environments, but in this example we will use the Power BI Deployment pipeline for Test and Production, which you will see in the next post.