Thursday 31 October 2019

Databases in DevOps - Build Pipeline

Case
Before we can deploy our database project through the DTAP environment, we need to build it to check whether the merged code of all developers still works. The result of the build is a dacpac file, which can be used for the deployment. But what is a dacpac file and how do we create it in Azure DevOps?
Build Pipeline

Solution
First some background: dacpac stands for Data-tier Application Component Package. It is a single file containing a database model that is equal to the database project in Visual Studio; it contains all the information needed to create a database. You can rename the file from .dacpac to .zip to see its contents (typically model.xml with the schema definition, plus a few metadata files).


1) Create empty build pipeline
There are several pipeline templates that you can use, but for this example we will start with an empty build pipeline.
  • Go to Pipelines (Builds) in the left menu
  • Click on New to create a New Build Pipeline
  • Use the classic editor link at the bottom
  • Select Azure Repos Git as the source
  • Then select the Team project, Repository and the default branch
  • Next choose an Empty job
Create empty Build Pipeline

Now give the new build pipeline a useful name. Next, choose the Agent pool. For this example we will use the default Microsoft-hosted pool: Azure Pipelines. The agent for this pipeline will be 'vs2017-win2016'.
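As an aside: if you prefer YAML pipelines over the classic editor, the same agent pool choice would look roughly like this (a minimal sketch, not part of the classic setup):

    # Use a Microsoft-hosted agent from the Azure Pipelines pool
    pool:
      vmImage: 'vs2017-win2016'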

2) Add trigger
To 'Enable continuous integration' for the database project we will need to add a trigger. For this example we will use a branch filter on 'master' and a path filter on the path of this database project in Git. This trigger will automatically start the build of the database project when someone changes code in the master branch for this particular project (ideally via a pull request from a feature branch to the master branch).
  • Go to the Triggers tab
  • Enable continuous integration by checking the checkbox
  • First set the branch filter to: Include and master (or the branch you want to use)
  • Click on + Add below Path filters
  • Set the path filter to: Include and the path of your project in Git: DB/HST/HST_BB/*
Add trigger
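In a YAML pipeline, the equivalent CI trigger would look roughly like this (a sketch using the example branch and path from above):

    # Build automatically when the master branch changes
    # within the folder of this database project
    trigger:
      branches:
        include:
        - master
      paths:
        include:
        - DB/HST/HST_BB/*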

3) Builds tasks: Build solution
Now we need to add tasks to the Agent job generated in step 1. The first task is MSBuild, which will build the Visual Studio database project. Once it succeeds, this will generate a dacpac file which will be used by the next tasks.
  • Click on Agent job 1 and optionally change its name
  • Click on the + icon to add a task to the Agent job
  • Search for msbuild in the search box
  • Select MSBuild (Build with MSBuild) and click Add
  • The only property you need to set is the file path of the .sqlproj file in Git: DB/HST/HST_BB/HST_BB.sqlproj
Add MSBuild task
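The YAML counterpart of this task would look roughly like this (a sketch assuming the example project path):

    # Build the database project; this produces the dacpac file
    - task: MSBuild@1
      inputs:
        solution: 'DB/HST/HST_BB/HST_BB.sqlproj'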

4) Builds tasks: Copy dacpac and publish files
With this task we will copy the result of the build task (a dacpac file) to the Artifact Staging Directory. We need to specify which files we want to copy and to which folder. For the target folder we will use a predefined variable pointing to the right folder. For the files we will use a wildcard filter. Note that we copy not only the dacpac file, but also the publish profile XML file.

  • Click on Agent job 1 (or the new name you provided in the previous step)
  • Click on the + icon to add a second task to the Agent job
  • Search for 'copy files' in the search box
  • Select Copy files (Copy files from .... folder paths) and click Add
  • In the contents field add two rows:
    • **\*.dacpac
    • **\*.publish.xml
  • As target folder add: $(Build.ArtifactStagingDirectory)
Copy dacpac to Artifact Staging folder
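In YAML the same copy step would look roughly like this:

    # Copy the dacpac and the publish profile
    # to the Artifact Staging Directory
    - task: CopyFiles@2
      inputs:
        Contents: |
          **\*.dacpac
          **\*.publish.xml
        TargetFolder: '$(Build.ArtifactStagingDirectory)'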

5) Builds tasks: Publish build artifact
The last step of the build pipeline is to publish all files in the Artifact Staging Directory and give them an Artifact Name. This Artifact Name will be used in the Release pipeline, which we will explain in a subsequent post.
  • Click on Agent job 1 (or the new name)
  • Click on the + icon to add a third task to the Agent job
  • Search for 'Publish build artifact' in the searchbox
  • Select Publish build artifact (Publish build .... file share) and click Add
  • Give the artifact a new name (instead of drop)
Publish the artifacts
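And the rough YAML counterpart of this publish step (the artifact name 'HST_BB' is just an example; pick your own instead of 'drop'):

    # Publish the staged files under an artifact name
    # that the release pipeline will refer to
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'HST_BB'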

Note: in this case the artifact is a dacpac file, but it could also be an ispac file for SSIS or an assembly for C#.

6) Test the pipeline
Now it's time to test your build pipeline by committing some changes to your master branch via Visual Studio (or by creating a pull request to merge changes into your master branch). This will trigger an automatic build. If you don't want to make any changes at this moment, you could just click on (Save &) Queue in DevOps to start a build manually.
Dry run your build pipeline

Summary
In this post you saw how to build your database project automatically when someone changes something in the master branch (for example via a pull request). You don't have to check the result manually if you set up notifications in DevOps; then you will receive an email when a build succeeds or fails.

Besides the master branch you could also build other branches, like feature or personal branches, by using different filters. Or you could schedule your build to run each night (see the sketch below). After setting up this basic example, explore and try out all the other options of the build pipeline. In a follow-up post we will show you how to deploy this build artifact through your DTAP environment using Continuous Deployment.
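To give an idea of that nightly build: in a YAML pipeline a schedule would look roughly like this (the cron expression is just an example and uses UTC):

    # Build the master branch every night at 02:00 UTC
    schedules:
    - cron: '0 2 * * *'
      displayName: Nightly build
      branches:
        include:
        - master
      always: false  # only run when the code has changed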

Monday 21 October 2019

Flow Snack: Refresh Power BI

Case
Exciting news from the Microsoft Power Platform: you can now use Microsoft Flow to refresh a Power BI dataset. How does it work and why would you want to use it?
Refresh a Power BI dataset

Solution
Until now you could schedule your refresh or do a manual refresh in Power BI, but now you can also use Flow to trigger a refresh when new data arrives: for example, when a new source file arrives in OneDrive, Dropbox, SharePoint or an Azure Blob Storage container. The list of triggers in Flow is nearly endless and the number of possible sources in Power BI is growing steadily, so most likely one of these triggers fits your needs.


1) Choose your trigger
First choose a trigger to refresh your dataset. In this example the trigger is 'When a blob is added or modified' in the Azure Blob Storage container.
Trigger when new data arrives in a Blob Storage container

2) Search for Power BI
Add a new action, search for Power BI and select 'Refresh a dataset'. Note that this action is still in preview and could change.
Refresh a Power BI dataset

3) Choose Workspace and Dataset
Now you must select your workspace and then the dataset in that workspace. After that, save your Flow, check whether it is turned on, and then start testing it by adding new files.
Select Workspace and Dataset that needs to be refreshed

Summary
An exciting new feature that lets you use all kinds of triggers to refresh your Power BI dataset. Note that the refresh limitations still apply: for Pro you have a maximum of 8 refreshes a day, and that includes any scheduled refreshes.