Monday, 18 November 2019

Runbook snack: read parameters from Data Factory

Case
I want to pass parameters from an Azure Data Factory pipeline web(hook) activity to my Azure Automation Runbook. How do I do that?
Supplying parameters via the Body property in Web(hook) activity

Solution
You cannot pass a single parameter and then read that particular parameter with PowerShell. Instead you have to read all the data coming from the webhook as one big object parameter. That object contains a Body property, which in turn contains a JSON message with the parameter names and values.

1) JSON Message
The parameter 'ContainerName' will be provided in Azure Data Factory (ADF) via a JSON message that looks like this:
{"ContainerName":"OfficeData"}
2) Coding
In an Azure Automation Runbook you can start your code with the following snippet, which retrieves the value from the JSON message sent by ADF. In this example it gets the name of a Blob Storage container.
# PowerShell code

# Parameters
Param
(
    # Get webhook data via webhook
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Get single parameter from the set of parameters
$ContainerName = $Parameters.ContainerName

# Write the value of the parameter
Write-Output "The value of the container parameter is $($ContainerName)"
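
If you want to try out this parsing logic outside Azure Automation, you can simulate the payload locally. Below is a minimal sketch; the fake object is just an assumption of what the webhook data roughly looks like, not something ADF sends you.
# PowerShell code (local test sketch)

# Fake webhook data: only the RequestBody property is simulated here
$FakeWebhookData = [pscustomobject]@{ RequestBody = '{"ContainerName":"OfficeData"}' }

# Same parsing logic as in the runbook above
$Parameters = ConvertFrom-Json -InputObject $FakeWebhookData.RequestBody
$ContainerName = $Parameters.ContainerName

# Should output: The value of the container parameter is OfficeData
Write-Output "The value of the container parameter is $($ContainerName)"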

3) Adding Webhook
Now that the code is ready, you have to create a Webhook for ADF. Make sure to choose an appropriate expiration date; by default it is only valid for one year. Also make sure to copy the URL, because you can only see and copy it once. You need this URL in ADF for the Web(hook) activity. Don't share the URL, because it acts as URL and password in one.
Adding a Webhook to a Runbook

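The portal works fine for a one-off webhook, but you can also create one with PowerShell. Below is a sketch using the Az.Automation module; the resource names are assumptions and you need an authenticated session (Connect-AzAccount) first.
# PowerShell code (sketch, resource names are assumptions)
$Webhook = New-AzAutomationWebhook `
    -ResourceGroupName 'myResourceGroup' `
    -AutomationAccountName 'myAutomationAccount' `
    -RunbookName 'myRunbook' `
    -Name 'myAdfWebhook' `
    -IsEnabled $true `
    -ExpiryTime (Get-Date).AddYears(2) `
    -Force

# Copy this URI right away, it cannot be retrieved afterwards
$Webhook.WebhookURI
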
4) Azure Data Factory
In the ADF pipeline you can use the Web(hook) activity to call the Webhook and use a JSON message in the Body property to provide a value for a parameter. The URL is the Webhook URL that you copied in the previous step. For Method you need to select "POST" and in the Body you can enter the JSON message from step 1.
Supplying parameters via the Body property in Web(hook) activity

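For testing purposes you can mimic what the Web(hook) activity does with a simple POST from PowerShell. A sketch below; the URL is a placeholder for the Webhook URL you copied in step 3.
# PowerShell code (test sketch, URL is a placeholder)
$WebhookUrl = 'https://<your-region>.webhook.azure-automation.net/webhooks?token=<your-token>'
$Body = '{"ContainerName":"OfficeData"}'

# Azure Automation answers with the id(s) of the runbook job that was started
Invoke-RestMethod -Method Post -Uri $WebhookUrl -Body $Body -ContentType 'application/json'
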
Conclusion
Perhaps a bit strange, but this seems to be the only way when using webhooks for your runbook and calling them from ADF. As an alternative you could provide the parameter values in the Webhook itself when you create it, instead of supplying them via ADF, but that is not very flexible if you expect a lot of different values. The code for retrieving parameters from the Webhook itself looks like this:
# PowerShell code

# Parameters
Param
(
    # Get parameter via webhook
    [Parameter(Mandatory=$False,Position=1)]
    [string] $ContainerName
)

# Write the value of the parameter
Write-Output "The value of the container parameter is $($ContainerName)"

Sunday, 17 November 2019

Runbook snack: Using the Azure Run As Connection

Case
I want to manage Azure resources via a runbook (for example upscaling, downscaling, pausing and resuming resources). How do I provide authentication to do this?
Automation Account

Solution
For this we can use an Azure Run As account, which gives the Automation Account the Contributor role at the subscription level. However, this also means that you need to be an Application Administrator in Azure Active Directory or an Owner of the subscription to create an Azure Run As account. If you do not have sufficient rights, the Create Azure Run As account option will be grayed out when you create a new Automation Account.
Insufficient permissions

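If you want to verify afterwards that the Run As account really received the Contributor role, a quick check with the Az modules could look like the sketch below (the subscription id is a placeholder).
# PowerShell code (sketch, subscription id is a placeholder)
Get-AzRoleAssignment -Scope '/subscriptions/00000000-0000-0000-0000-000000000000' |
    Where-Object { $_.RoleDefinitionName -eq 'Contributor' } |
    Select-Object DisplayName, ObjectType, RoleDefinitionName
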
Now you can start your PowerShell runbook with the following code, which allows it to manage the various Azure resources. The first example is for the older PowerShell RM modules and the second example is for the newer PowerShell AZ modules.
# PowerShell code
########################################################
# Log in to Azure with RM (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName        
 
    'Log in to Azure...'
    $null = Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# test code getting all resource groups
Get-AzureRmResourceGroup

Nearly the same code, but now with AZ modules.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName        
 
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Test code getting all resource groups
Get-AzResourceGroup

If you used the Create Azure Run As account option while creating the Automation Account, the connection name will be AzureRunAsConnection. Otherwise you can find the name under Connections in the Azure Automation Account.
Connections in Azure Automation

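You can also list the available connections with PowerShell. A sketch, assuming the Az.Automation module and made-up resource names:
# PowerShell code (sketch, resource names are assumptions)
Get-AzAutomationConnection `
    -ResourceGroupName 'myResourceGroup' `
    -AutomationAccountName 'myAutomationAccount' |
    Select-Object Name, ConnectionTypeName
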
Conclusion
This code, in combination with creating the Azure Run As account, allows you to perform all kinds of operations on Azure resources, like up- and downscaling Azure SQL Databases or pausing and resuming Azure Analysis Services. The only problem could be insufficient rights to create this. In that case you could ask someone with sufficient rights to create the Automation Account with the Azure Run As account option turned on, and then create the runbooks yourself.

Note that support for the RM modules will end in December 2020, so don't wait too long to convert your scripts to AZ.
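
While converting, a temporary bridge could be to enable the AzureRm aliases so that old cmdlet names resolve to their Az equivalents. This is only a sketch and assumes the Az modules are already imported in your Automation Account.
# PowerShell code (sketch)
Enable-AzureRmAlias -Scope Process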

Thursday, 31 October 2019

Databases in DevOps - Build Pipeline

Case
Before we can deploy our database project through the DTAP environment we need to build it to check whether the merged code of all developers still works. The result of the build is a dacpac file which can be used for the deployment. But what is a dacpac file and how do we create it in Azure DevOps?
Build Pipeline

Solution
First, Dacpac stands for Data-tier Application Component Package. It is a single file containing a database model that is equal to the database project in Visual Studio. It contains all the information needed to create the database. You can rename it from .dacpac to .zip to see the content of the file.
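
If you are curious, a small sketch to peek inside a dacpac with PowerShell (the paths are assumptions):
# PowerShell code (sketch, paths are assumptions)
Copy-Item -Path 'C:\Temp\HST_BB.dacpac' -Destination 'C:\Temp\HST_BB.zip'
Expand-Archive -Path 'C:\Temp\HST_BB.zip' -DestinationPath 'C:\Temp\HST_BB'

# Typically shows model.xml, DacMetadata.xml, Origin.xml and [Content_Types].xml
Get-ChildItem -Path 'C:\Temp\HST_BB'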


1) Create empty build pipeline
There are several pipeline templates that you can use, but for this example we will start with an empty build pipeline.
  • Go to Pipelines (Builds) in the left menu
  • Click on New to create a New Build Pipeline
  • Use the classic editor link on the bottom
  • Select Azure Repos Git as the source
  • Then select the Team project, Repository and the default branch
  • Next choose an Empty job
Create empty Build Pipeline

Now give the new Build pipeline a useful name. Next is choosing the Agent pool. For this example we will use the default Microsoft-hosted pool: Azure Pipelines. The agent for this pipeline will be 'vs2017-win2016'.

2) Add trigger
To 'Enable continuous integration' for the database project we need to add a trigger. For this example we will use a branch filter on 'master' and a path filter on the path of this database project in Git. This trigger will automatically start the build of the database project when someone changes code in the master branch for this particular project (if done right, via a pull request from a feature branch to the master branch).
  • Go to the Triggers tab
  • Enable continuous integration by checking the checkbox
  • First set the branch filter to: Include and master (or the branch you want to use)
  • Click on + Add below Path filters
  • Set the path filter to: include and the path of your project in Git: DB/HST/HST_BB/*
Add trigger

3) Builds tasks: Build solution
Now we need to add tasks to the Agent job generated in step 1. The first task is MSBuild, which will build the Visual Studio database project. Once succeeded, this will generate a dacpac file which will be used by the next tasks.
  • Click on Agent job 1 and optionally change its name
  • Click on the + icon to add a task to the Agent job
  • Search for msbuild in the search box
  • Select MSBuild (Build with MSBuild) and click Add
  • The only property you need to set is the file path of the .sqlproj file in Git: DB/HST/HST_BB/HST_BB.sqlproj
Add MSBuild task

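Before committing, you can do roughly the same build locally from PowerShell to check that the project compiles to a dacpac. This is only a sketch and assumes msbuild.exe is on your PATH (for example via a Visual Studio Developer prompt with SSDT installed) and that the repository is cloned under C:\Repos.
# PowerShell code (local build sketch, paths are assumptions)
& msbuild.exe 'C:\Repos\DB\HST\HST_BB\HST_BB.sqlproj' /p:Configuration=Release

# The dacpac is then typically written to:
# C:\Repos\DB\HST\HST_BB\bin\Release\HST_BB.dacpac
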
4) Builds tasks: Copy dacpac and publish files
With this task we will copy the result of the build task (the dacpac file) to the Artifact Staging Directory. We need to specify which files we want to copy and to which folder. For the target folder we will use a predefined variable pointing to the right folder. For the files we will use a wildcard filter. Note that we not only publish the dacpac file, but also the publish profile XML file.

  • Click on Agent job 1 (or the new name you provided in the previous step)
  • Click on the + icon to add a second task to the Agent job
  • Search for 'copy files' in the search box
  • Select Copy files (Copy files from .... folder paths) and click Add
  • In the contents field add two rows:
    • **\*.dacpac
    • **\*.publish.xml
  • As target folder add: $(Build.ArtifactStagingDirectory)
Copy dacpac to Artifact Staging folder

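For a feeling of what this task does, a rough local equivalent in PowerShell could look like the sketch below (the paths are assumptions; in the pipeline the target is the predefined $(Build.ArtifactStagingDirectory) variable).
# PowerShell code (sketch, paths are assumptions)
$Staging = 'C:\Temp\ArtifactStaging'   # stand-in for $(Build.ArtifactStagingDirectory)
New-Item -ItemType Directory -Path $Staging -Force | Out-Null

# Copy all dacpac and publish profile files, like the two wildcard rows above
Get-ChildItem -Path 'C:\Repos\DB\HST\HST_BB' -Recurse -Include '*.dacpac', '*.publish.xml' |
    Copy-Item -Destination $Staging
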
5) Builds tasks: Publish build artifact
The last step of the build pipeline is to publish all files in the Artifact Staging Directory and give them an Artifact Name. This Artifact Name will be used in the Release pipeline which we will explain in a subsequent post.
  • Click on Agent job 1 (or the new name)
  • Click on the + icon to add a third task to the Agent job
  • Search for 'Publish build artifact' in the search box
  • Select Publish build artifact (Publish build .... file share) and click Add
  • Give the artifact a new name (instead of drop)
Publish the artifacts

Note: in this case the artifact is a dacpac file, but it could also be an ispac file for SSIS or an assembly for C#.

6) Test the pipeline
Now it's time to test your build pipeline by committing some changes to your master branch via Visual Studio (or by creating a pull request to merge changes into your master branch). This will trigger an automatic build. If you don't want to make any changes at this moment, you can just click on (Save &) Queue in DevOps to start a build manually.
Dry run your build pipeline

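If you prefer the command line over Visual Studio, triggering the build could look like the sketch below. It assumes git is installed and that you are allowed to push directly to master; otherwise create a pull request instead.
# PowerShell code (sketch, assumes git is installed)
git checkout master
git pull

# make a small change to a file under DB/HST/HST_BB, then:
git add .
git commit -m "Trigger CI build"
git push
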
Summary
In this post you saw how to build your database project automatically when someone changes something in the master branch (for example via a pull request). You don't have to check the result manually if you set up notifications in DevOps; then you will receive an email when a build succeeds or fails.

Besides the master branch you could also build other branches, like feature or personal branches, by using different filters. Or you could schedule your build to run each night. After setting up this basic example, explore and try out all the other options of the build pipeline. In a follow-up post we will show you how to deploy this built artifact through your DTAP environment using Continuous Deployment.

Monday, 21 October 2019

Flow Snack: Refresh Power BI

Case
Exciting news from the Microsoft Power Platform: you can now use Microsoft Flow to refresh a Power BI dataset. How does it work and why would you want to use it?
Refresh a Power BI dataset

Solution
Until now you could schedule your refresh or do a manual refresh in Power BI, but now you can also use Flow to trigger a refresh when new data arrives, for example when a new source file arrives in OneDrive, DropBox, SharePoint or an Azure Blob Storage container. The list of triggers in Flow is nearly endless and the number of possible sources in Power BI is growing steadily, so most likely one of these sources fits your needs.


1) Choose your trigger
First choose a trigger to refresh your dataset. In this example the trigger is 'When a blob is added or modified' in the Azure Blob Storage container.
Trigger when new data arrives in a Blob Storage container

2) Search for Power BI
Add a new action and search for Power BI and select Refresh a dataset. Note that it is still a Preview version and could change.
Refresh a Power BI dataset

3) Choose Workspace and Dataset
Now you must select your workspace and then the dataset in that workspace. After that, save your Flow, check whether it is turned on and then start testing it by adding new files.
Select Workspace and Dataset that needs to be refreshed

Summary
An exciting new feature where you can use all kinds of triggers to refresh your Power BI dataset. Note that the refresh limitations still apply: for Pro you have a maximum of eight refreshes a day, including any scheduled refreshes.


Thursday, 12 September 2019

Databases in DevOps - Pre-Deployment scripts

Case
I'm using database projects combined with Azure DevOps CI/CD and I want to deploy data for some reference tables through my DTAP environment. Another reason could be that you want to do some data preparation before deploying all database changes. How do I do that in Visual Studio database projects?
Adding Pre- and Post-Deployment scripts

Solution
This is where Pre-Deployment and Post-Deployment scripts can help you out. It's all in the name: a Pre-Deployment script will be executed before deploying the database project and a Post-Deployment script will be executed after that deployment. Both are just regular .sql files containing T-SQL code.

For example, a Pre-Deployment script can be used to secure data in a temporary table before a table change. A Post-Deployment script can then be used to copy the data back to the appropriate tables and clean up the temporary table afterwards.

Deploying data through the DTAP environments is nothing more than a delete statement followed by some insert queries. If the reference table already exists you can use a Pre-Deployment script, otherwise you should use a Post-Deployment script.

1) Creating a deployment script
To keep our database project clean we will first create a folder called 'Scripts' to store the Pre-Deployment and Post-Deployment scripts. If you have a lot of scripts you could also create two separate folders.

To add scripts to your project, right-click the new folder in the project and choose: Add, New Item..., User Scripts and then Pre-Deployment Script or Post-Deployment Script. Then give the file a suitable name.
Adding Pre- or Post-Deployment scripts

Note: check the Build Action property of the deployment file. This is where you determine the purpose and execution moment of a regular '.sql' file. However, you can have only one Pre-Deployment and one Post-Deployment file per project. There is a workaround below if you need multiple files.

2) Adding code
Now you can add your T-SQL code, but make sure it is repeatable (in case of a crash or when deploying multiple times). So if you want to create a temporary table, make sure it doesn't already exist by adding a drop-if-exists construction before the create statement.

Below is a script that pushes data through the DTAP environment for a reference table.
/*
Post-Deployment Script Template       
--------------------------------------------------------------------------------------
 This file contains SQL statements that will be appended to the build script.  
 Use SQLCMD syntax to include a file in the post-deployment script.   
 Example:      :r .\myfile.sql        
 Use SQLCMD syntax to reference a variable in the post-deployment script.  
 Example:      :setvar TableName MyTable       
               SELECT * FROM [$(TableName)]     
--------------------------------------------------------------------------------------
*/

DELETE FROM [ref].[myTable]
GO
INSERT [ref].[myTable] ([Column1], [Column2], [Column3]) VALUES ('bla 1', 'bla 2', 'bla 3')
GO
INSERT [ref].[myTable] ([Column1], [Column2], [Column3]) VALUES ('bla a', 'bla b', 'bla c')
GO

If you have a big deployment script and you want to divide the code over multiple files, then you have to create one 'master' script that calls the other 'child' scripts, because you can only have one Pre-Deployment and one Post-Deployment script in a database project. For this example we added multiple child scripts in the same folder as the parent script and called them by adding ":r .\" in front of the file names.
/*
Post-Deployment Script Template       
--------------------------------------------------------------------------------------
 This file contains SQL statements that will be appended to the build script.  
 Use SQLCMD syntax to include a file in the post-deployment script.   
 Example:      :r .\myfile.sql        
 Use SQLCMD syntax to reference a variable in the post-deployment script.  
 Example:      :setvar TableName MyTable       
               SELECT * FROM [$(TableName)] 
      
There can be only one post-deployment script per database! (One script to rule them all...)
--------------------------------------------------------------------------------------
*/

:r .\PostDeploymentFile1.sql
:r .\PostDeploymentFile2.sql
:r .\PostDeploymentFile3.sql

Conclusion
In this post you learned how to add scripts that run before and after the general deployment of the database project. A good way to either add some reference data or to do some cleanup before you deploy your changes. Next, take a look at the publish profile file for your SQL DB project.