Tuesday, 31 December 2019

Schedule start & stop of Azure Analysis Services (Az)

Case
To save some money on my Azure Bill, I want to pause my Azure Analysis Services (AAS) at night when nobody is using it and then resume it in the morning. How do you arrange that in Azure?
Save some money on your Azure Bill by pausing AAS

Solution
A few years ago we showed you how to do this with some PowerShell code in an Azure Automation Runbook using the AzureRM modules. However, these old modules will be out of support by the end of 2020, so now it is time to update those scripts.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.AnalysisServices
Before we start writing some code we need to add a PowerShell module called Az.AnalysisServices. This module contains the cmdlets we need in our code to pause and resume Azure Analysis Services. But first we need to add Az.Accounts because Az.AnalysisServices depends on it.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzAnalysisServicesServer : The term 'Get-AzAnalysisServicesServer' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.AnalysisServices (but wait until Az.Accounts is actually imported)
  • Click on Az.AnalysisServices in the result and import this module
Adding a new module to your Automation Account

Note: if you are using an existing Automation Account then you probably already added Az.Accounts.
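Importing via the portal works fine, but you could also script it with the Az.Automation cmdlets. Below is a minimal sketch; the Resource Group and Automation Account names are just examples, and you still need to let the Az.Accounts import finish before importing the dependent module.
# PowerShell code snippet
# Import modules from the PowerShell Gallery into the Automation Account
# (resource group and account names are examples)
New-AzAutomationModule -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -Name "Az.Accounts" -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/Az.Accounts"

# Wait until Az.Accounts shows as 'Available' before running this one
New-AzAutomationModule -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -Name "Az.AnalysisServices" -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/Az.AnalysisServices"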

3) Create Runbook
Now we are ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook like StartStopAas
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create a Runbook

4) Edit Runbook code
Next, edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand how it can both start and stop your Azure Analysis Services (AAS). It consists of five parts:
  1. Parameters
  2. Log in to Azure
  3. Get current state
  4. Pause or Resume
  5. Logging
Parameters
To pause or resume AAS, the script needs three parameters. The first parameter 'AasAction' is a string that indicates whether you want to stop or start the AAS. The second parameter 'ResourceGroupName' indicates the location (resource group) of your AAS and the last parameter 'AnalysisServerName' is the name of your AAS. There are a couple of validations which you could extend to make your script even more monkey proof (see the sketch after the note below).
Note: if you want to call this script via Azure Data Factory (ADF), then you need to change the parameter part. You can find all the details to do that in our blog posts about Runbook parameters and ADF and using the Webhook activity in ADF. If this is your first time creating a runbook then first try the standard script and then adjust it to your needs.
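As an example of such an extra validation, the sketch below adds a pattern check on the server name to the parameter block. The pattern is an assumption (a lowercase letter followed by lowercase letters and digits), so verify the current AAS naming rules before using it.
# PowerShell code snippet
# Hypothetical extra validation for the AAS server name
[Parameter(Mandatory=$True,Position=2)]
[ValidateLength(1,100)]
[ValidatePattern('^[a-z][a-z0-9]*$')]
[string]$AnalysisServerName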

Log in to Azure
This is a standard piece of code that you will see in all of our examples. Please read our blog post about the Azure Run as Account for more detailed information.

Get current state
This piece of code tests whether it can find the AAS and gets its current state. It stores the current state and uses it later on for an extra check when pausing or resuming the AAS.

Pause or Resume
This is the actual code for pausing or resuming the AAS. There is an extra check that compares the current state with the new desired state. It throws an error when you want to pause an AAS that is already paused. You could change that to write a warning instead of an error.
Note: you could also send emails to notify you of any errors.

Logging
The last piece of code is for logging purposes. It shows you that it successfully changed the state of the AAS and how long it took to accomplish that.

# PowerShell code

########################################################
# Parameters
########################################################
[CmdletBinding()]
param(
    [Parameter(Mandatory=$True,Position=0)]
    [ValidateSet('Start','Stop')]
    [string]$AasAction,
    
    [Parameter(Mandatory=$True,Position=1)]
    [ValidateLength(1,100)]
    [string]$ResourceGroupName,

    [Parameter(Mandatory=$True,Position=2)]
    [ValidateLength(1,100)]
    [string]$AnalysisServerName
)

# Keep track of time
$StartDate = (Get-Date)



########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################
 


########################################################
# Getting the AAS for testing and logging purposes
########################################################
$myAzureAnalysisServer = Get-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
if (!$myAzureAnalysisServer)
{
    Write-Error "$($AnalysisServerName) not found in $($ResourceGroupName)"
    return
}
else
{
    Write-Output "Current status of $($AnalysisServerName): $($myAzureAnalysisServer.State)"
}



########################################################
# Pause or Resume AAS
########################################################
# Check for incompatible actions
if (($AasAction -eq "Start" -And $myAzureAnalysisServer.State -eq "Succeeded") -Or ($AasAction -eq "Stop" -And $myAzureAnalysisServer.State -eq "Paused"))
{
    Write-Error "Cannot $($AasAction) $($AnalysisServerName) while the status is $($myAzureAnalysisServer.State)"
    return
}
# Resume Azure Analysis Services
elseif ($AasAction -eq "Start")
{
    Write-Output "Now starting $($AnalysisServerName)"
    $null = Resume-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
}
# Pause Azure Analysis Services
else
{
    Write-Output "Now stopping $($AnalysisServerName)"
    $null = Suspend-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
}



########################################################
# Show when finished
########################################################
$Duration = New-TimeSpan -Start $StartDate -End (Get-Date)
Write-Output "Done in $([int]$Duration.TotalMinutes) minute(s) and $([int]$Duration.Seconds) second(s)"

5) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that you need to fill in the parameters and hit the Start button to execute the script.
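Once the runbook is published (see the next step) you can also start it from PowerShell, which is handy for a quick test outside the portal. A minimal sketch with example names:
# PowerShell code snippet
# Start the runbook with parameters (all names are examples)
Start-AzAutomationRunbook -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -Name "StartStopAas" `
    -Parameters @{AasAction="Stop"; ResourceGroupName="bitools"; AnalysisServerName="bitoolsaas"}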
Testing your script

6) Scheduling Runbook
To schedule your runbook in Azure Automation you first need to publish it via the Runbook editor. After it has been published you can add a schedule to this runbook.
  • Edit the script in the runbook editor
  • Click on publish (the editor will close and you will be redirected to the overview page)
  • In the overview page click on Link to schedule
  • In the Schedule menu you can select an existing schedule or create a new one
  • In the Parameter menu you can provide the value for the parameters
Add schedule to runbook

Note: If you have multiple Azure Analysis Services that you all want to pause/resume at the same time, then you have a slight problem because you cannot reuse a schedule for the same runbook multiple times with different parameters (please upvote or add a comment). Workarounds:

  1. create multiple identical schedules (ugly but works; see the sketch below)
  2. do everything in one big script (less flexible but works)
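For the first workaround you could at least script the identical schedules instead of clicking them together. A sketch with example names that creates a daily stop schedule for one server and links it to the runbook with its own parameter values:
# PowerShell code snippet
# Create a daily schedule that fires at 19:00 (starting tomorrow)
New-AzAutomationSchedule -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -Name "StopAas-Server1" -StartTime (Get-Date "19:00").AddDays(1) -DayInterval 1 `
    -TimeZone "W. Europe Standard Time"

# Link the runbook to the schedule with the parameters for this specific server
Register-AzAutomationScheduledRunbook -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -RunbookName "StartStopAas" -ScheduleName "StopAas-Server1" `
    -Parameters @{AasAction="Stop"; ResourceGroupName="bitools"; AnalysisServerName="bitoolsaas1"}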
Log of runbook executions

Summary
In this blog post you learned how to schedule a stop and start for your Azure Analysis Services to save money in case you don't need it to be live 24/7. Scheduling is done in Azure Automation, but with some minor changes you can also do it via an ADF pipeline.

Monday, 9 December 2019

SendGrid - Send emails via a Runbook

Case
I want to send email notifications in my Azure Automation runbook when pausing or downscaling fails, but I don't want to create an extra Office365 account for just sending email notifications. How do you send emails in a PowerShell runbook?
Sending free emails in Azure

Solution
Microsoft added the third party service SendGrid to the marketplace under 'Software as a Service (SaaS)', which allows you to send up to 25,000 emails a month for free. Hopefully more than enough for a couple of failure notifications.

In this blog we will show you how to create a SendGrid account in Azure and then show you how to send email notifications via a PowerShell runbook. This could be useful to send notifications when pausing or downscaling an Azure resource in your runbook fails. In another blog post we already showed you how to use SendGrid in Azure LogicApps in combination with Azure Data Factory to send notifications when the ETL process fails.

Part 1: Create SendGrid Account
The first part of this solution is the same as the previous LogicApps post. You can skip it if you already have a SendGrid account.

1) Create new resource
The first step is to create a SendGrid resource which you can find in the Azure marketplace. SendGrid will send the actual emails.
  1. Go to the Azure Portal and create a new resource
  2. Search in the Marketplace for SendGrid or find it under the topic Software as a Service (SaaS)
  3. Select SendGrid and then click on the Create button
  4. Give your new resource a useful name and a secure password
  5. Select the right subscription and Resource Group
  6. Choose the pricing tier you need (F1 is free) and optionally enter a Promotion Code if you expect to send millions of emails a month
  7. Fill out the contact details which will be sent to SendGrid (Twilio) for support reasons
  8. Review (and accept) the legal terms
  9. Click on the create button
Create Azure SendGrid account

After these first steps a SendGrid resource will be created in Azure, but it will also create a SendGrid account at sendgrid.com. You will also receive an email to activate your account on sendgrid.com. Note that the pricing details can be changed in Azure, but all the other (non-Azure) settings can only be edited on sendgrid.com.

2) Create API key
To send emails via SendGrid we first need the API key. This key can only be generated on the sendgrid.com website.
  1. Go to the Azure SendGrid resource and click on the Manage button on the overview page. This will redirect you to the sendgrid.com website
  2. Go to Settings in the left menu and collapse the sub menu items
  3. Go to the API Keys and click on the Create API Key button.
  4. Then enter a name for your API key, choose which permissions you need and click on the Create and View button.
  5. Copy and save this API key in a password manager. This is the only opportunity you get to retrieve this API key
Retrieve API key

The first part is done. You now have successfully obtained the SendGrid API key which we will use in the second part of this blog.


Part 2: Use SendGrid in Automation Runbooks
Now the coding part of the solution where we will use PowerShell code to send an email.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • Next decide whether you need the Azure Run As account. The first code example does not require it, but the second example (where the SendGrid API key is stored in an Azure Key Vault) does need it. So decide and hit the Create button
Create Azure Automation Account

2) Create Runbook
Next we need to create a runbook to host the PowerShell code.
  • Go to the newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button

Create Runbook

3) Edit Runbook code
Next edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code and to think about how to integrate it in your own code.
# PowerShell code

# Hardcode the API key of sendgrid. We need it in the header of the API call
$SENDGRID_API_KEY = "SG.O_g2Bl634FAKEzVOskYInv.8yu_ID2cdxQgoFAKEyG0ByK5xjvPu30DKgOET_n1D7U"

# Create the headers for the API call
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer " + $SENDGRID_API_KEY)
$headers.Add("Content-Type", "application/json")

# Parameters for sending the email
$fromEmailAddress = "noreply@bitools.com"
$destEmailAddress = "joost@bitools.com"
$subject = "Testemail send from runbook via SendGrid"
$content = "Testing 123. This is a test email send via SendGrid"

# Create a JSON message with the parameters from above
$body = @{
personalizations = @(
    @{
        to = @(
                @{
                    email = $destEmailAddress
                }
        )
    }
)
from = @{
    email = $fromEmailAddress
}
subject = $subject
content = @(
    @{
        type = "text/plain"
        value = $content
    }
)
}

# Convert the string into a real JSON-formatted string
# Depth specifies how many levels of contained objects
# are included in the JSON representation. The default
# value is 2
$bodyJson = $body | ConvertTo-Json -Depth 4

# Call the SendGrid RESTful web service and pass the
# headers and json message. More details about the 
# webservice and the format of the JSON message go to
# https://sendgrid.com/docs/api-reference/
$response = Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send -Method Post -Headers $headers -Body $bodyJson

The JSON message that will be sent to the web service looks like this:
{
   "personalizations":[
      {
         "to":[
            {
               "email":"joost@bitools.com"
            }
         ]
      }
   ],
   "from":{
      "email":"noreply@bitools.com"
   },
   "content":[
      {
         "value":"Testing 123. This is a test email sent via SendGrid",
         "type":"text/plain"
      }
   ],
   "subject":"Test email sent from runbook via SendGrid"
}
In the example above the API key is hardcoded, which is avoidable with some extra steps. In a previous blog we showed you how to store a secret in Azure Key Vault and retrieve it with PowerShell. The code example below shows you how to do that, but please read the Key Vault - Runbook post for more details. The first change is logging in to Azure and the second is filling the variable SENDGRID_API_KEY from the Key Vault.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Retrieve the API key of sendgrid from the Key Vault. We need it in the header of the API call
$SENDGRID_API_KEY = (Get-AzKeyVaultSecret -VaultName "MyVault" -Name "SendGrid").SecretValueText

# Create the headers for the API call
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer " + $SENDGRID_API_KEY)
$headers.Add("Content-Type", "application/json")

# Parameters for sending the email
$fromEmailAddress = "noreply@bitools.com"
$destEmailAddress = "joost@bitools.com"
$subject = "Testemail send from runbook via SendGrid"
$content = "Testing 123. This is a test email send via SendGrid"

# Create a JSON message with the parameters from above
$body = @{
personalizations = @(
    @{
        to = @(
                @{
                    email = $destEmailAddress
                }
        )
    }
)
from = @{
    email = $fromEmailAddress
}
subject = $subject
content = @(
    @{
        type = "text/plain"
        value = $content
    }
)
}

# Convert the string into a real JSON-formatted string
# Depth specifies how many levels of contained objects
# are included in the JSON representation. The default
# value is 2
$bodyJson = $body | ConvertTo-Json -Depth 4

# Call the SendGrid RESTful web service and pass the
# headers and json message. More details about the 
# webservice and the format of the JSON message go to
# https://sendgrid.com/docs/api-reference/
$response = Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send -Method Post -Headers $headers -Body $bodyJson

4) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that just hit the Start button to execute the script. The example doesn't have code that writes to the screen, so you can add that yourself or just wait for the email to arrive in your mailbox. Below is an animation of testing another script.
Testing PowerShell code in a Runbook

Summary
In this post we showed you how to use SendGrid in an Azure Automation runbook with PowerShell code. To integrate it in your own code you probably want to create a function for it, like the sketch below. It is advisable to move the SendGrid API key to an Azure Key Vault to keep your code clean of any passwords or keys.
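A minimal sketch of such a function; the function name and parameter set are our own invention, so adjust them to your needs.
# PowerShell code snippet
# Hypothetical helper function that wraps the SendGrid mail/send call
function Send-SendGridEmail
{
    Param
    (
        [Parameter(Mandatory=$true)][string]$ApiKey,
        [Parameter(Mandatory=$true)][string]$To,
        [Parameter(Mandatory=$true)][string]$From,
        [Parameter(Mandatory=$true)][string]$Subject,
        [Parameter(Mandatory=$true)][string]$Content
    )

    # Headers with the API key
    $headers = @{
        "Authorization" = "Bearer " + $ApiKey
        "Content-Type"  = "application/json"
    }

    # Same JSON message as in the examples above
    $bodyJson = @{
        personalizations = @(@{to = @(@{email = $To})})
        from             = @{email = $From}
        subject          = $Subject
        content          = @(@{type = "text/plain"; value = $Content})
    } | ConvertTo-Json -Depth 4

    Invoke-RestMethod -Uri "https://api.sendgrid.com/v3/mail/send" -Method Post -Headers $headers -Body $bodyJson
}

# Example call
Send-SendGridEmail -ApiKey $SENDGRID_API_KEY -To "joost@bitools.com" -From "noreply@bitools.com" `
    -Subject "Runbook failure" -Content "Pausing the service failed"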

If you have any suggestions to use SendGrid in other Azure services then please leave a comment below.

Sunday, 8 December 2019

Azure Data Factory Webhook activity calls Runbook

Case
The Azure Data Factory (ADF) Web activity calls my runbooks asynchronously (execute and don't wait for an answer). Is there a way to do this synchronously and even provide feedback to ADF in case it fails?
ADF Webhook activity

Solution
This year Microsoft added the Webhook activity to ADF. It works nearly the same as the existing Web activity, but it calls the webhook of your runbook synchronously and expects a callback to let ADF know it is ready. With some additional code you can adjust your runbook and make it work for both the Web and the Webhook activity.

In a previous post we showed you how to retrieve parameters from ADF to an Azure Automation runbook. This code should be somewhere in the top of your code.
# Parameters
Param
(
    # Get webhook data
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)
 
# Get all parameters from body 
# (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

We can reuse this code to also extract the parameter 'callBackUri'. This parameter is only provided by the Webhook activity and not by the Web activity, so by checking whether this parameter is present you can distinguish a call from the Webhook activity from a call from the Web activity.
# Read and store the callBackUri which
# is only provided by the Webhook activity
If ($Parameters.callBackUri)
{
    $callBackUri = $Parameters.callBackUri
}

The Webhook activity waits for a callback that tells it the script is ready, so we need to add some code at the bottom of the script. If you forget this step the Webhook activity waits until the timeout occurs and then fails your pipeline. So this step is mandatory for the Webhook activity, but cannot be used for the Web activity since it doesn't provide a callback URI.
# Let the Webhook activity know that the script is ready
# Otherwise it waits until its timeout
If ($callBackUri)
{
    Invoke-WebRequest -Uri $callBackUri -Method Post
}
By default the Webhook activity has a timeout of 10 minutes. For scripts running longer than 10 minutes (like resuming the Integration Runtime in ADF) you should raise this number.
Timeout of 10 minutes

Another handy option is to provide some feedback to the Webhook activity, for example when something fails in your script or when expected parameters have not been provided. This saves you time searching the runbook activity log for what went wrong. You have to provide the error details via a JSON message:
# Let the Webhook activity know that the script is ready
# Otherwise it waits until its timeout
If ($callBackUri)
{
    # Create an error message
    # Message and statuscode will show up in ADF
    $body = [ordered]@{
        error = @{
            ErrorCode = "ParameterError"
            Message = "Required parameters were not provided"
        }
        statusCode = "404"
    }

    # Convert the string into a real JSON-formatted string
    $bodyJson = $body | ConvertTo-Json

    # Call back with error message in body and a JSON contenttype
    Invoke-WebRequest -Uri $callBackUri -Method Post -Body $bodyJson -ContentType "application/json"
}

To enable this functionality you have to turn on the option 'Report status on callback' in the ADF Webhook activity. ADF will ignore your message when you do not enable it.
Report status on callback

The error message and status code will show up in the ADF pipeline monitor.
Runbook error message shown in ADF

You can provide various HTTP status codes in the JSON message to give your error message a certain category. For example:
  • 401 - Unauthorized
  • 403 - Forbidden
  • 404 - Not Found
  • 429 - Too Many Requests
  • 500 - Internal Server Error
  • 503 - Service Unavailable

A complete list of HTTP status codes can be found here. All status codes from 400 and above are errors.

Summary
In this post we showed you how to benefit from the ADF Webhook activity in an Azure Automation runbook while still keeping the runbook callable via the ADF Web activity. The benefits are calling your runbook synchronously and providing error details in case of a failure.

Thursday, 5 December 2019

Use Azure Key Vault for Automation Runbooks

Case
I created an Azure Automation Runbook to manage various Azure resources, but I don't want to store keys or passwords in my code. What is the secure way to manage keys and passwords in a Runbook?
Azure Automation Runbook with Azure Key Vault

Solution
The answer is to use the Azure Key Vault. You store your secrets in the Key Vault and then give the account running the Runbook the appropriate rights to retrieve them with a script.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.KeyVault
Before we start writing code we need to add a PowerShell module called Az.KeyVault. This module contains the cmdlets we need in our code to retrieve secrets from the Azure Key Vault. But first we need to add Az.Accounts because Az.KeyVault depends on it.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzKeyVaultSecret : The term 'Get-AzKeyVaultSecret' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.KeyVault (but wait until Az.Accounts is actually imported)
  • Click on Az.KeyVault in the result and import this module
Adding a new module to your Automation Account

3) Get account name
We need to determine the name of the Azure Run As account because we have to give this account access to the secrets inside the Azure Key Vault. The name is usually the name of the Automation Account followed by a string of random characters.
  • Locate Connections in the left menu of the Automation account and click on it.
  • Click on the row containing 'AzureRunAsConnection'
  • A new pane with details will appear. You need to remember the Guid of the ApplicationId.

Find ApplicationId

  • Search for App registrations in the Azure search bar on the top of the Azure dashboard
  • Search for the App registration with the same Application (client) ID as above
  • Remember the Display name of this account for later on
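You could also look up the display name with PowerShell instead of clicking through the portal. A sketch with example names; the FieldDefinitionValues property of the connection should contain the ApplicationId:
# PowerShell code snippet
# Get the ApplicationId from the Run As connection (names are examples)
$conn = Get-AzAutomationConnection -ResourceGroupName "bitools" `
    -AutomationAccountName "bitools-automation" -Name "AzureRunAsConnection"
$applicationId = $conn.FieldDefinitionValues.ApplicationId

# Find the matching service principal and show its display name
(Get-AzADServicePrincipal -ApplicationId $applicationId).DisplayName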
Find account name with ApplicationId

4) Create Key Vault
Next step is creating a key vault. If you already have one then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for key vault
  • Select Key Vault and click on Create
  • Select your Subscription and Resource Group 
  • Choose a useful name for the Key Vault
  • Select your Region (the same as your other resources)
  • And choose the Pricing tier. We will use Standard for this demo
Creating a new Key Vault

5) Add Secret
Now that we have a Key Vault we have to put in a password for testing. The Key Vault stores three types of items: Secrets, Keys and Certificates. For passwords, account keys or connection strings you need a Secret.
  • Go to the newly created Azure Key Vault
  • Go to Secrets in the left menu
  • Click on the Generate/Import button to create a new secret
  • Choose Manual in the upload options
  • Enter a recognizable and descriptive name. You will use this name in the runbook code
  • Next step is to add the secret value which we will retrieve in the code
  • Keep Content type Empty and don't use the activation or expiration date for this example
  • Make sure the secret is enabled and then click on the Create button
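Adding the secret can also be done with a single PowerShell command instead of via the portal. A minimal sketch with example names and value:
# PowerShell code snippet
# Store a secret in the Key Vault (vault, name and value are examples)
$SecretValue = ConvertTo-SecureString -String "S3cr3tP@ssw0rd" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "bitools" -Name "MyPassword" -SecretValue $SecretValue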
Adding a new secret to Azure Key Vault

6) Access policies
By default the Run As Account can only see the Azure Key Vault, but it can't read its content. In step 3 you retrieved the name of the Azure Run As Account. Now we will give it access to read the secrets.
  1. Go to Access policies in the left menu
  2. Click on the blue + Add Access Policy link
  3. Leave Configure from template empty
  4. Leave Key permissions unselected (we will only use a Secret for this example)
  5. Select Get for Secret permissions
  6. Leave Certificate permissions unselected (we will only use a Secret for this example)
  7. Click on the field of Select principal to find the account from step 3
  8. Leave Authorized application unchanged
  9. Click on Add and a new Application will appear in the list of Current Access Policies
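The same access policy can also be set with one PowerShell command. A sketch, assuming the Application (client) ID you looked up in step 3:
# PowerShell code snippet
# Give the Run As account permission to read (Get) secrets
# (replace the Guid with your own ApplicationId, vault name is an example)
Set-AzKeyVaultAccessPolicy -VaultName "bitools" `
    -ServicePrincipalName "00000000-0000-0000-0000-000000000000" `
    -PermissionsToSecrets Get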
Add Access policy

7) Create Runbook
Now we are finally ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button

Create Runbook

8) Edit Runbook code
Next open the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code and to make sure we don't steal your data. The first part, logging in to Azure, is described in this previous post.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Variables for retrieving the correct secret from the correct vault
$VaultName = "bitools"
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"

Hardcoding the Key Vault name is not very flexible, but you can also pass it from Azure Data Factory by changing it into a parameter. This is especially useful when you have multiple environments (Development, Test, Acceptance and Production).
# PowerShell code

########################################################
# PARAMETERS (must be the first statement in the script)
########################################################
Param
(
    # Get webhook data (passed from Data Factory)
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Variables for retrieving the correct secret from the correct vault
$VaultName = $Parameters.VaultName
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"
The parameters that will be provided by Azure Data Factory (ADF) via a JSON message will look like this:
{
"VaultName":"bitools"
}

9) Testing
Testing the functionality of your code is easiest if you still have the hardcoded parameters. Then you can just use the Test pane in the Runbook editor like in the animation below. If you want to test it with the parameters from ADF then you first need to create a webhook (see the sketch below) and then create and run an ADF pipeline to test your code.
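Creating that webhook can also be scripted. A sketch with example names; note that the webhook URI is only returned once, so store it somewhere safe:
# PowerShell code snippet
# Create a webhook for the runbook (all names are examples)
$webhook = New-AzAutomationWebhook -ResourceGroupName "bitools" -AutomationAccountName "bitools-automation" `
    -RunbookName "GetKeyVaultSecret" -Name "GetKeyVaultSecretHook" `
    -IsEnabled $true -ExpiryTime (Get-Date).AddYears(1) -Force

# Show the URI that you paste in the ADF Web(hook) activity
$webhook.WebhookURI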
Testing the code

Summary
In this post you learned how to use the Azure Key Vault in an Azure Automation Runbook. The code to retrieve the secret is very simple (only one command), but giving the Automation Run As account the correct access requires a bit more research/fiddling around. In future posts we will show you how to use Azure Key Vault in combination with other tools like Azure Data Factory, Azure Databricks or Azure Functions.

Sunday, 1 December 2019

SendGrid - Sending emails via ADF & LogicApps

Case
I want to send email notifications when my ETL process fails, but I don't want to create an extra Office365 account for just sending email notifications.
Sending free emails in Azure

Solution
Microsoft added the third party service SendGrid to the marketplace under 'Software as a Service (SaaS)', which allows you to send up to 25,000 emails a month for free. Hopefully more than enough for a couple of ETL failure notifications.

In this blog we will show you how to create a SendGrid account in Azure and then show you how to send email notifications via ADF/LogicApps/SendGrid. In a second post we will show you that you can also use this in an Azure Automation Runbook to send notifications about the up- or downscaling and the pausing or resuming of services.


Part 1: Create SendGrid Account

1) Create new resource
The first step is to create a SendGrid resource which you can find in the Azure marketplace.
  1. Go to the Azure Portal and create a new resource
  2. Search in the Marketplace for SendGrid or find it under the topic Software as a Service (SaaS)
  3. Select SendGrid and then click on the Create button
  4. Give your new resource a useful name and a secure password
  5. Select the right subscription and Resource Group
  6. Choose the pricing tier you need (F1 is free) and optionally enter a Promotion Code if you expect to send millions of emails a month
  7. Fill out the contact details which will be sent to SendGrid (Twilio) for support reasons
  8. Review (and accept) the legal terms
  9. Click on the create button
Create Azure SendGrid account

After these first steps a SendGrid resource will be created in Azure, but it will also create a SendGrid account at sendgrid.com. You will also receive an email to activate your account on sendgrid.com. Note that the pricing details can be changed in Azure, but all the other (non-Azure) settings can only be edited on sendgrid.com.

2) Create API key
To send emails via SendGrid we first need the API key. This key can only be generated on the sendgrid.com website.
  1. Go to the Azure SendGrid resource and click on the Manage button on the overview page. This will redirect you to the sendgrid.com website
  2. Go to Settings in the left menu and collapse the sub menu items
  3. Go to the API Keys and click on the Create API Key button.
  4. Then enter a name for your API key, choose which permissions you need and click on the Create and View button.
  5. Copy and save this API key in a password manager. This is the only opportunity you get to retrieve this API key
Retrieve API key

The first part is done. You now have successfully obtained the SendGrid API key which we will use in the second part of this blog.


Part 2: Use SendGrid in LogicApps
In this previous post we showed you how to send an email notification in Azure Data Factory (ADF) via LogicApps. That solution exists of two parts: ADF (error handling) and Logic Apps (send email). For this post we will only slightly change the Logic Apps part of that solution. Instead of using the Office 365 Outlook - Send an email action we will use the SendGrid - Send email action.
Replace Office 365 Outlook by SendGrid

The communication between these two Azure resources is done with a JSON message via an HTTP (POST) request. The JSON message contains the name of the Data Factory and the pipeline that failed, an error message and a 'send to' email address:
{
    "properties": {
        "DataFactoryName": {
            "type": "string"
        },
        "PipelineName": {
            "type": "string"
        },
        "ErrorMessage": {
            "type": "string"
        },
        "EmailTo": {
            "type": "string"
        }
    },
    "type": "object"
}

1) Create new Logic App
Let's create an Azure Logic App that receives parameters from ADF and sends an email using these parameters.
  1. Click on Create a resource on the Azure portal, search for Logic App, select it and hit the Create button. You can also locate Logic App under Integration in the Azure Marketplace.
  2. Pick a descriptive name like "ADF-Notifications"
  3. Then select the Subscription, Resource Group and Location
  4. Now hit the Create button and wait for it to be generated
Create new Logic App

2) HTTP Trigger
Next step is to edit the newly created Logic App and to pick a trigger. This is the event that starts this Logic App to send emails. To call it from the ADF Web(hook) activity we need an HTTP trigger.
  1. Go to your new Logic App and edit it
  2. Pick the HTTP trigger When a HTTP request is received
  3. Edit the trigger and copy and paste the JSON message from above into the big text-area. 
Adding the HTTP trigger

Adding the JSON message to the HTTP trigger will generate new variables for the next action that will be used to generate the email message.

3) Send an email
This next step deviates from the previous post and adds the SendGrid action to send emails. At the moment of writing there is a v3 and a v4 preview version. For important processes you probably should not choose the preview version.
  1. Add a New Step
  2. Search for SendGrid and select it
  3. Choose the Send email action
  4. Give the connection a name, paste the SendGrid API key and click on Create
  5. The From-field is a hardcoded email address in this example
  6. The To-field comes from the variable 'EmailTo' generated by the JSON message in the HTTP trigger
  7. The Subject is a concatenation of the Data Factory name and the pipeline name
  8. Body comes from the 'ErrorMessage' variable
  9. Save the Logic App. The first save action will reveal the HTTP post URL.
Note that the top of the dynamic content window didn't appear on the screen. Collapsing the HTTP trigger above did the trick.
Adding SendGrid - Send email action

Tip: keep the message setup dynamic with variables which allows you to use it for all your ADF pipelines.

4) Copy URL from HTTP trigger
The Logic App is ready. Click on the HTTP trigger and copy the URL. We need this in ADF to trigger the Logic App.
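Before you wire it into ADF you can verify the Logic App from PowerShell by posting a test message to that URL. A sketch with placeholder values (paste your own URL):
# PowerShell code snippet
# Post a test message to the Logic App HTTP trigger
$body = @{
    DataFactoryName = "MyDataFactory"
    PipelineName    = "MyPipeline"
    ErrorMessage    = "Test error message"
    EmailTo         = "joost@bitools.com"
} | ConvertTo-Json

# Replace the Uri with the URL copied from the HTTP trigger
Invoke-RestMethod -Uri "https://prod-00.westeurope.logic.azure.com/workflows/..." `
    -Method Post -Body $body -ContentType "application/json"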
Copy the URL for ADF

Part 3: Setup Data Factory
Next step is adding a Web activity in the ADF pipeline to call the Logic App from above that sends the email. Since nothing changed on the ADF part of the solution since last time, you can continue with the ADF part in our previous post.

Summary
In this post we showed you how to use SendGrid in LogicApps to send you an email notification in case of a failing pipeline in Azure Data Factory. This saves you a daily login to the Azure portal to check the pipeline monitor. An additional advantage of using SendGrid over an Office 365 account like in the previous post is that it is free as long as you don't send more than 25,000 emails a month, and you don't have to ask the administrator for an Office 365 account.

In a second blog post about SendGrid we will show you how to use it in an Azure Automation Runbook. This allows you to send notifications with PowerShell when for example your runbook fails.

Tuesday, 26 November 2019

Archiving within a Blob Storage container

Case
New files are delivered to my Blob Storage container each day. After processing them I want to archive them into folders with a timestamp in the name and after x days delete them. Which Azure resource can I use for this task?
Container with archive folders

Solution
One of the services you could use is an Azure Automation runbook with some PowerShell code. Note that Blob Storage containers only have virtual folders, which means that the folder name is stored in the file name. Microsoft Azure Storage Explorer will show them as if they are real folders.

1) Create Automation Account
First we need to create an Azure Automation account to host the runbook with PowerShell code.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • Decide whether you need an Azure Run As account and click on the Create button.
    Note: We don't need it for this specific Runbook, because we will use the access key to access the Blob Storage Container. If you intend to add more runbooks to manage other Azure Resources you should probably enable it.
Create Azure Automation Account

2) Add Module Az.Storage
Before we start writing code we need to add a PowerShell module called Az.Storage. This module contains the cmdlets we need in our code.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
New-AzStorageContext : The term 'New-AzStorageContext' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Storage
  • Click on Az.Storage in the result and import this module
Adding a new module to your Automation Account

3) Create Runbook
Now we need to add a PowerShell Runbook to this new Automation Account.
  • Go to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create Runbook

4) Edit Runbook code
Now first look up the Storage Account Name and Key (1 or 2) of the storage account that you want to clean up with this archive script. In the first example we will hardcode these two parameters in the PowerShell code itself. The second example will show you an alternative.
Lookup Account Name and Key

Next open the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code and to make sure we don't steal your data. If you never want to delete files then just remove the cleanup part starting at the CLEANUP header near the bottom.

Since the storage account container doesn't have real (sub)folders, we will use virtual folders which means the foldername is actually stored in the filename. Therefore we cannot move files to a subfolder, but we can rename them instead. Unfortunately renaming a blob is not a standard method available in the Azure Storage module. Therefore we will use a custom function from Martin Brandl that makes a copy with the new name and deletes the original.

# PowerShell code

##########################################################################
############################### PARAMETERS ###############################
##########################################################################
$StorageAccountName = "bitools"
$StorageAccountKey = "XTZSqCcF7q43SwX3FAKEgG5ezCC3l5jor5gUajoTWnnk4qFAKEk52UuR3lYqw1eaFAKEbQe3M4CpSbGDVnSCKg=="
$ContainerName = "sensordata"
$DaysHistory = 31


##########################################################################
########################## Rename-AzStorageBlob ##########################
##########################################################################
# Custom function because Rename method does not exist in Storage modules
# Function copied from about-azure.com and adjusted from AzureRM to Az
# https://about-azure.com/2018/02/13/rename-azure-storage-blob-using-powershell/

function Rename-AzStorageBlob
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
        [Object]$Blob,
        [Parameter(Mandatory=$true, Position=1)]
        [string]$NewName
    )


  Process {
    $blobCopyAction = Start-AzStorageBlobCopy `
        -ICloudBlob $Blob.ICloudBlob `
        -DestBlob $NewName `
        -Context $Blob.Context `
        -DestContainer $Blob.ICloudBlob.Container.Name
 
    $status = $blobCopyAction | Get-AzStorageBlobCopyState
 
    while ($status.Status -ne 'Success')
    {
        $status = $blobCopyAction | Get-AzStorageBlobCopyState
        Start-Sleep -Milliseconds 50
    }
 
    $Blob | Remove-AzStorageBlob -Force
  }
}


##########################################################################
############################### ARCHIVING ################################
##########################################################################
Write-Output "Start archiving $($ContainerName) in $($StorageAccountName)"

# Get 'context' of the source container
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# Determine prefix of datetime
$Prefix = (get-date).ToString("yyyyMMdd-HHmmssfff")

# Loop through all blobs that are not in a virtual folder
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $StorageContext | where {$_.Name -notlike "*/*" }
foreach ($blob in $blobs)
{
    $NewName = $Prefix + "/" + $blob.name
    Write-Output "Archive $($blob.name) to $($NewName)"

    # Rename blob file.csv => 20190226-123456789/file.csv
    # (forward slash, so it matches the virtual folder pattern used in the cleanup below)
    Get-AzStorageBlob -Container $ContainerName -Context $StorageContext -Blob $blob.name | Rename-AzStorageBlob -NewName $NewName
}


##########################################################################
################################ CLEANUP #################################
##########################################################################
Write-Output "Start cleanup $($ContainerName) in $($StorageAccountName)"

# Determine date in history based on given days
# (yyyyMMdd, to match the first 8 characters of the prefix)
$HistoryLimit = (get-date).AddDays(-$DaysHistory).ToString("yyyyMMdd")

# Get all blobs where the names are at least 19 characters long
# and from that set get the files that match 99999999-999999999/
# example: 20191123-171423559/filename.csv
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $StorageContext | where {$_.Name.Length -gt 19} | where {($_.Name).Substring(0,19) -match "^\d{8}[-]\d{9}/$"}

# Loop through the set with blobs that matches our requirements
# and delete those from the container
foreach ($blob in $blobs)
{
    if ( ($blob.name).Substring(0,8) -lt $HistoryLimit)
    {
        Write-Output "Remove old blob $($blob.name)"
        Remove-AzStorageBlob -Context $StorageContext -Blob $blob.name -Container $ContainerName
    }
}

It is not very flexible or secure to store your Storage Account name and access key in the code itself. So in this second example we will replace them with parameters that can be passed from Azure Data Factory. You can find more details in this blog post. Only the parameter part is different; the rest of the code is unchanged.
# PowerShell code

##########################################################################
############################### PARAMETERS ###############################
##########################################################################
Param
(
    # Get webhook data (passed from Data Factory)
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Get single parameters from the set of parameters
$StorageAccountName = $Parameters.StorageAccountName
$StorageAccountKey = $Parameters.StorageAccountKey
$ContainerName = $Parameters.ContainerName
$DaysHistory = $Parameters.DaysHistory
The parameters that will be provided in Azure Data Factory (ADF) via a JSON message will look like this:
{
"StorageAccountName":"bitools",
"StorageAccountKey":"XTZSqCcF7q43SwX3FAKEgG5ezCC3l5jor5gUajoTWnnk4qFAKEk52UuR3lYqw1eaFAKEbQe3M4CpSbGDVnSCKg==",
"ContainerName":"sensordata",
"DaysHistory":"31"
}

Another alternative, for when you don't want to use ADF, is using the Azure Key Vault to store the key and then use some PowerShell code to retrieve it in your runbook. The code is very simple, but see this post for more details.
# PowerShell code snippet
# Retrieve value from Key Vault
$StorageAccountKey = (Get-AzKeyVaultSecret -VaultName "MyKeyVault" -Name "StorageAccountKey").SecretValueText

5) Testing
Testing the functionality of your code is easiest if you still have the hardcoded parameters. Then you can just use the Test pane in the Runbook editor like in the animation below. If you want to test it with the parameters from ADF then you first need to create a webhook and then create and run an ADF pipeline to test your code.
Testing the code

Summary
In this post we explained how to archive files in a Storage Account container using PowerShell. We also showed how to pass parameters like the account key from ADF instead of hardcoding them. However, we also don't want to store secrets in ADF, so in a next post we will show you how to store the Storage Account key in an Azure Key Vault and how to read that secret within ADF to use it as a parameter for this Runbook.
Passing secrets parameters using KeyVault, DataFactory and a Runbook