
Sunday 8 December 2019

Azure Data Factory Webhook activity calls Runbook

Case
The Azure Data Factory (ADF) Web activity calls my runbooks asynchronously (it executes them and doesn't wait for an answer). Is there a way to do this synchronously and even provide feedback to ADF in case the runbook fails?
ADF Webhook activity

Solution
This year Microsoft added the Webhook activity to ADF. It works nearly the same as the existing Web activity, but it calls the webhook of your runbook synchronously and expects a callback to let ADF know the runbook is ready. With some additional code you can adjust your runbook so it works for both the Web and the Webhook activity.

In a previous post we showed you how to retrieve parameters from ADF in an Azure Automation runbook. This code should be at the top of your script.
# Parameters
Param
(
    # Get webhook data
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)
 
# Get all parameters from body 
# (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

We can reuse this code to also extract the parameter 'callBackUri'. This parameter is only provided by the Webhook activity, not by the Web activity. By checking whether this parameter is present you can distinguish a call from the Webhook activity from a call from the Web activity.
# Read and store the callBackUri which
# is only provided by the Webhook activity
If ($Parameters.callBackUri)
{
    $callBackUri = $Parameters.callBackUri
}

The Webhook activity waits for a callback that tells it the script is ready, so we need to add some code at the bottom of the script. If you forget this step, the Webhook activity waits until its timeout occurs and then fails your pipeline. So this step is mandatory for the Webhook activity, but it cannot be used for the Web activity since that doesn't provide a callback URI.
# Let the Webhook activity know that the script is ready
# Otherwise it waits until its timeout
If ($callBackUri)
{
    Invoke-WebRequest -Uri $callBackUri -Method Post
}
By default the Webhook activity has a timeout of 10 minutes. For scripts that run longer than 10 minutes (like resuming the Integration Runtime in ADF) you should raise this timeout.
Timeout of 10 minutes

Another handy option is to provide some feedback to the Webhook activity, for example when something fails in your script or when expected parameters have not been provided. This saves you time searching through your runbook activity log to find out what went wrong. You provide the error details via a JSON message:
# Let the Webhook activity know that the script is ready
# Otherwise it waits until its timeout
If ($callBackUri)
{
    # Create an error message
    # Message and statuscode will show up in ADF
    $body = [ordered]@{
        error = @{
            ErrorCode = "ParameterError"
            Message = "Required parameters were not provided"
        }
        statusCode = "404"
    }

    # Convert the string into a real JSON-formatted string
    $bodyJson = $body | ConvertTo-Json

    # Call back with error message in body and a JSON contenttype
    Invoke-WebRequest -Uri $callBackUri -Method Post -Body $bodyJson -ContentType "application/json"
}

To enable this functionality you have to turn on the option 'Report status on callback' in the ADF Webhook activity. ADF will ignore your message if you do not enable it.
Report status on callback

The error message and status code will show up in the ADF pipeline monitor.
Runbook error message shown in ADF

You can provide various HTTP status codes in the JSON message to give your error message a certain category. For example:
  • 401 - Unauthorized
  • 403 - Forbidden
  • 404 - Not Found
  • 429 - Too Many Requests
  • 500 - Internal Server Error
  • 503 - Service Unavailable

A complete list of HTTP status codes can be found here. All status codes from 400 and above are errors.
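
The callback can also be used to explicitly report success. Below is a minimal sketch, reusing the $callBackUri variable from the earlier snippets; since status codes from 400 upwards count as errors, a 200 should report success.
# PowerShell code snippet
If ($callBackUri)
{
    # Report success back to the ADF Webhook activity
    $body = @{ statusCode = "200" }
    $bodyJson = $body | ConvertTo-Json

    Invoke-WebRequest -Uri $callBackUri -Method Post -Body $bodyJson -ContentType "application/json"
}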

Summary
In this post we showed you how to benefit from the ADF Webhook activity in an Azure Automation runbook while keeping that runbook callable via the ADF Web activity as well. The benefits are calling your runbook synchronously and getting error details back in case of a failure.

Thursday 5 December 2019

Use Azure Key Vault for Automation Runbooks

Case
I created an Azure Automation Runbook to manage various Azure resources, but I don't want to store keys or passwords in my code. What is a secure way to manage keys and passwords in a Runbook?
Azure Automation Runbook with Azure Key Vault

Solution
The answer is to use Azure Key Vault. You store your secrets in the Key Vault and then give the account running the Runbook the appropriate rights to retrieve them with a script.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.KeyVault
Before we start writing code we need to add a PowerShell module called Az.KeyVault. This module contains methods we need in our code to retrieve secrets from the Azure Key Vault. But first we need to add Az.Accounts because Az.KeyVault depends on it.

If you forget this step you will get error messages while running your code stating that some of your commands are not recognized:
Get-AzKeyVaultSecret : The term 'Get-AzKeyVaultSecret' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.KeyVault (but wait until Az.Accounts is actually imported)
  • Click on Az.KeyVault in the result and import this module
Adding a new module to your Automation Account

3) Get account name
We need to determine the name of the Azure Run As account because we have to give this account access to the secrets inside the Azure Key Vault. The name is usually the name of the Automation Account followed by a string of random characters. You can also look it up with PowerShell, as shown after these steps.
  • Locate Connections in the left menu of the Automation account and click on it.
  • Click on the row containing 'AzureRunAsConnection'
  • A new pane with details will appear. You need to remember the Guid of the ApplicationId.
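
If you prefer scripting over clicking through the portal, you can also resolve the display name from the ApplicationId. A minimal sketch, assuming the Az.Resources module and an authenticated session; the Guid below is a placeholder for the one from the connection pane:
# PowerShell code snippet
# Look up the service principal behind the Run As connection
$ApplicationId = "00000000-0000-0000-0000-000000000000"
(Get-AzADServicePrincipal -ApplicationId $ApplicationId).DisplayName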

Find ApplicationId

  • Search for App registrations in the Azure search bar on the top of the Azure dashboard
  • Search for the App registration with the same Application (client) ID as above
  • Remember the Display name of this account for later on
Find account name with ApplicationId

4) Create Key Vault
Next step is creating a key vault. If you already have one then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for key vault
  • Select Key Vault and click on Create
  • Select your Subscription and Resource Group 
  • Choose a useful name for the Key Vault
  • Select your Region (the same as your other resources)
  • And choose the Pricing tier. We will use Standard for this demo
Creating a new Key Vault

5) Add Secret
Now that we have a Key Vault we have to put in a password for testing. The Key Vault stores three types of items: Secrets, Keys and Certificates. For passwords, account keys or connection strings you need a Secret. A scripted alternative for this step is shown after the list below.
  • Go to the newly created Azure Key Vault
  • Go to Secrets in the left menu
  • Click on the Generate/Import button to create a new secret
  • Choose Manual in the upload options
  • Enter a recognizable and descriptive name. You will use this name in the runbook code
  • Next step is to add the secret value which we will retrieve in the code
  • Keep Content type Empty and don't use the activation or expiration date for this example
  • Make sure the secret is enabled and then click on the Create button
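
If you prefer scripting this step, the same secret can be created with PowerShell. A minimal sketch, assuming the Az.KeyVault module and reusing the vault and secret names from this post; the value is of course a placeholder:
# PowerShell code snippet
# Scripted alternative for adding the secret (the value must be a SecureString)
$SecretValue = ConvertTo-SecureString -String "MySecretPassword" -AsPlainText -Force
$null = Set-AzKeyVaultSecret -VaultName "bitools" -Name "MyPassword" -SecretValue $SecretValue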
Adding a new secret to Azure Key Vault

6) Access policies
By default the Run As Account can only see the Azure Key Vault, but it can't read its content. In step 3 you retrieved the name of the Azure Run As Account. Now we will give it access to read the secrets (a scripted alternative follows the steps below).
  1. Go to Access policies in the left menu
  2. Click on the blue + Add Access Policy link
  3. Leave Configure from template empty
  4. Leave Key permissions unselected (we will only use a Secret for this example)
  5. Select Get for Secret permissions
  6. Leave Certificate permissions unselected (we will only use a Secret for this example)
  7. Click on the field of Select principal to find the account from step 3
  8. Leave Authorized application unchanged
  9. Click on Add and a new Application will appear in the list of Current Access Policies
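
The same access policy can be set with PowerShell. A minimal sketch, assuming the Az.KeyVault module; -ServicePrincipalName accepts the ApplicationId Guid from step 3 (placeholder below):
# PowerShell code snippet
# Give the Run As account Get permission on secrets
Set-AzKeyVaultAccessPolicy -VaultName "bitools" `
    -ServicePrincipalName "00000000-0000-0000-0000-000000000000" `
    -PermissionsToSecrets Get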
Add Access policy

7) Create Runbook
Now we are finally ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button

Create Runbook

8) Edit Runbook code
Next open the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand it and to make sure we don't steal your data. The first part, logging in to Azure, is described in this previous post.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Variables for retrieving the correct secret from the correct vault
$VaultName = "bitools"
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"

Hardcoding the Key Vault name is not very flexible; you could instead pass it from Azure Data Factory as a parameter. This is especially useful when you have multiple environments (Development, Test, Acceptance and Production). Note that the Param block must then be the very first statement in the script:
# PowerShell code
########################################################
# PARAMETERS (must be the first statement in the script)
########################################################
Param
(
    # Get webhook data
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Variables for retrieving the correct secret from the correct vault
$VaultName = $Parameters.VaultName
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"
The parameter that will be provided by Azure Data Factory (ADF) via a JSON message looks like this:
{
"VaultName":"bitools"
}

9) Testing
Testing the functionality of your code is easiest while you still have the hardcoded parameters. Then you can just use the Test pane in the Runbook editor, as in the animation below. If you want to test it with the parameters from ADF, you first need to create a webhook and then create and run an ADF pipeline to test your code.
Testing the code

Summary
In this post you learned how to use Azure Key Vault in an Azure Automation Runbook. The code to retrieve the secret is very simple (only one command), but giving the Automation Run As account the correct access requires a bit more research/fiddling around. In future posts we will show you how to use Azure Key Vault in combination with other tools like Azure Data Factory, Azure Databricks or Azure Functions.

Tuesday 26 November 2019

Archiving within a Blob Storage container

Case
New files are delivered to my Blob Storage container each day. After processing them I want to archive them into folders with a timestamp in the name and after x days delete them. Which Azure resource can I use for this task?
Container with archive folders

Solution
One of the services you could use is an Azure Automation runbook with some PowerShell code. Note that Blob Storage containers only have virtual folders, which means the folder name is stored in the file name. For example, a blob named 20191126-093000000/sensor.csv shows up in Microsoft Azure Storage Explorer as a file sensor.csv inside a folder 20191126-093000000, as if it were a real folder.

1) Create Automation Account
First we need to create an Azure Automation account to host the runbook with PowerShell code.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • Decide whether you need an Azure Run As account and click on the Create button.
    Note: We don't need it for this specific Runbook, because we will use the access key to access the Blob Storage Container. If you intend to add more runbooks to manage other Azure Resources you should probably enable it.
Create Azure Automation Account

2) Add Module Az.Storage
Before we start writing code we need to add a PowerShell module called Az.Storage. This module contains methods we need in our code.

If you forget this step you will get error messages while running your code stating that some of your commands are not recognized:
New-AzStorageContext : The term 'New-AzStorageContext' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Storage
  • Click on Az.Storage in the result and import this module
Adding a new module to your Automation Account

3) Create Runbook
Now we need to add a PowerShell Runbook to this new Automation Account.
  • Go to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create Runbook

4) Edit Runbook code
First look up the Storage Account name and key (1 or 2) of the storage account that you want to clean up with this archive script. In the first example we will hardcode these two parameters in the PowerShell code itself; the second example will show you an alternative.
Lookup Account Name and Key

Next open the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand it and to make sure we don't steal your data. If you never want to delete files, just remove the cleanup section at the bottom of the script.

Since a storage account container doesn't have real (sub)folders, we will use virtual folders, which means the folder name is actually stored in the file name. Therefore we cannot move files to a subfolder, but we can rename them instead. Unfortunately renaming a blob is not a standard method in the Azure Storage module, so we will use a custom function from Martin Brandl that makes a copy with the new name and deletes the original.

# PowerShell code

##########################################################################
############################### PARAMETERS ###############################
##########################################################################
$StorageAccountName = "bitools"
$StorageAccountKey = "XTZSqCcF7q43SwX3FAKEgG5ezCC3l5jor5gUajoTWnnk4qFAKEk52UuR3lYqw1eaFAKEbQe3M4CpSbGDVnSCKg=="
$ContainerName = "sensordata"
$DaysHistory = 31


##########################################################################
########################## Rename-AzStorageBlob ##########################
##########################################################################
# Custom function because Rename method does not exist in Storage modules
# Function copied from about-azure.com and adjusted from AzureRM to Az
# https://about-azure.com/2018/02/13/rename-azure-storage-blob-using-powershell/

function Rename-AzStorageBlob
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
        [Object]$Blob,
        [Parameter(Mandatory=$true, Position=1)]
        [string]$NewName
    )


  Process {
    $blobCopyAction = Start-AzStorageBlobCopy `
        -ICloudBlob $Blob.ICloudBlob `
        -DestBlob $NewName `
        -Context $Blob.Context `
        -DestContainer $Blob.ICloudBlob.Container.Name
 
    $status = $blobCopyAction | Get-AzStorageBlobCopyState
 
    while ($status.Status -ne 'Success')
    {
        $status = $blobCopyAction | Get-AzStorageBlobCopyState
        Start-Sleep -Milliseconds 50
    }
 
    $Blob | Remove-AzStorageBlob -Force
  }
}


##########################################################################
############################### ARCHIVING ################################
##########################################################################
Write-Output "Start archiving $($ContainerName) in $($StorageAccountName)"

# Get 'context' of the source container
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# Determine prefix of datetime
$Prefix = (get-date).ToString("yyyyMMdd-HHmmssfff")

# Loop through all blobs that are not in a virtual folder
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $StorageContext | where {$_.Name -notlike "*/*" }
foreach ($blob in $blobs)
{
    $NewName = $Prefix + "/" + $blob.name
    Write-Output "Archive $($blob.name) to $($NewName)"

    # Rename blob file.csv => 20191126-171423559/file.csv
    Get-AzStorageBlob -Container $ContainerName -Context $StorageContext -Blob $blob.name | Rename-AzStorageBlob -NewName $NewName
}


##########################################################################
################################ CLEANUP #################################
##########################################################################
Write-Output "Start cleanup $($ContainerName) in $($StorageAccountName)"

# Determine the date limit based on the given days
# (format yyyyMMdd, to match the first 8 characters of the blob prefix)
$HistoryLimit = (get-date).AddDays(-$DaysHistory).ToString("yyyyMMdd")

# Get all blobs whose names are longer than 19 characters
# and whose first 19 characters match the pattern 99999999-999999999/
# example: 20191123-171423559/filename.csv
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $StorageContext | where {$_.Name.Length -gt 19} | where {($_.Name).Substring(0,19) -match "^\d{8}[-]\d{9}/$"}

# Loop through the set with blobs that matches our requirements
# and delete those from the container
foreach ($blob in $blobs)
{
    if ( ($blob.name).Substring(0,8) -lt $HistoryLimit)
    {
        Write-Output "Remove old blob $($blob.name)"
        Remove-AzStorageBlob -Context $StorageContext -Blob $blob.name -Container $ContainerName
    }
}

It is neither flexible nor secure to store your Storage Account name and access key in the code itself. So in this second example we replace them with parameters that can be passed from Azure Data Factory. You can find more details in this blog post. Only the parameter part is different; the rest of the code is unchanged.
# PowerShell code

##########################################################################
############################### PARAMETERS ###############################
##########################################################################
Param
(
    # Get webhook data
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Get single parameters from the set of parameters
$StorageAccountName = $Parameters.StorageAccountName
$StorageAccountKey = $Parameters.StorageAccountKey
$ContainerName = $Parameters.ContainerName
$DaysHistory = $Parameters.DaysHistory
The parameters that will be provided in Azure Data Factory (ADF) via a JSON message will look like this:
{
    "StorageAccountName":"BiTools",
    "StorageAccountKey":"XTZSqCcF7q43SwX3FAKEgG5ezCC3l5jor5gUajoTWnnk4qFAKEk52UuR3lYqw1eaFAKEbQe3M4CpSbGDVnSCKg==",
    "ContainerName":"SensorData",
    "DaysHistory":"31"
}

Another alternative, for when you don't want to use ADF, is to store the key in Azure Key Vault and retrieve it with some PowerShell code in your runbook. The code is very simple, but see this post for more details.
# PowerShell code snippet
# Retrieve value from Key Vault
$StorageAccountKey = (Get-AzKeyVaultSecret -VaultName "MyKeyVault" -Name "StorageAccountKey").SecretValueText

5) Testing
Testing the functionality of your code is easiest while you still have the hardcoded parameters. Then you can just use the Test pane in the Runbook editor, as in the animation below. If you want to test it with the parameters from ADF, you first need to create a webhook and then create and run an ADF pipeline to test your code.
Testing the code

Summary
In this post we explained how to archive files in a Storage Account container using PowerShell. We also showed how to pass parameters like the account key from ADF instead of hardcoding them. However, we don't want to store secrets in ADF either. In a next post we will show you how to store the Storage Account key in an Azure Key Vault and how to read that secret within ADF to use it as a parameter for this Runbook.
Passing secrets parameters using KeyVault, DataFactory and a Runbook

Monday 18 November 2019

Runbook snack: read parameters from Data Factory

Case
I want to pass parameters from an Azure Data Factory pipeline web(hook) activity to my Azure Automation Runbook. How do I do that?
Supplying parameters via the Body property in Web(hook) activity

Solution
You cannot pass a single parameter and then read that particular parameter with PowerShell. Instead you have to read all the data coming from the webhook as one big object parameter. That object contains the Body property, which in turn contains a JSON message with the parameter names and values.

1) JSON Message
The parameter 'ContainerName' will be provided in Azure Data Factory (ADF) via a JSON message that looks like this:
{"ContainerName":"OfficeData"}
2) Coding
In an Azure Automation Runbook you can start your code with the following snippet, which retrieves the values from the JSON message sent by ADF. In this example it gets the name of a Blob Storage container.
# PowerShell code

# Parameters
Param
(
    # Get webhook data via webhook
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Get single parameter from the set of parameters
$ContainerName = $Parameters.ContainerName

# Write the value of the parameter
Write-Output "The value of the container parameter is $($ContainerName)"

3) Adding Webhook
Now that the code is ready, you have to create a webhook for ADF. Make sure to choose an appropriate expiration date; by default it is only valid for one year. Also make sure to copy the URL, because you can only see and copy it once. You need this URL in ADF for the Web(hook) activity. Don't share the URL, because it is a URL and password in one. You can also create the webhook with PowerShell, as sketched below.
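
A minimal sketch for creating the webhook, with hypothetical resource names, assuming the Az.Automation module; note that the URI is only returned at creation time:
# PowerShell code snippet
$Webhook = New-AzAutomationWebhook `
    -Name "MyAdfWebhook" `
    -RunbookName "MyRunbook" `
    -IsEnabled $True `
    -ExpiryTime (Get-Date).AddYears(1) `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Force

# Store this value somewhere safe; you cannot retrieve it later
Write-Output $Webhook.WebhookURI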
Adding a Webhook to a Runbook

4) Azure Data Factory
In the pipeline of ADF you can use the Web(hook) activity to call the webhook and use a JSON message in the Body property to provide values for the parameters. The URL is the webhook URL that you copied in the previous step. For Method select "POST" and in the Body enter the JSON message from step 1. You can also test the webhook without ADF, as shown below.
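
A minimal sketch for such a test call from any PowerShell session; the URL is a placeholder for the webhook URL you copied in step 3:
# PowerShell code snippet
$WebhookUrl = "https://..."   # paste your webhook URL here
Invoke-WebRequest -Uri $WebhookUrl -Method Post -Body '{"ContainerName":"OfficeData"}'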
Supplying parameters via the Body property in Web(hook) activity

Conclusion
Perhaps a bit strange, but this seems to be the only way when using webhooks for your runbook and calling them from ADF. As an alternative you could provide the parameter values when you create the webhook instead of supplying them via ADF, but that is not very flexible if you expect a lot of different values. The code for retrieving parameters from the webhook itself looks like this:
# PowerShell code

# Parameters
Param
(
    # Get parameter via webhook
    [Parameter(Mandatory=$False,Position=1)]
    [string] $ContainerName
)

# Write the value of the parameter
Write-Output "The value of the container parameter is $($ContainerName)"

Sunday 17 November 2019

Runbook snack: Using the Azure Run As Connection

Case
I want to manage Azure resources via a runbook (for example upscaling, downscaling, pausing and resuming resources). How do I provide authentication to do this?
Automation Account

Solution
For this we can use an Azure Run As account, which gives the Automation Account the Contributor role at the subscription level. However, this also means that you need to be an Application administrator in Azure Active Directory or an Owner of the subscription to create an Azure Run As account! If you do not have sufficient rights, the Create Azure Run As account option will be grayed out when you create a new Automation Account.
Insufficient permissions

Now you can start your PowerShell runbook with the following code, which allows it to manage the various Azure resources. The first example is for those still using the PowerShell RM modules; the second example is for the newer PowerShell AZ modules.
# PowerShell code
########################################################
# Log in to Azure with RM (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName        
 
    'Log in to Azure...'
    $null = Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# test code getting all resource groups
Get-AzureRmResourceGroup

Nearly the same code, but now with AZ modules.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName        
 
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Test code getting all resource groups
Get-AzResourceGroup

If you used the Create Azure Run As account option while creating the Automation account, the connection name will be AzureRunAsConnection. Otherwise you can find the name under Connections in the Azure Automation Account.
Connections in Azure Automation


Conclusion
This code, in combination with the Azure Run As account, allows you to perform all kinds of operations on Azure resources, like up- and downscaling Azure SQL Databases (see the sketch below) or pausing and resuming Azure Analysis Services. The only problem could be insufficient rights to create the Run As account. In that case you could ask someone with sufficient rights to create the Automation Account with the Azure Run As account option turned on, and then create the runbooks yourself.
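
For example, once the login code above has run, up- or downscaling an Azure SQL Database is a single call. A minimal sketch with hypothetical resource names, assuming the Az.Sql module has been imported into the Automation account:
# PowerShell code snippet
# Scale a database to the S3 tier
$null = Set-AzSqlDatabase `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myserver" `
    -DatabaseName "MyDatabase" `
    -RequestedServiceObjectiveName "S3"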

Note that support for the RM modules will end in December 2020, so don't wait too long to convert your scripts to AZ.

Friday 27 April 2018

Runbook with ADF: Method not found

Case
A little over two weeks ago we created a new Azure Automation account with a PowerShell runbook to pause and resume the ADF Integration Runtime, but got a strange error when running the script:
Method not found

Get-AzureRmDataFactoryV2IntegrationRuntime: Method not found: 'Newtonsoft.Json.Serialization.IAttributeProvider Newtonsoft.Json.Serialization.JsonProperty.get_AttributeProvider()'.

Of course the script did work before, so what had changed since the last time?

Solution
One of the steps to get the pause and resume script working is to add the modules for Azure Data Factory (AzureRM.DataFactoryV2) and Azure Profile (AzureRM.profile). Both were updated on April 10: the ADF module went from 0.5.2 to 0.5.3 and Profile from 4.5.0 to 4.6.0.

Apparently there is a bug: the updated AzureRM.profile module requires a newer version of Newtonsoft.Json which is not yet available in Azure Automation. This means the script will work on your local machine, but not yet in Azure Automation. Microsoft is working on an update.

Meanwhile you can use the following temporary workaround: use the previous version of both modules. First remove the two new modules from your Azure Automation account and then go to AzureRM.profile in the gallery. Scroll down to the Version History list and click on the previous version. Then hit the Deploy to Azure Automation button.
Deploy to Azure Automation

You will be redirected to the Azure portal, where you must select the Automation account to deploy the previous version of the module to. Select the correct Automation account and click on the OK button.
Select Automation account

Repeat this for AzureRM.DataFactoryV2 and then you are good to go.

Summary
There is a bug on which Microsoft is already working. In the meantime just downgrade to the previous versions or wait for an update.

Monday 1 January 2018

Pause everything on your Azure playground - Tags

Case
To prevent unnecessarily high bills because I forgot to turn off services, I want to pause everything in my Azure 'playground' subscription. However, I want to give my co-workers more control to decide which machines and services should not be paused each night. Your current solution works with a centralized exception list that needs to be maintained by someone. Is there an alternative solution?
Pause everything v2

Solution
You should of course make some agreements about being careful with pricey services, but you can support that with a 'simple' technical solution: run a PowerShell script in an Azure Automation Runbook that pauses all often-used services each night. In this version of the script, exceptions are handled with tags that people can add to their own server or service. Here is how you can add a tag to, for example, Azure Analysis Services (a scripted alternative is shown right below).
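
Adding the tag can also be scripted. A minimal sketch with hypothetical resource names, assuming the AzureRM.Resources module; note that Set-AzureRmResource replaces the whole tag set, so we first read the existing tags:
# PowerShell code snippet
# Read the resource and its current tags
$Resource = Get-AzureRmResource -ResourceGroupName "MyResourceGroup" `
    -ResourceName "MyAnalysisServer" `
    -ResourceType "Microsoft.AnalysisServices/servers"
$Tags = $Resource.Tags
if ($null -eq $Tags) { $Tags = @{} }

# Add or overwrite the Environment tag and write the tag set back
$Tags['Environment'] = "Production"
$null = Set-AzureRmResource -ResourceId $Resource.ResourceId -Tag $Tags -Force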
Add tags to your service or server

For this example we will pause the following Azure parts:
  • Azure Virtual Machines (not classics)
  • Azure SQL Data Warehouses
  • Azure Analysis Services
This is not a complete list, but once you understand these three, it should be relatively easy to add more services. Still too hard? Add a comment to suggest more services.

1) Automation Account
First we need an Azure Automation Account to run the Runbook with PowerShell code. If you don't have one or want to create a new one, search for Automation under Monitoring + Management and give it a suitable name like 'maintenance'. Then select your subscription, resource group and location. For this example I will choose West Europe since I'm from the Netherlands. Keep 'Create Azure Run As account' on Yes; we need it in the code. See step 3 for more details.
Azure Automation Account

2) Credentials
Next step is to create Credentials to run this runbook with. This works very similar to the Credentials in SQL Server Management Studio. Go to the Azure Automation Account and click on Credentials in the menu, then click on Add New Credentials. You could just use your own Azure credentials, but the best option is to use a service account with a non-expiring password; otherwise you need to change this regularly. A scripted alternative is shown below.
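
A minimal sketch for creating the credential with hypothetical names, assuming the AzureRM.Automation module; the user name and password are placeholders:
# PowerShell code snippet
# Create the 'Administrator' credential used by the runbook
$Password = ConvertTo-SecureString -String "MyNonExpiringPassword" -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ("serviceaccount@mytenant.onmicrosoft.com", $Password)
$null = New-AzureRmAutomationCredential -AutomationAccountName "maintenance" `
    -Name "Administrator" `
    -Value $Credential `
    -ResourceGroupName "MyResourceGroup"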
Create new credentials

3) Connections
This step is for your information only, to help you understand the code. Under Connections you will find a default connection named 'AzureRunAsConnection' that contains information about the Azure environment, like the tenant id and the subscription id. To prevent hardcoded connection details we will retrieve these fields in the PowerShell code.
Azure Connections

4) Modules
The Azure Analysis Services methods (cmdlets) are in a separate PowerShell module which is not included by default. If you do not add this module you will get errors telling you that the method is not recognized. See below for more details.
The term 'Get-AzureRmAnalysisServicesServer' is not recognized
as the name of a cmdlet, function, script file, or operable program.

Go to the Modules page and check whether you see AzureRM.AnalysisServices in the list. If not then use the 'Browse gallery' button to add it, but first add AzureRM.Profile because the Analysis module will ask for it. Adding the modules could take a few minutes!
Add modules

5) Runbooks
Now it is time to add a new Azure Runbook for the PowerShell code. Click on Runbooks and then add a new runbook (There are also several example runbooks of which AzureAutomationTutorialScript could be useful as an example). Give your new Runbook a suitable name like 'PauseEverything' and choose PowerShell as type.
Add Azure Runbook

6) Edit Script
After clicking Create in the previous step the editor will be opened. When editing an existing Runbook you need to click on the Edit button first. You can copy and paste the code below into your editor. Study the green comments to understand the code. Notice that this version doesn't use Runbook variables.
Edit the PowerShell code

# PowerShell code
# Connect to a connection to get TenantId and SubscriptionId
$Connection = Get-AutomationConnection -Name "AzureRunAsConnection"
$TenantId = $Connection.TenantId
$SubscriptionId = $Connection.SubscriptionId
  
# Get the service principal credentials connected to the automation account. 
$null = $SPCredential = Get-AutomationPSCredential -Name "Administrator"

# Login to Azure ($null is to prevent output, since Out-Null doesn't work in Azure)
Write-Output "Login to Azure using automation account 'Administrator'."
$null = Login-AzureRmAccount -TenantId $TenantId -SubscriptionId $SubscriptionId -Credential $SPCredential
  
# Select the correct subscription
Write-Output "Selecting subscription '$($SubscriptionId)'."
$null = Select-AzureRmSubscription -SubscriptionID $SubscriptionId


################################
# Pause AnalysisServicesServers
################################
Write-Output "Checking Analysis Services Servers"

# Get list of all AnalysisServicesServers that are turned on (ProvisioningState = Succeeded)
# but skip AnalysisServicesServers that have an Environment tag with the value Production
$AnalysisServicesServers = Get-AzureRmAnalysisServicesServer | 
Where-Object {$_.ProvisioningState -eq "Succeeded" -and $_.Tag['Environment'] -ne "Production"}

# Loop through all AnalysisServicesServers to pause them
foreach ($AnalysisServicesServer in $AnalysisServicesServers)
{
    Write-Output "- Pausing Analysis Services Server $($AnalysisServicesServer.Name)"
    $null = Suspend-AzureRmAnalysisServicesServer -Name $AnalysisServicesServer.Name
}


################################
# Pause Virtual Machines
################################
Write-Output "Checking Virtual Machines"

# Get list of all Azure Virtual Machines that are not deallocated (PowerState <> VM deallocated)
# Filtering on tags is not supported for Azure Virtual Machines
$VirtualMachines = Get-AzureRmVM -Status |
Where-Object {$_.PowerState -ne "VM deallocated"} #-and $_.Tag['Environment'] -ne "Production"}

# Loop through all Virtual Machines to pause them
foreach ($VirtualMachine in $VirtualMachines)
{
    # Get-AzureRmVM does not show tags, therefore
    # filtering in Where-Object does not work.
    # Workaround: if statement within the loop
    if ($VirtualMachine.Tags['Environment'] -ne "Production")
    {
        Write-Output "- Deallocating Virtual Machine $($VirtualMachine.Name) "
        $null = Stop-AzureRmVM -ResourceGroupName $VirtualMachine.ResourceGroupName -Name $VirtualMachine.Name -Force 
    }
}
# Note: Classic Virtual machines are excluded with this script because they don't support Tags.
 

################################
# Pause SQL Data Warehouses
################################
Write-Output "Checking SQL Data Warehouses"

# Get list of all Azure SQL Servers
$SqlServers = Get-AzureRmSqlServer

# Loop through all SQL Servers to check if they host a DWH
foreach ($SqlServer in $SqlServers)
{
    # Get list of all SQL Data Warehouses (Edition=DataWarehouse) that are turned on (Status = Online)
    # but skip SQL Data Warehouses that have an Environment tag with the value Production
    $SqlDatabases = Get-AzureRmSqlDatabase -ServerName $SqlServer.ServerName -ResourceGroupName $SqlServer.ResourceGroupName |
    Where-Object {$_.Edition -eq 'DataWarehouse' -and $_.Status -eq 'Online' -and $_.Tag['Environment'] -ne "Production"} 

    # Loop through all SQL Data Warehouses to pause them
    foreach ($SqlDatabase in $SqlDatabases)
    {
        Write-Output "- Pausing SQL Data Warehouse $($SqlDatabase.DatabaseName)"
        $null = Suspend-AzureRmSqlDatabase -DatabaseName $SqlDatabase.DatabaseName -ServerName $SqlServer.ServerName -ResourceGroupName $SqlDatabase.ResourceGroupName
    }
}


Write-Output "Done"

Note 1: This is a very basic script. No error handling has been added. Check the AzureAutomationTutorialScript for an example. Fine-tune it for your own needs.
Note 2: There are often two versions of a method, like Get-AzureRmSqlDatabase and Get-AzureSqlDatabase. Always use the one with "Rm" in it (Resource Manager), because that one is for the new Azure portal. The version without Rm is for the old/classic Azure portal.
Note 3: Because Azure Automation doesn't support Out-Null, I used another trick with $null =. The Write-Outputs are for testing purposes only; nobody sees them when the runbook runs on its schedule.
Note 4: The code for Data Warehouses first loops through the SQL Servers and then through all databases on those servers, filtering on edition 'DataWarehouse'.
Note 5: The method to get Virtual Machines (Get-AzureRmVM) doesn't show tags, so we cannot use the Where-Object filter to filter out certain tags. Workaround: an if-statement within the foreach loop.

7) Testing
You can use the Test Pane menu option in the editor to test your PowerShell scripts. When you click on Run, the script is first queued before it starts. If nothing needs to be paused the script runs in about a minute, but pausing or deallocating items takes several minutes.
Testing the script in the Test Pane

8) Publish
When your script is ready, it is time to publish it. Above the editor click on the Publish button. Confirm overriding any previously published versions.
Publish the Runbook

9) Schedule
And now that we have a working and published Azure Runbook, we need to schedule it. Click on Schedule to create a new schedule for your runbook. For this pause-everything script I created a schedule that runs every day at 2:00 AM. This gives late-working colleagues more than enough time to play with all the Azure stuff before their services are paused. A scripted alternative is shown below.
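
A minimal sketch for scheduling the runbook with hypothetical names, assuming the AzureRM.Automation module:
# PowerShell code snippet
# Create a daily schedule starting tomorrow at 02:00
$null = New-AzureRmAutomationSchedule -AutomationAccountName "maintenance" `
    -Name "NightlyPause" `
    -StartTime (Get-Date "02:00:00").AddDays(1) `
    -DayInterval 1 `
    -ResourceGroupName "MyResourceGroup"

# Attach the schedule to the runbook
$null = Register-AzureRmScheduledRunbook -AutomationAccountName "maintenance" `
    -RunbookName "PauseEverything" `
    -ScheduleName "NightlyPause" `
    -ResourceGroupName "MyResourceGroup"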
Add Schedule

Summary
In this post you saw how you can pause all expensive services in an Azure playground environment. If co-workers don't want their service paused each night, they can skip it by adding a tag to that specific server or service. As mentioned before: this is not a complete list. Feel free to suggest more services that can be paused in the comments.

Saturday 30 December 2017

Pause everything on your Azure playground

Case
A lot of IT companies have Azure subscriptions that are not primarily for customers, but are more playgrounds to create or try out stuff for clients. A lot of times you try out services like Azure SQL Data Warehouse or Azure Analysis Services, where you are not paying for usage but because the service is turned on.

When you are playing around with those services and forget to turn them off afterwards, it can get a little costly, especially when you have dozens of colleagues also trying out all the cool Azure stuff. How do you prevent those unnecessarily high bills caused by forgotten services?
Pause everything

Solution
You should of course make some agreements about being careful with pricey services, but you can support that with a 'simple' technical solution: run a PowerShell script in an Azure Automation Runbook that pauses all often-used services each night. An exception list takes care of specific machines or servers that should not be paused. For this example we will pause the following Azure parts:
  • Azure Virtual Machines (not classics)
  • Azure SQL Data Warehouses
  • Azure Analysis Services
This is not a complete list, but once you understand these three, it should be relatively easy to add more services. Still too hard? Add a comment to suggest more services.

1) Automation Account
First we need an Azure Automation Account to run the Runbook with PowerShell code. If you don't have one or want to create a new one, search for Automation under Monitoring + Management and give it a suitable name like 'maintenance'. Then select your subscription, resource group and location. For this example I will choose West Europe since I'm from the Netherlands. Keep 'Create Azure Run As account' on Yes; we need it in the code. See step 3 for more details.
Azure Automation Account

2) Credentials
Next step is to create Credentials to run this runbook with. This works very similar to the Credentials in SQL Server Management Studio. Go to the Azure Automation Account and click on Credentials in the menu, then click on Add New Credentials. You could just use your own Azure credentials, but the best option is to use a service account with a non-expiring password; otherwise you need to change this regularly.
Create new credentials


3) Connections
This step is for your information only, to help you understand the code. Under Connections you will find a default connection named 'AzureRunAsConnection' that contains information about the Azure environment, like the tenant id and the subscription id. To prevent hardcoded connection details we will retrieve these fields in the PowerShell code.
Azure Connections

4) Variables
For the exception lists we will be using string variables to prevent hardcoded machine and service names in the code. Go to Variables and add three new string variables, one for each type of machine or service we need to pause:
  • ExceptionListVM
  • ExceptionListDWH
  • ExceptionListAAS
Fill them with a semicolon-separated list of names (for example VM1;VM2) or, if you do not have any exceptions, with just a semicolon (;). A scripted alternative is shown below.
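
A minimal sketch for creating one of the variables with hypothetical names, assuming the AzureRM.Automation module (repeat for the other two variables):
# PowerShell code snippet
$null = New-AzureRmAutomationVariable -AutomationAccountName "maintenance" `
    -Name "ExceptionListVM" `
    -Value "MyProductionVM;MyOtherVM" `
    -Encrypted $False `
    -ResourceGroupName "MyResourceGroup"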
Add string variables


5) Modules
The Azure Analysis Services methods (cmdlets) are in a separate PowerShell module which is not included by default. If you do not add this module you will get errors telling you that the method is not recognized. See below for more details.
The term 'Get-AzureRmAnalysisServicesServer' is not recognized
as the name of a cmdlet, function, script file, or operable program.

Go to the Modules page and check whether you see AzureRM.AnalysisServices in the list. If not then use the 'Browse gallery' button to add it, but first add AzureRM.Profile because the Analysis module will ask for it. Adding the modules could take a few minutes!
Add modules


6) Runbooks
Now it is time to add a new Azure Runbook for the PowerShell code. Click on Runbooks and then add a new runbook (There are also several example runbooks of which AzureAutomationTutorialScript could be useful as an example). Give your new Runbook a suitable name like 'PauseEverything' and choose PowerShell as type.
Add Azure Runbook


7) Edit Script
After clicking Create in the previous step the editor will be opened. When editing an existing Runbook you need to click on the Edit button first. You can copy and paste the code below into your editor. Study the green comments to understand the code. Also make sure to compare the variable names in the code to the ones created in step 4 and change them if necessary.
Edit the PowerShell code

# PowerShell code
# Connect to a connection to get TenantId and SubscriptionId
$Connection = Get-AutomationConnection -Name "AzureRunAsConnection"
$TenantId = $Connection.TenantId
$SubscriptionId = $Connection.SubscriptionId
  
# Get the service principal credentials connected to the automation account. 
$null = $SPCredential = Get-AutomationPSCredential -Name "Administrator"

# Login to Azure ($null is to prevent output, since Out-Null doesn't work in Azure)
Write-Output "Login to Azure using automation account 'Administrator'."
$null = Login-AzureRmAccount -TenantId $TenantId -SubscriptionId $SubscriptionId -Credential $SPCredential
  
# Select the correct subscription
Write-Output "Selecting subscription '$($SubscriptionId)'."
$null = Select-AzureRmSubscription -SubscriptionID $SubscriptionId
  
# Get variable values and split them into arrays
$ExceptionListAAS = (Get-AutomationVariable -Name 'ExceptionListAAS') -split ";"
$ExceptionListVM = (Get-AutomationVariable -Name 'ExceptionListVM') -split ";"
$ExceptionListDWH = (Get-AutomationVariable -Name 'ExceptionListDWH') -split ";"


################################
# Pause AnalysisServicesServers
################################
Write-Output "Checking Analysis Services Servers"
# Get list of all AnalysisServicesServers that are turned on (ProvisioningState = Succeeded)
$AnalysisServicesServers = Get-AzureRmAnalysisServicesServer |
Where-Object {$_.ProvisioningState -eq "Succeeded" -and $ExceptionListAAS -notcontains $_.Name}
# Loop through all AnalysisServicesServers to pause them
foreach ($AnalysisServicesServer in $AnalysisServicesServers)
{
    Write-Output "- Pausing Analysis Services Server $($AnalysisServicesServer.Name)"
    $null = Suspend-AzureRmAnalysisServicesServer -Name $AnalysisServicesServer.Name
}


################################
# Pause Virtual Machines
################################
Write-Output "Checking Virtual Machines"
# Get list of all Azure Virtual Machines that are not deallocated (PowerState <> VM deallocated)
$VirtualMachines = Get-AzureRmVM -Status |
Where-Object {$_.PowerState -ne "VM deallocated" -and $ExceptionListVM -notcontains $_.Name}
# Loop through all Virtual Machines to pause them
foreach ($VirtualMachine in $VirtualMachines)
{
    Write-Output "- Deallocating Virtual Machine $($VirtualMachine.Name) "
    $null = Stop-AzureRmVM -ResourceGroupName $VirtualMachine.ResourceGroupName -Name $VirtualMachine.Name -Force 
}
# Note: Classic Virtual machines are excluded with this script (use Get-AzureVM and Stop-AzureVM)


################################
# Pause SQL Data Warehouses
################################
Write-Output "Checking SQL Data Warehouses"
# Get list of all Azure SQL Servers
$SqlServers = Get-AzureRmSqlServer
# Loop through all SQL Servers to check if they host a DWH
foreach ($SqlServer in $SqlServers)
{
    # Get list of all SQL Data Warehouses (Edition=DataWarehouse) that are turned on (Status = Online)
    $SqlDatabases = Get-AzureRmSqlDatabase -ServerName $SqlServer.ServerName -ResourceGroupName $SqlServer.ResourceGroupName |
    Where-Object {$_.Edition -eq 'DataWarehouse' -and $_.Status -eq 'Online' -and $ExceptionListDWH -notcontains $_.DatabaseName} 
    # Loop through all SQL Data Warehouses to pause them
    foreach ($SqlDatabase in $SqlDatabases)
    {
        Write-Output "- Pausing SQL Data Warehouse $($SqlDatabase.DatabaseName)"
        $null = Suspend-AzureRmSqlDatabase -DatabaseName $SqlDatabase.DatabaseName -ServerName $SqlServer.ServerName -ResourceGroupName $SqlDatabase.ResourceGroupName
    }
}

Write-Output "Done"

Note 1: This is a very basic script. No error handling has been added. Check the AzureAutomationTutorialScript for an example. Fine-tune it for your own needs.
Note 2: There are often two versions of a method, like Get-AzureRmSqlDatabase and Get-AzureSqlDatabase. Always use the one with "Rm" in it (Resource Manager), because that one is for the new Azure portal. The version without Rm is for the old/classic Azure portal.
Note 3: Because Azure Automation doesn't support Out-Null, I used another trick with $null =. The Write-Outputs are for testing purposes only; nobody sees them when the runbook runs on its schedule.
Note 4: The code for Data Warehouses first loops through the SQL Servers and then through all databases on those servers, filtering on edition 'DataWarehouse'.

8) Testing
You can use the Test Pane menu option in the editor to test your PowerShell scripts. When you click on Run, the script is first queued before it starts. If nothing needs to be paused the script runs in about a minute, but pausing or deallocating items takes several minutes.
Testing the script in the Test Pane


9) Publish
When your script is ready, it is time to publish it. Above the editor click on the Publish button. Confirm overriding any previously published versions.
Publish the Runbook


10) Schedule
And now that we have a working and published Azure Runbook, we need to schedule it. Click on Schedule to create a new schedule for your runbook. For this pause-everything script I created a schedule that runs every day at 2:00 AM. This gives late-working colleagues more than enough time to play with all the Azure stuff before their services are paused.
Add Schedule

Summary
In this post you saw how you can pause all expensive services in a playground environment. If colleagues don't want their service paused, we can use the variables to skip that particular service. As mentioned before: this is not a complete list. Feel free to suggest more services that can be paused in the comments.

This script requires someone to maintain the exception list variables. I have created an alternative script that uses tags instead of centralized Runbook variables. This allows colleagues to add a certain tag to their service to avoid the pause, which gives them more control.

Update: code for classic Virtual Machines (for separate Runbook)
# PowerShell code
# Connect to a connection to get TenantId and SubscriptionId
$Connection = Get-AutomationConnection -Name "AzureRunAsConnection"
$TenantId = $Connection.TenantId
$SubscriptionId = $Connection.SubscriptionId
  
# Get the service principal credentials connected to the automation account. 
$null = $SPCredential = Get-AutomationPSCredential -Name "Administrator"

# Login to Azure ($null is to prevent output, since Out-Null doesn't work in Azure)
Write-Output "Login to Azure using automation account 'Administrator'."
$null = Add-AzureAccount -Credential $SPCredential -TenantId $TenantId

# Select the correct subscription (method without Rm)
Write-Output "Selecting subscription '$($SubscriptionId)'."
$null = Select-AzureSubscription -SubscriptionID $SubscriptionId

# Get variable values and split them into arrays
$ExceptionListVM = (Get-AutomationVariable -Name 'ExceptionListVM') -split ";"


#################################
# Pause Classic Virtual Machines
#################################
Write-Output "Checking Classic Virtual Machines"

# Get list of all Azure Virtual Machines that are not deallocated (Status <> StoppedDeallocated)
$VirtualMachines = Get-AzureVM  |
Where-Object {$_.Status -ne "StoppedDeallocated" -and $ExceptionListVM -notcontains $_.Name}

# Loop through all Virtual Machines to pause them
foreach ($VirtualMachine in $VirtualMachines)
{
    Write-Output "- Deallocating Classic Virtual Machine $($VirtualMachine.Name) ($($VirtualMachine.ServiceName))"
    $null = Stop-AzureVM -ServiceName $VirtualMachine.ServiceName -Name $VirtualMachine.Name -Force 
}


Write-Output "Done"
Note 1: The login method name is slightly different (Add-AzureAccount instead of Login-AzureRmAccount).
Note 2: The other methods use the version without Rm in the name: Stop-AzureRmVM => Stop-AzureVM.