Showing posts with label RUNBOOK.

Saturday, 10 October 2020

Snack: Create Azure Automation Runbook

Case
How do you write and execute PowerShell code in Azure?

Azure Automation PowerShell Runbook



















Solution
There are several options to execute PowerShell code in Azure, for example in an Azure Function or in an Azure DevOps pipeline. This post shows you how to write and execute PowerShell code in an Azure Automation Runbook. Other blog posts with PowerShell solutions for DWH projects will link to this blog post (to avoid repeating ourselves). You can find them here.

Note: the screenshots and animated gifs could become slightly outdated after the next layout change of the Azure portal, but we will try to update them occasionally.

1) Create Azure Automation Account
To create a PowerShell runbook we first need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step. The Run As Account allows you to log in and easily interact with other Azure services.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account (it will probably host multiple runbooks)
  • Select your Subscription, Resource Group and the Region
  • Most examples will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Modules
Before you start writing code you often first need to add some PowerShell modules to your Azure Automation Account. For this example we will add a PowerShell module called Az.Sql, which is for example used to up- and downscale Azure SQL databases. Note that modules often depend on other modules; this one depends on Az.Accounts. Make sure to add those modules first, although you will be notified if another module is required. Try to avoid the outdated AzureRm modules and use the Az modules instead (you cannot mix them).

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzSqlDatabase : The term 'Get-AzSqlDatabase' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.Sql (but wait until Az.Accounts is actually imported)
  • Click on Az.Sql in the result and import this module
Adding a new module to your Automation Account

Note: if you are using an existing Automation Account then you probably already added Az.Accounts, but you might need to update it to a newer version.
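Tip: if you prefer scripting over clicking, you can also import a module from the PowerShell Gallery with the Az.Automation cmdlets. A minimal sketch, run on your local machine after Connect-AzAccount; the account names are placeholders and the gallery package link is an assumption, so check the module page on powershellgallery.com for the exact URI:
# PowerShell code
# Import Az.Sql from the PowerShell Gallery into the Automation Account
# (account and resource group names are placeholders)
New-AzAutomationModule -ResourceGroupName "bitools" `
                       -AutomationAccountName "bitools-automation" `
                       -Name "Az.Sql" `
                       -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/Az.Sql"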

2b) Update Modules
Sometimes you need to update one or more modules within your Automation Account. Not sure why, but unfortunately they removed this option from the Azure Automation GUI! More information here. The new way is to import PowerShell code to a new runbook and then run that runbook to update all your modules (or manually delete all the modules one by one and then add them again).
  • Go to GitHub and click on the little triangle on the Code button to download the zip file
  • Unzip the downloaded file
  • Go to Runbooks and click on Import a runbook
  • Import the local file Update-AutomationAzureModulesForAccount.ps1
  • Give the runbook a description (Name and Runbook type are automatically populated)
  • Edit the runbook and open the Test pane
  • Enter values for the parameters: besides ResourceGroupName and AutomationAccountName, set AzureModuleClass to Az (the default value is AzureRm)
  • Hit the Start button and check the output
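As an alternative to the Test pane you can also publish and start the imported runbook from your own machine with the Az.Automation cmdlets. A minimal sketch, assuming you are logged in with Connect-AzAccount and kept the default runbook name (account and resource group names are placeholders):
# PowerShell code
# Publish the imported runbook and start it as a job with the parameters from above
Publish-AzAutomationRunbook -ResourceGroupName "bitools" `
                            -AutomationAccountName "bitools-automation" `
                            -Name "Update-AutomationAzureModulesForAccount"

Start-AzAutomationRunbook -ResourceGroupName "bitools" `
                          -AutomationAccountName "bitools-automation" `
                          -Name "Update-AutomationAzureModulesForAccount" `
                          -Parameters @{ResourceGroupName = "bitools"; AutomationAccountName = "bitools-automation"; AzureModuleClass = "Az"}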
Update all modules

3) Create Runbook
Now we are ready to create a PowerShell runbook in the newly created Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook like UpDownScaleDb
  • Select PowerShell as Runbook type
  • Add a short description to explain the purpose of your code and click on the Create button
Create PowerShell Runbook

4) Edit Runbook code
Next edit the new Runbook if it wasn't already opened by the previous step and start writing (or pasting) code in the editor. The code often consists of three parts: parameters, the login and the actual code. Make sure not to store secrets in your code, but use Azure Key Vault instead. If you want to send a notification from within your runbook, use SendGrid.
# PowerShell example code for testing
Param
(
    # Get your name
    [Parameter(Mandatory=$true,Position=1)]
    [ValidateLength(1,50)]
    [string]$Name
)

Write-Output "Hello $($Name)"

5) Testing
It is often easier to write and test the main code first in PowerShell ISE or Visual Studio Code on your local machine, which is much faster and easier to debug. But to test the complete code, including the parameters and the login, you can test your code within the runbook editor. Note that it can take over a minute before the code executes, since it will first enter an execution queue.
  • Click on the Test pane button above your script.
  • Then optionally fill in the parameter values on the left side
  • And then hit the Start button to execute the script.
Testing your runbook

6) Scheduling Runbook
If you want to schedule the execution of your runbook in Azure Automation you first need to publish it via the Runbook editor. After it has been published you can add a schedule to this runbook.
  • Edit the script in the runbook editor
  • Click on publish (the editor will close and you will be redirected to the overview page)
  • In the overview page click on Link to schedule
  • In the Schedule menu you can select an existing schedule or create a new one
  • In the Parameter menu you can provide the value for the parameters
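You can also create and link the schedule with PowerShell instead of the portal. A minimal sketch with the Az.Automation cmdlets, assuming the runbook has already been published; all names and the parameter value below are placeholders:
# PowerShell code
# Create a daily schedule and link it to the published runbook
$resourceGroupName = "bitools"
$automationAccountName = "bitools-automation"

# Schedule that starts tomorrow and repeats every day at 07:00
New-AzAutomationSchedule -ResourceGroupName $resourceGroupName `
                         -AutomationAccountName $automationAccountName `
                         -Name "Daily7AM" `
                         -StartTime (Get-Date "07:00").AddDays(1) `
                         -DayInterval 1

# Link the runbook to the schedule and provide its parameter values
Register-AzAutomationScheduledRunbook -ResourceGroupName $resourceGroupName `
                                      -AutomationAccountName $automationAccountName `
                                      -RunbookName "UpDownScaleDb" `
                                      -ScheduleName "Daily7AM" `
                                      -Parameters @{Name = "Joost"}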
Add schedule to runbook

7) Add Webhook
If you don't want to schedule your Runbook, but call it from another service like Azure Data Factory (ADF), you have to create a Webhook for ADF.

Make sure to choose a sensible expiration date; by default it is only valid for one year. Also make sure to copy the URL, because you can only see and copy it once. You need this URL in ADF for the Web(hook) activity. Don't share the URL, because it is a URL and password in one.
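To test such a webhook outside ADF you can simply do an HTTP POST to it, for example with PowerShell. A minimal sketch; the URL below is a placeholder, and note that a runbook started via a webhook receives its input via a single WebhookData parameter, so the parameter handling in the runbook itself differs from the example above:
# PowerShell code
# Call the webhook; replace the URL with the one you copied when creating the webhook
$webhookUrl = "https://s2events.azure-automation.net/webhooks?token=xxxxxxxxxxxx"

# Optional: parameters for the runbook, passed as a JSON body
$body = @{Name = "Joost"} | ConvertTo-Json

# Azure Automation queues a job and returns its job id(s)
$response = Invoke-RestMethod -Uri $webhookUrl -Method Post -Body $body
$response.JobIds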
Adding a Webhook to a Runbook

Summary
In this post we gave a general explanation of how to create and use an Azure Automation PowerShell Runbook. The real solutions are available here, and from now on the posts about Azure Automation will link to this post for the general steps.

Friday, 2 October 2020

Pause all/an Azure Synapse SQL Pool(s) (Az)

Case
I want to schedule a pause of my Azure Synapse SQL Pools to save some money on my Azure bill. Back in 2017, when Synapse was still called SQL Data Warehouse, we wrote a post with the AzureRM modules, which are now outdated. How does it work with the Az modules?
Pause and resume Synapse SQL Pools

Solution
For this example we have two scripts. The first pauses all Azure Synapse SQL Pools within an Azure subscription. This is especially handy for Development, Test or Acceptance environments where you only turn on the Synapse SQL Pools when you need to develop or test something. When you forget to pause them afterwards, this script will pause them all at a scheduled moment. The second script is for pausing (or resuming) one specific Synapse SQL Pool. That is probably more suitable for a production environment where you don't want to ruthlessly pause all Synapse SQL Pools.

Note: All basics to create your first Azure Automation Runbook can be found here. Combine that with the code below.

1) Modules
If you want to use this script in an Azure Automation Runbook you first need to add the module Az.Sql, and that module requires the module Az.Accounts to be installed first. If you already added AzureRM modules to your Azure Automation account then that is no problem, as long as you don't mix them within a single runbook. On your own PC you cannot have both installed: you first need to remove all AzureRM modules before you can add the Az modules.
  • Go to your Azure Automation account in the Azure Portal
  • Click on Modules in the left menu
  • Click on Browse Gallery (not on Add a module)
  • Search for Az.Sql and click on it
  • Next click on Import and then on the Ok button
Note 1: If you haven't installed Az.Accounts yet, it will ask you to do that first.
Note 2: It takes a few minutes to import a module.
Adding a module

2) Login
If you want to run the script in PowerShell ISE then you first need to log in and, if you have multiple subscriptions, select your subscription.
# PowerShell code
# Login to Azure (browser popup will appear)
Connect-AzAccount -Confirm

# Optional: select your subscription
Set-AzContext -SubscriptionName "mysubscription"
Login to Azure with PowerShell ISE

In your Runbook you need to add the following code instead, which uses your Run As Account to log in. Please read this blog post for all details.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName        
 
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

3) The pause all script
Below the login code we add the following script, which first gets all Azure SQL Servers and then checks whether they also host a Synapse SQL Pool. When one is found it checks the status. If it is still Online (active) then it will pause it, which will take several minutes. Afterwards it rechecks the status and shows how long the pause took. Feel free to add more checks and let us know in the comments what you added.
# PowerShell code
# Get all SQL Servers to check whether they host a Synapse SQL Pool
$allSqlServers = Get-AzSqlServer

# Loop through all SQL Servers
foreach ($sqlServer in $allSqlServers)
{
    # Log which SQL Servers are checked and in which resource group
    Write-Output "Checking SQL Server [$($sqlServer.ServerName)] in Resource Group [$($sqlServer.ResourceGroupName)] for Synapse SQL Pools"
    
    # Get all databases from a SQL Server, but filter on Edition = "DataWarehouse"
    $allSynapseSqlPools = Get-AzSqlDatabase -ResourceGroupName $sqlServer.ResourceGroupName `
                                           -ServerName $sqlServer.ServerName `
                                           | Where-Object {$_.Edition -eq "DataWarehouse"}
    # Loop through each found Synapse SQL Pool
    foreach ($synapseSqlPool in $allSynapseSqlPools)
    {
        # Show status of found Synapse SQL Pool
        # Available statuses: Online Paused Pausing Resuming
        Write-Output "Synapse SQL Pool [$($synapseSqlPool.DatabaseName)] found with status [$($synapseSqlPool.Status)]"
        
        # If status is online then pause Synapse SQL Pool
        if ($synapseSqlPool.Status -eq "Online")
        {
            # Pause Synapse SQL Pool
            $startTimePause = Get-Date
            Write-Output "Pausing Synapse SQL Pool [$($synapseSqlPool.DatabaseName)]"
            $resultsynapseSqlPool = $synapseSqlPool | Suspend-AzSqlDatabase

            # Show that the Synapse SQL Pool has been paused and how long it took
            $endTimePause = Get-Date
            $durationPause = NEW-TIMESPAN -Start $startTimePause -End $endTimePause
            $synapseSqlPool = Get-AzSqlDatabase -ResourceGroupName $sqlServer.ResourceGroupName `
                                                -ServerName $sqlServer.ServerName `
                                                -DatabaseName $synapseSqlPool.DatabaseName
            Write-Output "Synapse SQL Pool [$($synapseSqlPool.DatabaseName)] paused in $($durationPause.Hours) hours, $($durationPause.Minutes) minutes and $($durationPause.Seconds) seconds. Current status [$($synapseSqlPool.Status)]"
        }
    }
}
In PowerShell ISE the result will look like this. To see the result of your runbook you need to check the jobs and the output.
Running the script in PowerShell ISE

4) The pause one script - parameters
To select only one Synapse SQL Pool we need to provide three parameters: the Resource Group name, the SQL Server name and the name of the SQL Pool. For this we add parameter code at the beginning of the complete script (above the login). You can add parameter validations to make it more monkey proof.
# PowerShell code
<#
    .SYNOPSIS
        Pause an Azure Synapse SQL Pool
    .DESCRIPTION
        By providing the following parameters you can pause one
        specific Azure Synapse SQL Pool. It will only pause when
        the status is 'online'

    .PARAMETER resourceGroupName
        This is the Resource group where Azure Synapse Analytics
        SQL Pool is located

    .PARAMETER sqlServerName
        This is the name of the Azure SQL Server hosting the Azure
        Synapse Analytics SQL Pool
    
    .PARAMETER SynapseSqlPoolName
        This is the name of the Azure Synapse Analytics SQL Pool

#>
Param(
    # This is the Resource group where Azure Synapse Analytics SQL Pool is located   
    [Parameter(Mandatory=$True)]  
    [String] $resourceGroupName
    ,
    # This is the name of the Azure SQL Server hosting the Azure Synapse Analytics SQL Pool
    [Parameter(Mandatory=$True)]  
    [String] $sqlServerName
    ,
    # This is the name of the Azure Synapse Analytics SQL Pool
    [Parameter(Mandatory=$True)]  
    [String] $SynapseSqlPoolName
) 

5) The pause one script
Now the actual script to pause only one SQL Pool. This replaces the script of step 3. If you also want a resume script you just have to replace Suspend-AzSqlDatabase with Resume-AzSqlDatabase and of course change some of the texts and the if statement (Online => Paused). You could also merge the pause and resume scripts into one by adding an extra parameter that indicates what you want to do (see the sketch below the script).
# PowerShell code
# Get one specific Synapse SQL Pool
$synapseSqlPool = Get-AzSqlDatabase -ResourceGroupName $resourceGroupName `
                                    -ServerName $sqlServerName `
                                    -DatabaseName $SynapseSqlPoolName `
                                    | Where-Object {$_.Edition -eq "DataWarehouse"}

# Check if the Synapse SQL Pool can be found with the provided parameters
if ($synapseSqlPool)
{
    # Show status of found Synapse SQL Pool
    # Available statuses: Online Paused Pausing Resuming
    Write-Output "Synapse SQL Pool [$($synapseSqlPool.DatabaseName)] found with status [$($synapseSqlPool.Status)]"

    # If status is online then pause Synapse SQL Pool
    if ($synapseSqlPool.Status -eq "Online")
    {
        # Pause Synapse SQL Pool
        $startTimePause = Get-Date
        Write-Output "Pausing Synapse SQL Pool [$($synapseSqlPool.DatabaseName)]"
        $resultsynapseSqlPool = $synapseSqlPool | Suspend-AzSqlDatabase

        # Show that the Synapse SQL Pool has been paused and how long it took
        $endTimePause = Get-Date
        $durationPause = NEW-TIMESPAN -Start $startTimePause -End $endTimePause
        $synapseSqlPool = Get-AzSqlDatabase -ResourceGroupName $resourceGroupName `
                                            -ServerName $sqlServerName `
                                            -DatabaseName $SynapseSqlPoolName
        Write-Output "Synapse SQL Pool [$($synapseSqlPool.DatabaseName)] paused in $($durationPause.Hours) hours, $($durationPause.Minutes) minutes and $($durationPause.Seconds) seconds. Current status [$($synapseSqlPool.Status)]"
    }
}
else
{
    Throw "Synapse SQL Pool [$($SynapseSqlPoolName)] not found. Check parameter values."
}
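
A minimal sketch of that merged version, assuming you add an extra parameter (here called SynapseSqlPoolAction, our own name) to the Param block of step 4 and keep the rest of the script the same; the duration logging is left out for brevity:
# PowerShell code
# Extra parameter to add to the Param block of step 4:
#     [Parameter(Mandatory=$True)]
#     [ValidateSet('Pause','Resume')]
#     [String] $SynapseSqlPoolAction

# Pause or resume depending on the extra parameter and the current status
if ($SynapseSqlPoolAction -eq "Pause" -and $synapseSqlPool.Status -eq "Online")
{
    Write-Output "Pausing Synapse SQL Pool [$($synapseSqlPool.DatabaseName)]"
    $null = $synapseSqlPool | Suspend-AzSqlDatabase
}
elseif ($SynapseSqlPoolAction -eq "Resume" -and $synapseSqlPool.Status -eq "Paused")
{
    Write-Output "Resuming Synapse SQL Pool [$($synapseSqlPool.DatabaseName)]"
    $null = $synapseSqlPool | Resume-AzSqlDatabase
}
else
{
    Write-Output "Nothing to do for [$($synapseSqlPool.DatabaseName)], current status [$($synapseSqlPool.Status)]"
}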

Running the script in PowerShell ISE

Summary
In this post you saw how to pause one or more Synapse SQL Pools to save some money on your Azure bill, but note that the storage costs will continue when you pause Synapse. The next step is to either schedule it within your Azure Automation account or to add a Webhook and execute it from another service like Azure Data Factory.

Besides scripting you can also use the REST API of Synapse, which is particularly easy if you want to pause or resume from within an ADF pipeline, but that will be explored in another blog post.

Friday, 31 January 2020

Schedule Up/Downscale Azure SQL Database (Az)

Case
To save some money on my Azure bill, I want to downscale the vCores or the number of DTUs for my Azure SQL Databases when we don't need high performance, and then upscale them again when for example the ETL starts. How do you arrange that in Azure?
Change the Tier of your SQL Azure DB

Solution
A few years ago we showed you how to do this with some PowerShell code in an Azure Automation Runbook with the AzureRM modules. However, these old modules will be out of support by the end of 2020. So now it is time to change those scripts.

The script works for both Database Transaction Units (DTU) and Virtual Cores (vCore). The DTU model uses a Service Tier (Basic, Standard and Premium) and then the actual Performance Level (B, S0 to S12, P1 to P15), which indicates the number of used DTUs. More DTUs means more performance and of course more costs.

The newer vCore model is now recommended by Microsoft. It is a bit more flexible and transparent: it actually shows you the processor, memory and IOPS. But the lowest vCore price is much more expensive than the lowest DTU price, so for development and testing purposes you probably still want to use DTU. The vCore model also uses a Service Tier (GeneralPurpose, BusinessCritical) and then the number of vCores combined with the hardware generation (e.g. GP_Gen4_1 or GP_Gen5_2).

Use the Microsoft documentation to compare both models, but as a rough comparison: S3 with 100 DTUs is about the same as General Purpose with 1 vCore, and P1 with 125 DTUs is about the same as Business Critical with 1 vCore.
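To make that concrete, below are two example calls with the same Set-AzSqlDatabase cmdlet that the script further down uses: the first sets a DTU tier, the second a vCore tier. The server, database and resource group names are placeholders:
# PowerShell code
# DTU model: Standard tier, performance level S3 (100 DTUs)
Set-AzSqlDatabase -ResourceGroupName "bitools" -ServerName "bitools-sqlserver" -DatabaseName "bitools-db" `
                  -Edition "Standard" -RequestedServiceObjectiveName "S3"

# vCore model: General Purpose tier, Gen5 hardware, 2 vCores
Set-AzSqlDatabase -ResourceGroupName "bitools" -ServerName "bitools-sqlserver" -DatabaseName "bitools-db" `
                  -Edition "GeneralPurpose" -RequestedServiceObjectiveName "GP_Gen5_2"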

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.Sql
Before we start writing some code we need to add a PowerShell module called Az.Sql. This module contains methods we need in our code to upscale or downscale the Azure SQL database, but first we need to add Az.Accounts because Az.Sql depends on it.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzSqlDatabase : The term 'Get-AzSqlDatabase' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.Sql (but wait until Az.Accounts is actually imported)
  • Click on Az.Sql in the result and import this module
Adding a new module to your Automation Account

Note: if you are using an existing Automation Account then you probably already added Az.Accounts.

3) Create Runbook
Now we are ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook like UpDownScaleDb
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create a Runbook

4) Edit Runbook code
Next edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code that can upscale and downscale your Azure SQL DB. It consists of five parts:
  1. Parameters
  2. Log in to Azure
  3. Get current pricing tier
  4. Upscale or downscale
  5. Logging
Parameters
To up- or downscale, the script needs five parameters. The first three parameters 'ResourceGroupName', 'ServerName' and 'DatabaseName' indicate for which database the DTU or vCore needs to be changed. The fourth and fifth parameters 'Edition' and 'PricingTier' are used to up- or downscale your DB. There are a couple of parameter validations which you could extend to make your script even more monkey proof.
Note: if you want to call this script via Azure Data Factory (ADF), then you need to change the parameter part. You can find all the details to do that in our blog posts about Runbook parameters and ADF and using the Webhook activity in ADF. If this is your first time creating a runbook then first try the standard script and then adjust it to your needs.

Log in to Azure
This is a standard piece of code that you will see in all of our examples. Please read our blog post about the Azure Run as Account for more detailed information.

Get current pricing tier
This piece of code tests whether it can find the database and gets its current pricing tier. It stores the current pricing tier and uses it later on for an extra check when upscaling or downscaling the DB.

Upscale or downscale
This is the actual code for changing the database pricing tier. There is an extra check that compares the current pricing tier with the new pricing tier. It now throws an error when they are equal. You could change that to write a warning instead of an error.
Note: you could also send emails to notify you of any errors

Logging
The last piece of code is for logging purposes. It shows you that the pricing tier of the DB (or DWH) was successfully changed and how long it took to accomplish that.

# PowerShell code
########################################################
# Parameters
########################################################
[CmdletBinding()]
param(
    [Parameter(Mandatory=$True,Position=0)]
    [ValidateLength(1,100)]
    [string]$ResourceGroupName,

    [Parameter(Mandatory=$True,Position=1)]
    [ValidateLength(1,100)]
    [string]$ServerName,

    [Parameter(Mandatory=$True,Position=2)]
    [ValidateLength(1,100)]
    [string]$DatabaseName,

    [Parameter(Mandatory=$False,Position=3)]
    [ValidateLength(1,100)]
    [string]$Edition,
    
    [Parameter(Mandatory=$False,Position=4)]
    [ValidateLength(1,100)]
    [string]$PricingTier
)

# Keep track of time
$StartDate=(GET-DATE)



########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################
 


########################################################
# Getting the database for testing and logging purposes
########################################################
$MyAzureSqlDatabase = Get-AzSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName
if (!$MyAzureSqlDatabase)
{
    Write-Error "$($ServerName)\$($DatabaseName) not found in $($ResourceGroupName)"
    return
}
else
{
    Write-Output "Current pricing tier of $($ServerName)\$($DatabaseName): $($MyAzureSqlDatabase.Edition) - $($MyAzureSqlDatabase.CurrentServiceObjectiveName)"
}



########################################################
# Set Pricing Tier Database
########################################################
# Check for incompatible actions
if ($MyAzureSqlDatabase.Edition -eq $Edition -And $MyAzureSqlDatabase.CurrentServiceObjectiveName -eq $PricingTier)
{
    Write-Error "Cannot change pricing tier of $($ServerName)\$($DatabaseName) because the new pricing tier is equal to current pricing tier"
    return
}
else
{
    Write-Output "Changing pricing tier to $($Edition) - $($PricingTier)"
    $null = Set-AzSqlDatabase -DatabaseName $DatabaseName -ServerName $ServerName -ResourceGroupName $ResourceGroupName -Edition $Edition -RequestedServiceObjectiveName $PricingTier
}



########################################################
# Show when finished
########################################################
$Duration = NEW-TIMESPAN -Start $StartDate -End (GET-DATE)
Write-Output "Done in $([int]$Duration.TotalMinutes) minute(s) and $([int]$Duration.Seconds) second(s)"

Note: once your database grows to a certain size you cannot downscale it to certain levels, because there is a maximum database size per pricing tier. For example, if your DB is larger than 250 GB then you cannot downscale it below S3. There are size checks that you could add to avoid errors (see the sketch below).
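A minimal sketch of such a check, to be placed below the 'Getting the database' section. The 250 GB value is just an example limit, and note that this compares the configured max size of the database, not the actual used space:
# PowerShell code
# Extra check: compare the configured max size with the limit of the target pricing tier
$maxSizeTargetTierGB = 250
$currentMaxSizeGB = $MyAzureSqlDatabase.MaxSizeBytes / 1GB

if ($currentMaxSizeGB -gt $maxSizeTargetTierGB)
{
    Write-Error "Cannot downscale $($ServerName)\$($DatabaseName) because its max size ($($currentMaxSizeGB) GB) exceeds the $($maxSizeTargetTierGB) GB limit of the target pricing tier"
    return
}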

5) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that you need to fill in the five parameters and hit the Start button to execute the script.
Testing a Powershell script

6) Scheduling Runbook
To schedule your runbook in Azure Automation you first need to publish it via the Runbook editor. After it has been published you can add a schedule to this runbook.
  • Edit the script in the runbook editor
  • Click on publish (the editor will close and you will be redirected to the overview page)
  • In the overview page click on Link to schedule
  • In the Schedule menu you can select an existing schedule or create a new one
  • In the Parameter menu you can provide the value for the five parameters
Add schedule to a runbook

Note: If you have multiple Azure SQL Databases that you all want to up- or downscale at the same time then you have a slight problem, because you cannot reuse a schedule for the same runbook multiple times with different parameters (please upvote or add a comment). Workarounds:
  1. create multiple identical schedules (ugly but works)
  2. do everything in one big script (less flexible but works, see the sketch below)
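A minimal sketch of workaround 2: one runbook that loops over a fixed list of databases and applies the same Set-AzSqlDatabase call as above. Keep the standard login code; all names and tiers below are placeholders:
# PowerShell code
# One big script: change the pricing tier of a fixed list of databases
$databases = @(
    @{ResourceGroupName = "bitools"; ServerName = "bitools-sqlserver"; DatabaseName = "SalesDb";   Edition = "Standard"; PricingTier = "S0"},
    @{ResourceGroupName = "bitools"; ServerName = "bitools-sqlserver"; DatabaseName = "StagingDb"; Edition = "Standard"; PricingTier = "S1"}
)

foreach ($db in $databases)
{
    Write-Output "Changing pricing tier of $($db.ServerName)\$($db.DatabaseName) to $($db.Edition) - $($db.PricingTier)"
    $null = Set-AzSqlDatabase -ResourceGroupName $db.ResourceGroupName `
                              -ServerName $db.ServerName `
                              -DatabaseName $db.DatabaseName `
                              -Edition $db.Edition `
                              -RequestedServiceObjectiveName $db.PricingTier
}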
Log of runbook executions

Details of single execution

Summary
In this blog post you learned how to schedule an up- or downscale of your Azure SQL DB (or DWH) to save money in case you don't need high performance 24 hours a day. Scheduling is done in Azure Automation, but with some minor changes you can also do that via an ADF pipeline.

The PowerShell scripting gives you the most flexibility, for example to add additional tests and notifications, but for those who don't like scripting: in a next blog post we will show you how to accomplish this with T-SQL code only. You can either use that for an on-the-fly up- or downscale or use it in a Stored Procedure Activity in Azure Data Factory.

Sunday, 5 January 2020

Schedule start and stop of Azure Virtual Machine (Az)

Case
There is an option to automatically shutdown your Azure Virtual Machine at a certain time, but where is the option to automatically start it at a certain time?
Auto-shutdown Azure VM

Solution
In 2017 we showed you how to solve this with a scheduled PowerShell Runbook in Azure Automation. That post used the old AzureRM PowerShell modules, which will no longer be supported by the end of 2020. Therefore we have rewritten the code to use the Az PowerShell modules and made a few tweaks and extra checks. It can also shut down the VM if you want to do everything in one place.

1) Create Automation Account
First we need to create an Automation Account to host our PowerShell code. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.Compute
Before we start writing code we need to add a PowerShell module called Az.Compute. This module contains Virtual Machine methods we need in our code. But first we need to add Az.Accounts because Az.Compute depends on it.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzVM : The term 'Get-AzVM' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.Compute (but wait until Az.Accounts is actually imported)
  • Click on Az.Compute in the result and import this module
Adding a new module to your Automation Account

Note: if you are using an existing Automation Account then you probably already added Az.Accounts.

3) Runbook
Now we are ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook like StartStopVM
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create a Runbook

4) Edit Script
Next edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code that can both start and stop your Azure Virtual Machine (VM). It consists of five parts:
  1. Parameters
  2. Log in to Azure
  3. Get current state
  4. Start or Stop
  5. Logging
Parameters
To start or stop the VM the script needs three parameters. The first parameter 'VirtualMachineAction' is a string that indicates whether you want to stop or start the VM. The second parameter 'ResourceGroupName' indicates the location (resource group) of your VM and the last parameter 'VirtualMachineName' is the name of your VM. There are a couple of validations which you could extend to make your script even more monkey proof.

Log in to Azure
This is a standard piece of code that you will see in all of our examples. Please read our blog post about the Azure Run as Account for more detailed information.

Get current state
This piece of code tests whether it can find the VM and gets its current state. It stores the current state and uses it later on for an extra check when stopping or starting the VM.

Stop or Start
This is the actual code for stopping or starting the VM. There is an extra check that compares the current state with the new desired state. It now throws an error when you want to stop a VM that is already stopped. You could change that to write a warning instead of an error.
Error

Note: you could also send emails to notify you of any errors

Logging
The last piece of code is for logging purposes. It shows you that it successfully changed the state of the VM and how long it took to accomplish that.

# PowerShell code

########################################################
# Parameters
########################################################
[CmdletBinding()]
param(
    [Parameter(Mandatory=$True,Position=0)]
    [ValidateSet('Start','Stop')]
    [string]$VirtualMachineAction,
    
    [Parameter(Mandatory=$True,Position=1)]
    [ValidateLength(1,100)]
    [string]$ResourceGroupName,

    [Parameter(Mandatory=$True,Position=2)]
    [ValidateLength(1,100)]
    [string]$VirtualMachineName
)

# Keep track of time
$StartDate=(GET-DATE)



########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################
 



########################################################
# Getting the VM for testing and logging purposes
########################################################
$myVirtualMachine = Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VirtualMachineName
if (!$myVirtualMachine)
{
    Write-Error "$($VirtualMachineName) not found in $($ResourceGroupName)"
    return
}
else
{
    # Get status of VM when user provides data for existing VM
    $myVirtualMachineState = ((Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VirtualMachineName -Status).Statuses | Where {$_.code -like 'PowerState/*'}).Code
    Write-Output "Current status of $($VirtualMachineName): $($myVirtualMachineState)"
}



########################################################
# Start or Stop VM
########################################################
# Check for incompatible actions
if (($VirtualMachineAction -eq "Start" -And $myVirtualMachineState -eq "PowerState/running") -Or ($VirtualMachineAction -eq "Stop" -And $myVirtualMachineState -eq "PowerState/deallocated"))
{
    Write-Error "Cannot $($VirtualMachineAction) $($VirtualMachineName) while the status is $($myVirtualMachineState)"
    return
}
# Start the Virtual Machine
elseif ($VirtualMachineAction -eq "Start")
{
    Write-Output "Now starting $($VirtualMachineName)"
    Start-AzVM -ResourceGroupName $ResourceGroupName -Name $VirtualMachineName -Confirm:$false
}
# Stop the Virtual Machine
else
{
    Write-Output "Now stopping $($VirtualMachineName)"
    $null = Stop-AzVM -ResourceGroupName $ResourceGroupName -Name $VirtualMachineName -Confirm:$false -Force
}



########################################################
# Show when finished
########################################################
$Duration = NEW-TIMESPAN -Start $StartDate -End (GET-DATE)
Write-Output "Done in $([int]$Duration.TotalMinutes) minute(s) and $([int]$Duration.Seconds) second(s)"

5) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that you need to fill in the parameters and hit the Start button to execute the script.
Testing your script

6) Scheduling Runbook
To schedule your runbook in Azure Automation you first need to publish it via the Runbook editor. After it has been published you can add a schedule to this runbook.
  • Edit the script in the runbook editor
  • Click on publish (the editor will close and you will be redirected to the overview page)
  • In the overview page click on Link to schedule
  • In the Schedule menu you can select an existing schedule or create a new one
  • In the Parameter menu you can provide the value for the parameters
Add schedule to a runbook

Note: If you have multiple Azure Virtual Machines that you all want to start at the same time then you have a slight problem, because you cannot reuse a schedule for the same runbook multiple times with different parameters (please upvote or add a comment). Workarounds:
  1. create multiple identical schedules (ugly but works)
  2. do everything in one big script (less flexible but works, see the sketch below)
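A minimal sketch of workaround 2 for VMs: one runbook that starts every VM carrying a certain tag, reusing the login code from above. The tag name 'AutoStart' is just an example of our own, not an Azure convention:
# PowerShell code
# One big script: start every VM in the subscription that has the tag AutoStart = True
$virtualMachines = Get-AzVM | Where-Object { $_.Tags -ne $null -and $_.Tags["AutoStart"] -eq "True" }

foreach ($virtualMachine in $virtualMachines)
{
    Write-Output "Now starting $($virtualMachine.Name)"
    $null = Start-AzVM -ResourceGroupName $virtualMachine.ResourceGroupName -Name $virtualMachine.Name -Confirm:$false
}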
Log of runbook executions (start VM on weekdays)

Summary
In this post you saw how you can start an Azure Virtual Machine with PowerShell, because the menu only supports auto-shutdown. With just a few lines of code and a schedule you can accomplish an auto-startup.
If you also add a webhook to this runbook then you can call this runbook from other applications, for example Microsoft Power Automate (Flow) or a Power App on your mobile. In a future blog post we will show that.

Tuesday, 31 December 2019

Schedule start & stop of Azure Analysis Services (Az)

Case
To save some money on my Azure Bill, I want to pause my Azure Analysis Services (AAS) at night when nobody is using it and then resume it in the morning. How do you arrange that in Azure?
Save some money on your Azure Bill by pausing AAS

Solution
A few years ago we showed you how to do this with some PowerShell code in an Azure Automation Runbook with the AzureRM modules. However, these old modules will be out of support by the end of 2020. So now it is time to change those scripts.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.AnalysisServices
Before we start writing some code we need to add a PowerShell module called Az.AnalysisServices. This module contains methods we need in our code to pause and resume Azure Analysis Services. But first we need to add Az.Accounts because Az.AnalysisServices depends on it.

If you forget this step you will get error messages while running your code that state that some of your commands are not recognized:
Get-AzAnalysisServicesServer : The term 'Get-AzAnalysisServicesServer' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.AnalysisServices (but wait until Az.Accounts is actually imported)
  • Click on Az.AnalysisServices in the result and import this module
Adding a new module to your Automation Account

Note: if you are using an existing Automation Account then you probably already added Az.Accounts.

3) Create Runbook
Now we are ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook like StartStopAas
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button
Create a Runbook

4) Edit Runbook code
Next edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code that can both start and stop your Azure Analysis Services (AAS). It consists of five parts:
  1. Parameters
  2. Log in to Azure
  3. Get current state
  4. Pause or Resume
  5. Logging
Parameters
To pause or resume the AAS the script needs three parameters. The first parameter 'AasAction' is a string that indicates whether you want to stop or start the AAS. The second parameter 'ResourceGroupName' indicates the location (resource group) of your AAS and the last parameter 'AnalysisServerName' is the name of your AAS. There are a couple of validations which you could extend to make your script even more monkey proof.
Note: if you want to call this script via Azure Data Factory (ADF), then you need to change the parameter part. You can find all the details to do that in our blog posts about Runbook parameters and ADF and using the Webhook activity in ADF. If this is your first time creating a runbook then first try the standard script and then adjust it to your needs.

Log in to Azure
This is a standard piece of code that you will see in all of our examples. Please read our blog post about the Azure Run as Account for more detailed information.

Get current state
This piece of code tests whether it can find the AAS and gets its current state. It stores the current state and uses it later on for an extra check when pausing or resuming the AAS.

Pause or Resume
This is the actual code for pausing or resuming the AAS. There is an extra check that compares the current state with the new desired state. It now throws an error when you want to pause an AAS that is already paused. You could change that to write a warning instead of an error.
Note: you could also send emails to notify you of any errors

Logging
The last piece of code is for logging purposes. It shows you that it successfully changed the state of the AAS and how long it took to accomplish that.

# PowerShell code

########################################################
# Parameters
########################################################
[CmdletBinding()]
param(
    [Parameter(Mandatory=$True,Position=0)]
    [ValidateSet('Start','Stop')]
    [string]$AasAction,
    
    [Parameter(Mandatory=$True,Position=1)]
    [ValidateLength(1,100)]
    [string]$ResourceGroupName,

    [Parameter(Mandatory=$True,Position=2)]
    [ValidateLength(1,100)]
    [string]$AnalysisServerName
)

# Keep track of time
$StartDate=(GET-DATE)



########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################
 


########################################################
# Getting the AAS for testing and logging purposes
########################################################
$myAzureAnalysisServer = Get-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
if (!$myAzureAnalysisServer)
{
    Write-Error "$($AnalysisServerName) not found in $($ResourceGroupName)"
    return
}
else
{
    Write-Output "Current status of $($AnalysisServerName): $($myAzureAnalysisServer.State)"
}



########################################################
# Pause or Resume AAS
########################################################
# Check for incompatible actions
if (($AasAction -eq "Start" -And $myAzureAnalysisServer.State -eq "Succeeded") -Or ($AasAction -eq "Stop" -And $myAzureAnalysisServer.State -eq "Paused"))
{
    Write-Error "Cannot $($AasAction) $($AnalysisServerName) while the status is $($myAzureAnalysisServer.State)"
    return
}
# Resume Azure Analysis Services
elseif ($AasAction -eq "Start")
{
    Write-Output "Now starting $($AnalysisServerName)"
    $null = Resume-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
}
# Pause Azure Analysis Services
else
{
    Write-Output "Now stopping $($AnalysisServerName)"
    $null = Suspend-AzAnalysisServicesServer -ResourceGroupName $ResourceGroupName -Name $AnalysisServerName
}



########################################################
# Show when finished
########################################################
$Duration = NEW-TIMESPAN -Start $StartDate -End (GET-DATE)
Write-Output "Done in $([int]$Duration.TotalMinutes) minute(s) and $([int]$Duration.Seconds) second(s)"

5) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that you need to fill in the parameters and hit the Start button to execute the script.
Testing your script

6) Scheduling Runbook
To schedule your runbook in Azure Automation you first need to publish it via the Runbook editor. After it has been published you can add a schedule to this runbook.
  • Edit the script in the runbook editor
  • Click on publish (the editor will close and you will be redirected to the overview page)
  • In the overview page click on Link to schedule
  • In the Schedule menu you can select an existing schedule or create a new one
  • In the Parameter menu you can provide the value for the parameters
Add schedule to runbook

Note: If you have multiple Azure Analysis Services that you all want to pause/resume at the same time then you have a slight problem, because you cannot reuse a schedule for the same runbook multiple times with different parameters (please upvote or add a comment). Workarounds:
  1. create multiple identical schedules (ugly but works)
  2. do everything in one big script (less flexible but works)
Log of runbook executions

Summary
In this blog post you learned how to schedule a stop and start for your Azure Analysis Services to save money in case you don't need it to be live 24/7. Scheduling is done in Azure Automation, but with some minor changes you can also do that via an ADF pipeline.

Monday, 9 December 2019

SendGrid - Send emails via a Runbook

Case
I want to send email notifications in my Azure Automation runbook when pausing or downscaling fails, but I don't want to create an extra Office365 account for just sending email notifications. How do you send emails in a PowerShell runbook?
Sending free emails in Azure

Solution
Microsoft added the third-party service SendGrid to the marketplace under 'Software as a Service (SaaS)', which allows you to send a massive 25,000 emails a month for free. Hopefully more than enough for a couple of failure notifications.

In this blog we will show you how to create a SendGrid account in Azure and then show you how to send email notifications via a PowerShell runbook. This could be useful to send notifications when pausing or downscaling an Azure resource in your runbook fails. In another blog post we already showed you how to use SendGrid in Azure Logic Apps in combination with Azure Data Factory to send notifications when the ETL process fails.

Part 1: Create SendGrid Account
The first part of this solution is the same as the previous LogicApps post. You can skip it if you already have a SendGrid account.

1) Create new resource
The first step is to create a SendGrid resource which you can find in the Azure marketplace. SendGrid will send the actual emails.
  1. Go to the Azure Portal and create a new resource
  2. Search the Marketplace for SendGrid or find it under the topic Software as a Service (SaaS)
  3. Select SendGrid and then click on the Create button
  4. Give your new resource a useful name and a secure password
  5. Select the right subscription and Resource Group
  6. Choose the pricing tier you need (F1 is free) and optionally enter a Promotion Code if you expect to send millions of emails a month
  7. Fill out the contact details, which will be sent to SendGrid (Twilio) for support reasons
  8. Review (and accept) the legal terms
  9. Click on the create button
Create Azure SendGrid account

After these first steps a SendGrid resource will be created in Azure, but it will also create a SendGrid account at sendgrid.com. You will also receive an email to activate your account on sendgrid.com. Note that the pricing details can be changed in Azure, but all the other (non-Azure) settings can only be edited on sendgrid.com.

2) Create API key
To send emails via SendGrid we first need the API key. This key can only be generated on the sendgrid.com website.
  1. Go to the Azure SendGrid resource and click on the Manage button on the overview page. This will redirect you to the sendgrid.com website
  2. Go to Settings in the left menu and collapse the sub menu items
  3. Go to the API Keys and click on the Create API Key button.
  4. Then enter a name for your API key, choose which permissions you need and click on the Create and View button.
  5. Copy and save this API key in a password manager. This is the only opportunity you get to see and copy this API key
Retrieve API key

The first part is done. You now have successfully obtained the SendGrid API key which we will use in the second part of this blog.


Part 2: Use SendGrid in Automation Runbooks
Now the coding part of the solution where we will use PowerShell code to send an email.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • Next decide whether you need the Azure Run As account. The first code example does not require it, but the second example (where the SendGrid API key is stored in an Azure Key Vault) does need it. So decide and hit the Create button.
Create Azure Automation Account

2) Create Runbook
Next we need to create a runbook to host the PowerShell code.
  • Go to the newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button

Create Runbook

3) Edit Runbook code
Next edit the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand the code and to think about how to integrate it in your own code.
# PowerShell code

# Hardcode the API key of sendgrid. We need it in the header of the API call
$SENDGRID_API_KEY = "SG.O_g2Bl634FAKEzVOskYInv.8yu_ID2cdxQgoFAKEyG0ByK5xjvPu30DKgOET_n1D7U"

# Create the headers for the API call
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer " + $SENDGRID_API_KEY)
$headers.Add("Content-Type", "application/json")

# Parameters for sending the email
$fromEmailAddress = "noreply@bitools.com"
$destEmailAddress = "joost@bitools.com"
$subject = "Testemail send from runbook via SendGrid"
$content = "Testing 123. This is a test email send via SendGrid"

# Create a JSON message with the parameters from above
$body = @{
personalizations = @(
    @{
        to = @(
                @{
                    email = $destEmailAddress
                }
        )
    }
)
from = @{
    email = $fromEmailAddress
}
subject = $subject
content = @(
    @{
        type = "text/plain"
        value = $content
    }
)
}

# Convert the string into a real JSON-formatted string
# Depth specifies how many levels of contained objects
# are included in the JSON representation. The default
# value is 2
$bodyJson = $body | ConvertTo-Json -Depth 4

# Call the SendGrid RESTful web service and pass the
# headers and json message. More details about the 
# webservice and the format of the JSON message go to
# https://sendgrid.com/docs/api-reference/
$response = Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send -Method Post -Headers $headers -Body $bodyJson

The JSON message that will be sent to the web service looks like this:
{
   "personalizations":[
      {
         "to":[
            {
               "email":"joost@bitools.com"
            }
         ]
      }
   ],
   "from":{
      "email":"no_reply@bitools.com"
   },
   "content":[
      {
         "value":"Testing 123. This is a test email send via SendGrid",
         "type":"text/plain"
      }
   ],
   "subject":"Testemail send from runbook via SendGrid"
}
In the example above the API key is hardcoded, which is avoidable with some extra steps. In a previous blog we showed you how to store a secret in Azure Key Vault and retrieve it with PowerShell. The code example below shows you how to do that, but please read the Key Vault - Runbook post for more details. The first change is the login to Azure and the second change is filling the variable SENDGRID_API_KEY.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Retrieve the API key of sendgrid from the Key Vault. We need it in the header of the API call
$SENDGRID_API_KEY = (Get-AzKeyVaultSecret -VaultName "MyVault" -Name "SendGrid").SecretValueText

# Create the headers for the API call
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer " + $SENDGRID_API_KEY)
$headers.Add("Content-Type", "application/json")

# Parameters for sending the email
$fromEmailAddress = "noreply@bitools.com"
$destEmailAddress = "joost@bitools.com"
$subject = "Testemail send from runbook via SendGrid"
$content = "Testing 123. This is a test email send via SendGrid"

# Create a JSON message with the parameters from above
$body = @{
personalizations = @(
    @{
        to = @(
                @{
                    email = $destEmailAddress
                }
        )
    }
)
from = @{
    email = $fromEmailAddress
}
subject = $subject
content = @(
    @{
        type = "text/plain"
        value = $content
    }
)
}

# Convert the string into a real JSON-formatted string
# Depth specifies how many levels of contained objects
# are included in the JSON representation. The default
# value is 2
$bodyJson = $body | ConvertTo-Json -Depth 4

# Call the SendGrid RESTful web service and pass the
# headers and json message. More details about the 
# webservice and the format of the JSON message go to
# https://sendgrid.com/docs/api-reference/
$response = Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send -Method Post -Headers $headers -Body $bodyJson

4) Testing
Testing the functionality of your code can be done in the runbook editor. Click on the Test pane button above your script. After that just hit the Start button to execute the script. The example doesn't have code that writes to the screen, so you can add that yourself or just wait for the email to arrive in your mailbox. Below is an animation of testing another script.
Testing PowerShell code in a Runbook

Summary
In this post we showed you how to use SendGrid in an Azure Automation runbook with PowerShell code. To integrate it in your own code you probably want to create a function for it (see the sketch below). It is advisable to move the SendGrid API key to an Azure Key Vault to keep your code clean of any passwords or keys.
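
A minimal sketch of such a helper function, built from the exact same headers and JSON message as the example above; the function and parameter names are our own choice:
# PowerShell code
# Reusable helper function around the SendGrid call from the example above
function Send-SendGridEmail
{
    param(
        [Parameter(Mandatory=$true)] [string]$ApiKey,
        [Parameter(Mandatory=$true)] [string]$From,
        [Parameter(Mandatory=$true)] [string]$To,
        [Parameter(Mandatory=$true)] [string]$Subject,
        [Parameter(Mandatory=$true)] [string]$Content
    )

    # Same headers as in the example above
    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Bearer " + $ApiKey)
    $headers.Add("Content-Type", "application/json")

    # Same JSON message as in the example above
    $body = @{
        personalizations = @(@{to = @(@{email = $To})})
        from             = @{email = $From}
        subject          = $Subject
        content          = @(@{type = "text/plain"; value = $Content})
    }

    # Call the SendGrid RESTful web service
    Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send `
                      -Method Post `
                      -Headers $headers `
                      -Body ($body | ConvertTo-Json -Depth 4)
}

# Example call, for instance in the catch block of one of the runbooks above
Send-SendGridEmail -ApiKey $SENDGRID_API_KEY `
                   -From "noreply@bitools.com" `
                   -To "joost@bitools.com" `
                   -Subject "Runbook failure" `
                   -Content "Pausing or downscaling failed, please check the runbook job logs."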

If you have any suggestions to use SendGrid in other Azure services then please leave a comment below.