
Sunday, 14 November 2021

ADF Release - Update Linked Service while deploying

Case
I'm deploying Azure Data Factory via DevOps pipelines through my DTAP environment. During the deployment I want to change the URL of the Linked Service from Azure Key Vault to point it to the Key Vault of that specific environment. How do I change that Linked Service in DevOps?
ADF Linked Service

Solution
This is possible by changing the ARM template parameter definition, which in turn exposes certain properties as overridable parameters during deployment. There is one downside: you cannot create a parameter for one specific Linked Service, because it will work for all Linked Services with that same property. However, you can narrow it down to one particular type of Linked Service (all Key Vaults in this example).

1) Check Linked Service
For this example we will override the URL of the Azure Key Vault Linked Service, but first we need to find the actual property name that we want to override.
  • In ADF Studio go to Manage (toolbox icon in left menu) and then to Linked Services.
  • Now find your Key Vault Linked Service, hover your mouse over it and click on the code icon {}.
  • Now check which property identifies a specific Key Vault. It should be within the typeProperties tag. In this case the baseUrl property contains a URL that points to one specific Key Vault (see the JSON sketch below).
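
The JSON behind such a Key Vault Linked Service might look like the sketch below (the name ls_kv_bitools and the URL are hypothetical placeholders):
        {
            "name": "ls_kv_bitools",
            "properties": {
                "annotations": [],
                "type": "AzureKeyVault",
                "typeProperties": {
                    "baseUrl": "https://bitools-dev.vault.azure.net/"
                }
            }
        }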
baseURL property

Also notice the type property which will be used further on: AzureKeyVault


2) ARM template
Next step is to make this property overridable in the ARM template parameter definition (arm-template-parameters-definition.json). First the easiest way:
  • Under the same Manage menu item as step 1 go to ARM template
  • Click on Edit parameter configuration
  • Now find the baseUrl property within the Linked Services tag and change its value from "=" to "-" and then click on the OK button
Making baseUrl overridable

This will create a new parameter with the name: [LinkedServiceName]_properties_typeProperties_baseUrl
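
In the generated ARM template this parameter could, for a Linked Service called ls_kv_bitools (the same hypothetical name as in the YAML example of step 3), look like this:
        "ls_kv_bitools_properties_typeProperties_baseUrl": {
            "type": "string",
            "defaultValue": "https://bitools-dev.vault.azure.net/"
        }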

As mentioned before, this will now work for all Linked Services that have a (filled) property called baseUrl. A bit nicer is to instead create a Key Vault specific parameter by adding a piece of JSON code below the general tag with the *. The name 'AzureKeyVault' in the code below is the type property you saw in the code of step 1.
 
        "AzureKeyVault": {
            "properties": {
                "typeProperties": {
                    "baseUrl": "-"
                }
            }
        },

This will result in the same (long) parameter name, but now only for Linked Services pointing to Azure Key Vault. You can shorten that very long parameter name by adding -:-BaseUrl, where the colon : is the separator for the next part of the property: the name of the parameter. Adding a minus - in front of that name removes _properties_typeProperties from the parameter name and shortens it to:
[LinkedServiceName]_baseUrl 
        "AzureKeyVault": {
            "properties": {
                "typeProperties": {
                    "baseUrl": "-:-BaseUrl"
                }
            }
        }
This is much nicer. More info about this can be found in the documentation.

3) Adjust release pipeline
If you are using YAML to publish the changes, the only thing you have to change is the overrideParameters property: add the new parameter ls_kv_bitools_baseUrl with either a variable or a hardcoded value. The > behind the property lets you break the string over multiple lines and keeps the YAML code more readable.
          ###################################
          # Deploy ADF Artifact
          ###################################
          - task: AzureResourceManagerTemplateDeployment@3
            displayName: '4 Deploy ADF Artifact'
            inputs:
              deploymentScope: 'Resource Group'
              azureResourceManagerConnection: 'sc_mcacc-adf-devopssp'
              subscriptionId: $(DataFactorySubscriptionId)
              action: 'Create Or Update Resource Group'
              resourceGroupName: $(DataFactoryResourceGroupName)
              location: 'West Europe'
              templateLocation: 'Linked artifact'
              csmFile: '$(Pipeline.Workspace)/ArmTemplatesArtifact/ARMTemplateForFactory.json'
              csmParametersFile: '$(Pipeline.Workspace)/ArmTemplatesArtifact/ARMTemplateParametersForFactory.json'
              overrideParameters: > 
                -factoryName $(DataFactoryName) 
                -ls_kv_bitools_baseUrl "https://bitools-prd.vault.azure.net/"
              deploymentMode: 'Incremental'
            env:
                SYSTEM_ACCESSTOKEN: $(System.AccessToken)
If you are not sure which parameter names you can use in the YAML, you can look them up by exporting the ARM template under the same ARM template menu item. Then check arm_template.json or arm_template_parameters.json.
Export template to find the parameter name

And if you're using the classic Release pipelines with the ARM template deployment task, then you can just go to the Override template parameters property, click on the edit button and replace the value with a new value or a variable from a variable group.
ARM template deployment - Override template parameters

Conclusion
In this post you learned how to override properties of a Linked Service during deployment via Azure DevOps. This allows you to point your Linked Service to the correct service for each specific environment. The most likely candidate is the Linked Service pointing to Azure Key Vault, where you store all other connection details.

In a previous post we showed you how to change Global Parameters during deployment, and in an upcoming post we will show you how to change triggers during deployment, because you probably don't want to use the same triggers on development, test, acceptance and production.



Friday, 1 May 2020

Azure Data Factory - Use Key Vault Secret in pipeline

Case
I want to use secrets from Azure Key Vault in my Azure Data Factory (ADF) pipeline, but only certain properties of the Linked Services can be filled by secrets of a Key Vault. Is it possible to retrieve Key Vault secrets in the ADF pipeline itself?

Using Azure Key Vault for Azure Data Factory

Solution
Yes, the easiest way is to use a Web activity with a REST API call to retrieve the secret from Key Vault. The documentation is a little limited and only shows how to retrieve a specific version of the secret via the secret identifier (a guid). This is not a workable solution for two reasons:
  1. The guid changes each time you change the secret. In probably 99% of the cases you just want the latest version of the secret, which means you would also need to change that guid in ADF each time you change the secret.
  2. The guid differs for each environment of your Key Vault (dev, test, acceptance and production). This makes it hard to use this solution in a multi-environment ADF setup.
For this example we will misuse Key Vault a little bit as a configuration table and retrieve the REST API URL of a database from the Key Vault. The example assumes you already have a Key Vault filled with secrets. If you don't have that, then execute the first two steps of this post.

1) Access policies
First step is to give ADF access to the Key Vault to read its content. You can now find ADF by its name, so you don't have to search for its managed identity guid, but using that guid is also still possible.
  • Go to Access policies in the left menu of your Key Vault
  • Click on the blue + Add Access Policy link
  • Leave Configure from template empty
  • Leave Key permissions unselected (we will only use a Secret for this example)
  • Select Get for Secret permissions
  • Leave Certificate permissions unselected (we will only use a Secret for this example)
  • Click on the field of Select principal to find the name of your Azure Data Factory
  • Leave Authorized application unchanged
  • Click on Add and a new Application will appear in the list of Current Access Policies
Add Access policy

Note: for this specific example we do not need to create a Key Vault Linked Service in ADF.

2) Determine URL of secret
To retrieve a secret we need the REST API URL of that secret. This URL is constructed as
https://{Name Keyvault}.vault.azure.net/secrets/{SecretName}?api-version=7.0

{Name Keyvault} : the name of the Key Vault you are using
{SecretName} : the name of the secret

In this example the secret name is "SQLServerURL" and the URL should look like this: https://{Name Keyvault}.vault.azure.net/secrets/SQLServerURL?api-version=7.0
Get the SecretName from Key Vault

3) Web activity
Next we have to add a Web activity to the ADF pipeline. Use the following settings on the Settings tab (a JSON sketch of the resulting activity follows the list).
  • Set URL to https://{Name Keyvault}.vault.azure.net/secrets/SQLServerURL?api-version=7.0
  • Set Method to Get
  • Under Advanced select MSI
  • And set the resource to https://vault.azure.net
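
For reference, the JSON definition of such a Web activity could look like the sketch below; the activity name GetURL matches the expression used in step 4, while the Key Vault name bitools is just a placeholder:
        {
            "name": "GetURL",
            "type": "WebActivity",
            "typeProperties": {
                "url": "https://bitools.vault.azure.net/secrets/SQLServerURL?api-version=7.0",
                "method": "GET",
                "authentication": {
                    "type": "MSI",
                    "resource": "https://vault.azure.net"
                }
            }
        }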

Configuring the Web activity to retrieve the secret

4) Retrieve value
Now we want to use the secret from the Key Vault in a successive activity, in this case another Web activity to upscale a database. In the URL property of this activity we now use the output value from the previous Web activity.
@activity('GetURL').output.value
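
This works because the Web activity returns the JSON response of the Key Vault REST API. Trimmed down, and with hypothetical values, that response looks something like the sketch below; the expression above simply picks its value field.
        {
            "value": "https://myserver.database.windows.net/",
            "id": "https://bitools.vault.azure.net/secrets/SQLServerURL/0123456789abcdef",
            "attributes": {
                "enabled": true
            }
        }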
Retrieve output value via expression

5) The result
To check the result of the changes we need to execute the pipeline.
Execute the pipeline to check the result

Note: if you are using this to retrieve real secrets like passwords and you don't want them to show up in the logging of Azure Data Factory, then check the Secure output property on the General tab of your activity.
Don't show secret in logging

Conclusion
In this blog post you learned how to retrieve Key Vault secrets in ADF. The trick is to retrieve them by their name instead of by their version guid. This always gives you the latest version and allows you to use this construction in multiple environments.

Update: ADF now supports Global parameters to store parameters that can be used by all pipelines.

Thursday, 20 February 2020

Use Azure Key Vault for Azure Databricks

Case
I need to use some passwords and keys in my Databricks notebook, but for security reasons I don't want to store them in the notebook. How do I prevent storing sensitive data in Databricks?
Using Azure Key Vault for Azure Databricks

Solution
Let's say you want to connect to an Azure SQL Database with SQL Authentication or an Azure Blob Storage container with an Access key in Databricks. Instead of storing the password or key in the notebook in plain text, we will store it in an Azure Key Vault as a secret. With an extra line of code we will retrieve the secret and use its value for the connection.

The example below will show all individual steps in detail including creating an Azure Key Vault, but assumes you already have an Azure Databricks notebook and a cluster to run its code. The steps to give Databricks access to the Key Vault slightly deviate from Azure Data Factory or Azure Automation Runbook, because the access policy is set from within Databricks itself.

1) Create Key Vault
First step is creating a key vault. If you already have one then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for key vault
  • Select Key Vault and click on Create
  • Select your Subscription and Resource Group 
  • Choose a useful name for the Key Vault
  • Select your Region (the same as your other resources)
  • And choose the Pricing tier. We will use Standard for this demo
Creating a new Key Vault

2) Add Secret
Now that we have a Key Vault we can add the password from the SQL Server user or the key from the Azure Storage account. The Key Vault stores three types of items: Secrets, Keys and Certificates. For passwords, account keys or connection strings you need the Secret.
  • Go to the newly created Azure Key Vault
  • Go to Secrets in the left menu
  • Click on the Generate/Import button to create a new secret
  • Choose Manual in the upload options
  • Enter a recognizable and descriptive name. You will later on use this name in Databricks
  • Next step is to add the secret value which we will retrieve in Databricks
  • Keep Content type Empty and don't use the activation or expiration date for this example
  • Make sure the secret is enabled and then click on the Create button
Adding a new secret to Azure Key Vault

3) Create Secret Scope
Instead of adding Databricks via the Access policies in the Key Vault, we will use a special page in the Databricks workspace editor which creates an Application in the Azure Active Directory and gives that application access within the Key Vault. At the time of writing this feature is still in Public Preview, so the layout could still change, and it will probably become a menu item once it reaches GA.
  • Go to your Azure Databricks overview page in the Azure portal
  • Click on the Launch workspace button in the middle (a new tab will be opened)
  • Change the URL of the new tab by adding '#secrets/createScope' after the URL
    Example
    https://westeurope.azuredatabricks.net/?o=1234567890123456
    Becomes
    https://westeurope.azuredatabricks.net/?o=1234567890123456#secrets/createScope
Find 'Create Secret Scope' form

Next step is to fill in the 'Create Secret Scope' form. This will connect Databricks to the Key Vault.
  • Fill in the name of your secret scope. It should be unique within the workspace and will be used in code to retrieve the secret from Key Vault.
  • The Manage Principal for the premium tier can either be Creator (secret scope only for you) or All Users (secret scope for all users within the workspace). For the standard tier you can only choose All Users.
  • The DNS name is the URL of your Key Vault which can be found on the Overview page of your Azure Key Vault which looks like: https://bitools.vault.azure.net/
  • The Resource ID is a path that points to your Azure Key Vault. Within the following path replace the three parts within the brackets:
    /subscriptions/[1.your-subscription]/resourcegroups/[2.resourcegroup_of_keyvault]/providers/Microsoft.KeyVault/vaults/[3.name_of_keyvault]
    1. The guid of your subscription
    2. The name of the resource group that hosts your Key Vault
    3. The name of your Key Vault
      Tip: if you go to your Key Vault in the Azure portal then this path is part of the URL which you could copy
  • Click on the Create button and when creation has finished click on the Ok button
Create Secret Scope

4) Verify Access policies
Next you want to verify the rights of Databricks in the Key Vault and probably restrict some options, because by default it gets a lot of permissions.
  1. Go to your Key Vault in the Azure Portal
  2. Go to Access policies in the left menu
  3. Locate the application with Databricks in its name
  4. Check which permissions you need. When using only secrets, Get and List for secrets is probably enough.
Verify permissions of Databricks in Azure Key Vault

5) Scala code
Now it is time to retrieve the secrets from the Key Vault in your notebook with Scala code (Python code in the next step). The first snippet is for when you have forgotten the name of your secret scope or want to know which scopes are available in your workspace.
// Scala code

// Get list of all scopes
val mysecrets = dbutils.secrets.listScopes()
// Loop through list
mysecrets.foreach { println }
Scala code to get secret scopes

If you want to get the value of one secret you can execute the following code. Note that the value will not be shown in your notebook execution result.
// Scala code

dbutils.secrets.get(scope = "bitools_secrets", key = "blobkey")
Scala code to retrieve a secret from the Azure Key Vault

And if you want to use that code to retrieve the key from your blob storage account and get a list of files, you can combine it in the following code (a one-liner to actually list the files follows below). The name of the storage account is 'bitools2'.
// Scala code

dbutils.fs.mount(
  source = "wasbs://sensordata@bitools2.blob.core.windows.net",
  mountPoint = "/mnt/bitools2",
  extraConfigs = Map("fs.azure.account.key.bitools2.blob.core.windows.net" -> dbutils.secrets.get(scope = "bitools_secrets", key = "blobkey")))
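
The mount statement itself does not show any files yet. Assuming the mount succeeded, you can list the files on the mounted storage with a one-liner:
// Scala code

// List the files on the mounted storage
display(dbutils.fs.ls("/mnt/bitools2"))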

Scala code to mount Storage and get list of files

6) Python code
Now it is time to retrieve the secrets from the Key Vault in your notebook with Python code. The first snippet is for when you have forgotten the name of your secret scope or want to know which scopes are available in your workspace.
# Python code

# Get list of all scopes
mysecrets = dbutils.secrets.listScopes()
# Loop through list
for secret in mysecrets:
  print(secret.name)
Python code to get secret scopes

If you want to get the value of one secret you can execute the following code. Note that the value will not be shown in your notebook execution result.
# Python code

dbutils.secrets.get(scope = "bitools_secrets", key = "blobkey")
Python code to retrieve secret from the Azure Key Vault

And if you want to use that code to retrieve the key from your blob storage account and get a list of files, you can combine it in the following code (a one-liner to actually list the files follows below). The name of the storage account is 'bitools2'.
# Python code

dbutils.fs.mount(
  source = "wasbs://sensordata@bitools2.blob.core.windows.net",
  mount_point = "/mnt/bitools2a",
  extra_configs = {"fs.azure.account.key.bitools2.blob.core.windows.net":dbutils.secrets.get(scope = "bitools_secrets", key = "blobkey")})
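
Again assuming the mount succeeded, you can list the files on the mounted storage with a one-liner:
# Python code

# List the files on the mounted storage
display(dbutils.fs.ls("/mnt/bitools2a"))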

Python code to mount Storage and get list of files

Conclusion
In this post you learned how to store sensitive data for your data preparation in Databricks the right way, by creating a Key Vault and using it in your notebook. The feature is still in public preview, which probably means the layout will slightly change before going GA, but the functionality will most likely stay the same. Another point of attention is that you don't have any influence on the name of the Databricks application in the AAD or on the default permissions it gets in the Key Vault.

In previous posts we also showed you how to use the same Key Vault in an Azure Data Factory and an Azure Automation Runbook to avoid hardcoded passwords and keys. In a future post we will show you how to use it in other tools like Azure Functions.

Sunday, 16 February 2020

Use Azure Key Vault for Azure Data Factory

Case
I need to use some passwords and keys in my Azure Data Factory (ADF) pipelines, but for security reasons I don't want to store them in ADF. How do I prevent storing sensitive data in Azure Data Factory?
Using Azure Key Vault for Azure Data Factory

Solution
Let's say you want to connect to a database in ADF with SQL Authentication. Instead of saving the password in a Linked Services connection of ADF, you can store that password in an Azure Key Vault as a secret. Then you give ADF read rights (GET Secret) to that Key Vault to retrieve the password stored in the secret. In the Linked Services connection you configure which Key Vault to use and the name of your secret.

The example below will show all individual steps in detail including creating an Azure Key Vault, but assumes you already have an Azure Data Factory with pipelines.

1) Create Key Vault
First step is creating a key vault. If you already have one then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for key vault
  • Select Key Vault and click on Create
  • Select your Subscription and Resource Group 
  • Choose a useful name for the Key Vault
  • Select your Region (the same as your other resources)
  • And choose the Pricing tier. We will use Standard for this demo
Creating a new Key Vault

2) Add Secret
Now that we have a Key Vault we can add the password from the SQL Server user. The Key Vault stores three types of items: Secrets, Keys and Certificates. For passwords, account keys or connection strings you need the Secret.
  • Go to the newly created Azure Key Vault
  • Go to Secrets in the left menu
  • Click on the Generate/Import button to create a new secret
  • Choose Manual in the upload options
  • Enter a recognizable and descriptive name. You will later on use this name in ADF
  • Next step is to add the secret value which we will retrieve in ADF
  • Keep Content type Empty and don't use the activation or expiration date for this example
  • Make sure the secret is enabled and then click on the Create button
Adding a new secret to Azure Key Vault

3) Access policies
Now we have to give ADF access to the Key Vault to read its content. You can now find ADF by its name, so you don't have to search for its managed identity guid, but using that guid is also still possible.
  1. Go to Access policies in the left menu
  2. Click on the blue + Add Access Policy link
  3. Leave Configure from template empty
  4. Leave Key permissions unselected (we will only use a Secret for this example)
  5. Select Get for Secret permissions
  6. Leave Certificate permissions unselected (we will only use a Secret for this example)
  7. Click on the field of Select principal to find the name of your Azure Data Factory
  8. Leave Authorized application unchanged
  9. Click on Add and a new Application will appear in the list of Current Access Policies
Add Access policy

If you want to use the Managed Identity GUID then first go to ADF and click on Properties in the left menu. There you will find the Managed Identity Object ID that you can use to search for principals under the access policies.
The Managed Identity of ADF

4) Create Linked Service connection to Azure Key Vault
Now we need to let ADF know about your new Azure Key Vault by adding an extra Linked Service connection to your Key Vault.
  1. Go to ADF and open the Author & Monitor editor
  2. Within the new tab go to the Author section (Pencil icon) and click on connections to see all Linked Services
  3. Add a new Linked Service by clicking the + New button
  4. Search for (Azure) Key Vault and click on continue to enter all connection details
  5. First enter a descriptive name and optionally a description
  6. Select your subscription
  7. Select your newly created Key Vault
  8. Test the connection and if successful click on Create
Add Key Vault Linked Service

5) Create Linked Service connection
Last step is creating the SQL Server Linked Service connection in ADF. In this example we use a SQL Server connection, but the same principle works for all connections in ADF. Note that you can also retrieve the entire connection string from the Key Vault, but for this example we only retrieve the password secret. A JSON sketch of the resulting Linked Service follows the steps below.

Before you start, make sure the Linked Service connection to the Key Vault has been published. Otherwise you get an error message when hitting the Create button.
  1. Add a new Linked Service by clicking the + New button
  2. Search for (Azure) SQL Database and click on continue to enter all connection details
  3. First enter a descriptive name and a description
  4. Then use Connection string (not Azure Key Vault) to select your server and database
  5. After that use Azure Key Vault (not Password) to retrieve the password
  6. Select the Key Vault from the previous step
  7. Enter the name of the secret (that you created in step 2)
  8. Test the connection and if successful hit the create button
  9. Now you can use this Linked Services connection in your pipelines
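
For reference, the resulting Linked Service JSON could look something like the sketch below; all names (Linked Services, server, database, user and secret) are hypothetical placeholders:
        {
            "name": "ls_sqldb_bitools",
            "properties": {
                "type": "AzureSqlDatabase",
                "typeProperties": {
                    "connectionString": "Data Source=bitools.database.windows.net;Initial Catalog=bitools;User ID=myuser",
                    "password": {
                        "type": "AzureKeyVaultSecret",
                        "store": {
                            "referenceName": "ls_kv_bitools",
                            "type": "LinkedServiceReference"
                        },
                        "secretName": "SQLServerPassword"
                    }
                }
            }
        }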
Create connection and retrieve password via Key Vault

Conclusion
In this post you learned how to store sensitive data for your ETL the right way, by creating a Key Vault and using it in Azure Data Factory. An important step is to give ADF the rights to retrieve secrets from the Key Vault, either via its name or via the Managed Identity.

In a previous post we also showed you how to use the same Key Vault in an Azure Automation Runbook to avoid passwords and keys in the PowerShell code. In a future post we will show you how to use it in other tools like Azure Databricks or Azure Functions.

It is also possible to use Key Vault secrets in the pipelines instead of in the Linked Services, but this is explained in a separate post.


Thursday, 5 December 2019

Use Azure Key Vault for Automation Runbooks

Case
I created an Azure Automation Runbook to manage various Azure resources, but I don't want to store keys or passwords in my code. What is the secure way to manage keys and passwords in a Runbook?
Azure Automation Runbook with Azure Key Vault

Solution
The answer is by using the Azure Key Vault. You can store your secrets in the Key Vault and then give the account running the Runbook the appropriate rights to retrieve them with a script.

1) Create Automation Account
First we need to create an Automation Account. If you already have one with the Run As Account enabled then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for automation
  • Select Automation Account
  • Choose a useful name for the Automation Account
  • Select your Subscription, Resource Group and the Region
  • For this example we will use the Azure Run As account. So make sure to enable it and then click on the Create button.
Create Azure Automation Account

2) Add Module Az.KeyVault
Before we start writing code we need to add a PowerShell module called Az.KeyVault. This module contains methods we need in our code to retrieve secrets from the Azure Key Vault. But first we need to add Az.Accounts because Az.KeyVault depends on it.

If you forget this step you will get error messages while running your code stating that some of your commands are not recognized:
Get-AzKeyVaultSecret : The term 'Get-AzKeyVaultSecret' is not recognized as the name of a cmdlet, function, script 
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct 
and try again.
  • Go to the newly created Azure Automation Account
  • Go to Modules in the left menu
  • Click on the Browse Gallery button
  • Search for Az.Accounts
  • Click on Az.Accounts in the result and import this module
  • Also search for Az.KeyVault (but wait until Az.Accounts is actually imported)
  • Click on Az.KeyVault in the result and import this module
Adding a new module to your Automation Account

3) Get account name
We need to determine the name of the Azure Run As account because we have to give this account access to the secrets inside the Azure Key Vault. The name is usually the name of the Automation Account followed by a string of random chars.
  • Locate Connections in the left menu of the Automation account and click on it.
  • Click on the row containing 'AzureRunAsConnection'
  • A new pane with details will appear. You need to remember the Guid of the ApplicationId.

Find ApplicationId

  • Search for App registrations in the Azure search bar on the top of the Azure dashboard
  • Search for the App registration with the same Application (client) ID as above
  • Remember the Display name of this account for later on
Find account name with ApplicationId

4) Create Key Vault
Next step is creating a key vault. If you already have one then you can skip this step.
  • Go to the Azure portal and create a new resource
  • Search for key vault
  • Select Key Vault and click on Create
  • Select your Subscription and Resource Group 
  • Choose a useful name for the Key Vault
  • Select your Region (the same as your other resources)
  • And choose the Pricing tier. We will use Standard for this demo
Creating a new Key Vault

5) Add Secret
Now that we have a Key Vault we have to put in a password for testing. The Key Vault stores three types of items: Secrets, Keys and Certificates. For passwords, account keys or connection strings you need the Secret.
  • Go to the newly created Azure Key Vault
  • Go to Secrets in the left menu
  • Click on the Generate/Import button to create a new secret
  • Choose Manual in the upload options
  • Enter a recognizable and descriptive name. You will use this name in the runbook code
  • Next step is to add the secret value which we will retrieve in the code
  • Keep Content type Empty and don't use the activation or expiration date for this example
  • Make sure the secret is enabled and then click on the Create button
Adding a new secret to Azure Key Vault

6) Access policies
By default the Run As Account can only see the Azure Key Vault, but it can't read its content. In step 3 you retrieved the name of the Azure Run As Account. Now we will give it access to read the secrets.
  1. Go to Access policies in the left menu
  2. Click on the blue + Add Access Policy link
  3. Leave Configure from template empty
  4. Leave Key permissions unselected (we will only use a Secret for this example)
  5. Select Get for Secret permissions
  6. Leave Certificate permissions unselected (we will only use a Secret for this example)
  7. Click on the field of Select principal to find the account from step 3
  8. Leave Authorized application unchanged
  9. Click on Add and a new Application will appear in the list of Current Access Policies
Add Access policy

7) Create Runbook
Now we are finally ready to create a runbook in the Azure Automation Account and start writing some PowerShell code.
  • Go back to the overview page of your newly created Azure Automation Account
  • Click on Runbooks in the left menu
  • Click on the + Create a runbook button to create a new Runbook
  • Enter a descriptive name for the Runbook
  • Select PowerShell as Runbook type
  • Optionally add a description and click on the Create button

Create Runbook

8) Edit Runbook code
Next open the new Runbook if it wasn't already opened by the previous step. Copy the code below and paste it in the editor. Then study the code and its comments to understand it and to make sure we don't steal your data. The first part, logging in to Azure, is described in this previous post.
# PowerShell code
########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Variables for retrieving the correct secret from the correct vault
$VaultName = "bitools"
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"

Hardcoding the Key Vault name is not very flexible. You could also pass it from Azure Data Factory by changing it into a parameter, which is especially useful when you have multiple environments (Development, Test, Acceptance and Production). Note that the Param block must be the very first statement in the script, so it goes above the login code.
# PowerShell code
########################################################
# PARAMETERS (must be the first statement in the script)
########################################################
Param
(
    # WebhookData contains the parameters passed from
    # the Data Factory Web Activity via the webhook
    [Parameter(Mandatory=$False,Position=1)]
    [object] $WebhookData
)

########################################################
# Log in to Azure with AZ (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
 
# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try
{
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName       
  
    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint 
}
catch 
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account' 
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}
########################################################

# Get all parameters from body (passed from Data Factory Web Activity)
$Parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)

# Variables for retrieving the correct secret from the correct vault
$VaultName = $Parameters.VaultName
$SecretName = "MyPassword"

# Retrieve value from Key Vault
$MySecretValue = (Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName).SecretValueText

# Write value to screen for testing purposes
Write-Output "The value of my secret is $($MySecretValue)"
The parameters that will be provided by Azure Data Factory (ADF) via a JSON message will look like this:
{
    "VaultName": "bitools"
}

9) Testing
Testing the functionality of your code is easiest if you still have the hardcoded parameters. Then you can just use the Test pane in the Runbook editor, like in the animation below. If you want to test it with the parameters from ADF, then you first need to create a webhook and then create and run an ADF pipeline to test your code.
Testing the code

Summary
In this post you learned how to use the Azure Key Vault in an Azure Automation Runbook. The code to retrieve the secret is very simple (only one command), but giving the right Automation Run As account the correct access takes a bit more research/fiddling around. In future posts we will show you how to use Azure Key Vault in combination with other tools like Azure Data Factory, Azure Databricks or Azure Functions.