Monday 4 June 2018

Execute Logic Apps in Azure Data Factory (V2)

Case
In an earlier post, we showed you how to use Azure Logic Apps for extracting email attachments without programming skills. The attachments contain the source files. Because this step is part of a Data Warehouse solution, it would be nice to run it together with the ETL process that needs these source files. How can we achieve this?

Azure Data Factory V2 - Execute Azure Logic App

Solution
In the first few years of Azure, it was not possible to run your Data Warehouse process entirely in the cloud. Of course, you could store the data in Azure SQL Database or Azure SQL Data Warehouse (see here for the differences between these two), but when you were using SQL Server Integration Services (SSIS) you still had to run it on-premises or on a custom Virtual Machine. That changed recently: this post explains how you can execute SQL Server Integration Services (SSIS) packages in Azure, using Azure Data Factory (ADF) V2.

Besides running SSIS packages in ADF V2, you can also execute other Azure services from it. For example: Azure Databricks, Azure Data Lake Analytics (U-SQL scripts) and HDInsight (services like Hadoop, Spark, Hive, etc.).

This post shows you how to execute an Azure Logic App inside ADF V2.

1) Add and configure activity
Create a new pipeline or edit an existing one. Select "General" and choose the Web activity. Give it a suitable name and go to Settings. Fill in the URL, which corresponds to the URL of the HTTP trigger in the Azure Logic App you created earlier:

Azure Logic App - URL in HTTP Trigger

Select "POST" as the Method. Now add a Header and enter the following:
  • KEY: Content-Type
  • VALUE: application/json
When you are finished, click Publish All.

Azure Data Factory V2 - Configure Web Activity

NOTE:
Nowadays the Body is also mandatory. Enter the following: @concat('{','}')
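
For reference, the same configuration can also be deployed from code. Below is a minimal Python sketch using the azure-mgmt-datafactory SDK (a recent, azure-identity based version is assumed); the subscription, resource group, factory, pipeline name and Logic App trigger URL are placeholders you would replace with your own values.

# Minimal sketch: deploy a pipeline with a Web activity that calls a Logic App HTTP trigger.
# Assumptions: azure-identity and azure-mgmt-datafactory are installed; all names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import PipelineResource, WebActivity

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"
logic_app_url = "<logic-app-http-trigger-url>"  # the URL from the HTTP trigger in the Logic App

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Mirrors the portal settings: POST method, Content-Type header and an (empty) JSON body.
web_activity = WebActivity(
    name="Execute Logic App",
    method="POST",
    url=logic_app_url,
    headers={"Content-Type": "application/json"},
    body="@concat('{','}')",
)

pipeline = PipelineResource(activities=[web_activity])
client.pipelines.create_or_update(resource_group, factory_name, "ExecuteLogicApp", pipeline)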

2) Run pipeline
After you have published your pipeline, go to Trigger and select Trigger (Now). You can also run the pipeline without publishing it by using Debug. In this mode you will see the result of the pipeline run at the bottom, under Output.
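
If you prefer to start the published pipeline from code instead of the portal, a run can be created with the same SDK. This is a minimal sketch, using the same placeholder names as above.

# Minimal sketch: start a published pipeline run (the code equivalent of Trigger (Now)).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
run_response = client.pipelines.create_run(
    "<resource-group>", "<data-factory-name>", "ExecuteLogicApp", parameters={}
)
print("Started pipeline run:", run_response.run_id)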

NOTE:
If you do not publish your pipeline, you will get the following error when you try to use Trigger (Now):

Pipeline Error - Use Trigger (now) without publishing

NOTE 2:
If you do not publish your pipeline, you will get the following warning when you try to access the Monitor screen:

Pipeline Warning - Go to Monitor without publishing

3) Result
Once you have triggered the pipeline, go to Monitor in the menu on the left. By default it opens the Pipeline Runs overview, but you can also select the Integration Runtimes or Trigger Runs overview at the top.
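
The same information can be retrieved from code. A minimal sketch, assuming you still have the run_id returned by create_run above:

# Minimal sketch: check the status of a pipeline run (the code equivalent of the Pipeline Runs overview).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
run = client.pipeline_runs.get("<resource-group>", "<data-factory-name>", "<run-id>")
print(run.pipeline_name, run.status, run.run_start, run.run_end)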

You can also check the Runs history of the Logic App:

View Result - Logic App run history
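
If you want to check that run history from code as well, the azure-mgmt-logic package offers a similar option. This is a sketch under the assumption that a recent (azure-identity based) version of the package is installed; the workflow name is a placeholder.

# Minimal sketch: list the most recent runs of the Logic App (the code equivalent of the Runs history blade).
from azure.identity import DefaultAzureCredential
from azure.mgmt.logic import LogicManagementClient

client = LogicManagementClient(DefaultAzureCredential(), "<subscription-id>")
for run in client.workflow_runs.list("<resource-group>", "<logic-app-name>", top=5):
    print(run.name, run.status, run.start_time)
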
Summary
This post explains how you can orchestrate other ETL services, besides SSIS, in your Data Warehouse using a single orchestrator. In this case we execute an Azure Logic App using Azure Data Factory (V2).

Click here to see how you can also execute an SSIS package using Azure Logic Apps.