In an earlier post, we showed you how to use Azure Logic Apps to extract email attachments without programming skills. The attachments contain the source files. Because this step is part of a Data Warehouse solution, it would be nice to run it together with the ETL process that needs these source files. How can we achieve this?
Azure Data Factory V2 - Execute Azure Logic App
Solution
In the first few years of Azure, it was not possible to run your Data Warehouse process entirely in the cloud. Of course, you could store the data in Azure SQL Database or Azure SQL Data Warehouse (see here for the differences between these two), but when you were using SQL Server Integration Services (SSIS) you still had to run it on-premises or on a custom Virtual Machine. Until recently: with Azure Data Factory (ADF) V2 you can now execute SSIS packages in Azure.
Besides running SSIS packages, ADF V2 can also execute other Azure services. For example: Azure Databricks, Azure Data Lake Analytics (U-SQL scripts) and HDInsight (services like Hadoop, Spark and Hive).
This post shows you how to execute an Azure Logic App inside ADF V2.
1) Add and configure activity
Create a new pipeline or edit an existing one. Select "General" and choose the Web activity. Give it a suitable name and go to Settings. Fill in the URL, which corresponds to the one inside the HTTP trigger of the Azure Logic App you created earlier:
Select the "POST" API Method. Now add a Header and enter the following:
Azure Logic App - URL in HTTP Trigger
Select the "POST" API Method. Now add a Header and enter the following:
- KEY: Content-Type
- VALUE: application/json
NOTE:
Nowadays the Body is also mandatory for a POST request (see this MSDN forum thread: https://social.msdn.microsoft.com/Forums/en-US/1d4d01b2-0858-479c-a648-5b718e7724f2/empty-body-web-activity-a-valid-body-is-required-for-put-and-post-request?forum=AzureDataFactory). Enter the following expression to pass an empty JSON body: @concat('{','}')
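If you want to verify the URL, header and body combination before wiring it into the Web activity, you can reproduce the same call outside ADF. Below is a minimal sketch in Python using the requests package; the Logic App trigger URL is a placeholder you need to replace with your own, and it sends exactly what the Web activity will send.

import requests

# Placeholder: paste the URL from the Logic App's HTTP trigger here
logic_app_url = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=...&sv=...&sig=..."

# Same header and empty JSON body the Web activity will use
response = requests.post(
    logic_app_url,
    headers={"Content-Type": "application/json"},
    data="{}",  # equivalent of @concat('{','}') in ADF
)

# A 2xx status code means the Logic App was triggered
print(response.status_code)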
2) Run pipeline
After you have published your pipeline, go to Trigger and select Trigger (Now). You can also run the pipeline without publishing it by using Debug. In that mode you will see the result of the pipeline run at the bottom, in the Output pane.
NOTE:
If you do not publish your pipeline, you will get the following error when you try to use Trigger (Now):
Pipeline Error - Use Trigger (now) without publishing
NOTE 2:
If you do not publish your pipeline, you will get the following warning when you try to access the Monitor screen:
Pipeline Warning - Go to Monitor without publishing
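Trigger (Now) and Debug are the portal options, but the same pipeline run can also be started from code, for example as part of a scheduled job. The sketch below is a Python example assuming the azure-identity and azure-mgmt-datafactory packages (a recent version that accepts azure-identity credentials); the subscription, resource group, data factory and pipeline names are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholders: replace with your own values
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"
pipeline_name = "<pipeline-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Equivalent of Trigger (Now) in the portal: starts a single pipeline run
run = client.pipelines.create_run(resource_group, factory_name, pipeline_name)
print("Started pipeline run:", run.run_id)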
3) Result
Once you have triggered the pipeline, go to Monitor in the menu on the left. By default it opens the Pipeline Runs overview, but you can also select the Integration Runtimes or Trigger Runs overview at the top.
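The Monitor screen is the easiest way to follow the run, but the same information is available through the SDK. A minimal Python sketch, reusing the client, names and run_id from the previous snippet, to poll the status of a single pipeline run:

import time

# Reuses client, resource_group, factory_name and run from the previous snippet
while True:
    pipeline_run = client.pipeline_runs.get(resource_group, factory_name, run.run_id)
    print("Status:", pipeline_run.status)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)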
You can also check the run history of the Logic App itself:
View Result - Logic App run history
Summary
This post explains how you can orchestrate other ETL services, next to SSIS, in your Data Warehouse solution using a single orchestrator. In this case we executed an Azure Logic App from Azure Data Factory (V2).
Click here to see how you can also execute an SSIS package using Azure Logic Apps.