
Sunday 1 December 2019

SendGrid - Sending emails via ADF & LogicApps

Case
I want to send email notifications when my ETL process fails, but I don't want to create an extra Office365 account for just sending email notifications.
Sending free emails in Azure

Solution
Microsoft added the third-party service SendGrid to the marketplace under 'Software as a Service (SaaS)', which allows you to send up to 25,000 emails a month for free. Hopefully more than enough for a couple of ETL failure notifications.

In this blog we will show you how to create a SendGrid account in Azure and then show you how to send email notifications via ADF, Logic Apps and SendGrid. In a second post we will show that you can also use SendGrid in an Azure Automation Runbook to send notifications about upscaling, downscaling, pausing or resuming services.


Part 1: Create SendGrid Account

1) Create new resource
The first step is to create a SendGrid resource which you can find in the Azure marketplace.
  1. Go to the Azure Portal and create a new resource
  2. Search in the Marketplace for SendGrid or find it under the topic Software as a Service (SaaS)
  3. Select SendGrid and then click on the Create button
  4. Give your new resource a useful name and a secure password
  5. Select the right Subscription and Resource Group
  6. Choose the pricing tier you need (F1 is free) and optionally enter a Promotion Code if you expect to send millions of emails a month
  7. Fill out the contact details which will be sent to SendGrid (Twilio) for support reasons
  8. Review (and accept) the legal terms
  9. Click on the Create button
Create Azure SendGrid account

After these first steps a SendGrid resource will be created in Azure, but this also creates a SendGrid account at sendgrid.com. You will receive an email to activate that account. Note that the pricing details can be changed in Azure, but all other (non-Azure) settings can only be edited on sendgrid.com.

2) Create API key
To send emails via SendGrid we first need an API key. This key can only be generated on the sendgrid.com website.
  1. Go to the Azure SendGrid resource and click on the Manage button on the overview page. This will redirect you to the sendgrid.com website
  2. Go to Settings in the left menu and expand the sub menu items
  3. Go to API Keys and click on the Create API Key button
  4. Enter a name for your API key, choose which permissions you need and click on the Create and View button
  5. Copy and save this API key in a password manager. This is the only chance you get to retrieve this API key
Retrieve API key

The first part is done. You now have successfully obtained the SendGrid API key which we will use in the second part of this blog.
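
Before building the Logic App you can verify that the key works by calling the SendGrid v3 Mail Send API directly. Below is a minimal sketch in Python (standard library only); the API key and both email addresses are placeholders that you must replace, and the from-address should be a sender your SendGrid account accepts.

import json
import urllib.request

# Placeholders: replace with your own API key and email addresses
API_KEY = "SG.your-api-key"

payload = {
    "personalizations": [{"to": [{"email": "recipient@example.com"}]}],
    "from": {"email": "sender@example.com"},
    "subject": "SendGrid API key test",
    "content": [{"type": "text/plain", "value": "The API key works."}],
}

request = urllib.request.Request(
    "https://api.sendgrid.com/v3/mail/send",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer " + API_KEY,
             "Content-Type": "application/json"},
    method="POST",
)

# SendGrid answers with HTTP 202 (Accepted) when the email is queued
with urllib.request.urlopen(request) as response:
    print(response.status)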


Part 2: Use SendGrid in LogicApps
In this previous post we showed you how to send an email notification in Azure Data Factory (ADF) via Logic Apps. That solution consists of two parts: ADF (error handling) and Logic Apps (sending the email). For this post we will only slightly change the Logic Apps part of that solution: instead of the Office 365 Outlook - Send an email action we will use the SendGrid - Send email action.
Replace Office 365 Outlook by SendGrid

The communication between these two Azure resources is done with a JSON message via an HTTP POST request. The JSON message contains the name of the Data Factory and of the pipeline that failed, an error message and the email address of the recipient:

{
    "properties": {
        "DataFactoryName": {
            "type": "string"
        },
        "PipelineName": {
            "type": "string"
        },
        "ErrorMessage": {
            "type": "string"
        },
        "EmailTo": {
            "type": "string"
        }
    },
    "type": "object"
}

1) Create new Logic App
Let's create an Azure Logic App that receives parameters from ADF and sends an email using these parameters.
  1. Click on Create a resource on the Azure portal, search for Logic App, select it and hit the Create button. You can also locate Logic App under Integration in the Azure Marketplace.
  2. Pick a descriptive name like "ADF-Notifications"
  3. Then select the Subscription, Resource Group and Location
  4. Now hit the Create button and wait for it to be generated
Create new Logic App

2) HTTP Trigger
The next step is to edit the newly created Logic App and pick a trigger. This is the event that starts the Logic App that sends the emails. To call it from the ADF Web(hook) activity we need an HTTP trigger.
  1. Go to your new Logic App and edit it
  2. Pick the HTTP trigger When a HTTP request is received
  3. Edit the trigger and copy and paste the JSON message from above into the big text-area. 
Adding the HTTP trigger

Adding the JSON message to the HTTP trigger will generate new variables, which the next action will use to compose the email message.

3) Send an email
This next step deviates from the previous post and adds the SendGrid action to send emails. At the moment of writing there is a v3 and a v4 preview version. For important processes you probably should not choose the preview version.
  1. Add a New Step
  2. Search for SendGrid and select it
  3. Choose the Send email action
  4. Give the connection a name, paste the SendGrid API key and click on Create
  5. The From-field is a hardcoded email address in this example
  6. The To-field comes from the variable 'EmailTo' generated by the JSON message in the HTTP trigger
  7. The Subject is a concatenation of the Data Factory name and the pipeline name
  8. The Body comes from the 'ErrorMessage' variable
  9. Save the Logic App. The first save action will reveal the HTTP POST URL.
Note that the top of the dynamic content window didn't fit on the screen; collapsing the HTTP trigger above did the trick.
Adding SendGrid - Send email action

Tip: keep the message setup dynamic by using variables, so you can reuse this Logic App for all your ADF pipelines.

4) Copy URL from HTTP trigger
The Logic App is ready. Click on the HTTP trigger and copy the URL. We need this in ADF to trigger the Logic App.
Copy the URL for ADF

Part 3: Setup Data Factory
The next step is adding a Web activity to the ADF pipeline to call the Logic App from above that sends the email. Since nothing changed on the ADF side of the solution, you can follow the steps in our previous post for this part.

Summary
In this post we showed you how to use SendGrid in Logic Apps to send you an email notification in case of a failing pipeline in Azure Data Factory. This saves you a daily login to the Azure portal to check the pipelines monitor. An additional advantage of SendGrid over an Office 365 account, like in the previous post, is that it is free as long as you don't send more than 25,000 emails a month, and you don't have to ask the administrator for an Office 365 account.

In a second blog post about SendGrid we will show you how to use it in an Azure Automation Runbook. This allows you to send notifications with PowerShell when for example your runbook fails.

Monday 4 June 2018

Execute Logic Apps in Azure Data Factory (V2)

Case
In an earlier post, we showed you how to use Azure Logic Apps for extracting email attachments without programming skills. The attachments contain the source files. Because this step is part of a Data Warehouse solution, it would be nice to run it together with the ETL process that needs these source files. How can we achieve this?

Azure Data Factory V2 - Execute Azure Logic App

Solution
In the first few years of Azure, it was not possible to run your Data Warehouse process entirely in the cloud. Of course, you could store the data in Azure SQL Database or Azure SQL Data Warehouse (see here for the differences between these two), but when you used SQL Server Integration Services (SSIS) you still had to run it on-premises or create a custom Virtual Machine. Until recently: nowadays you can execute SSIS packages in Azure, using Azure Data Factory (ADF) V2.

Besides running SSIS packages in ADF V2, you can also execute other Azure services from here, for example Azure Databricks, Azure Data Lake Analytics (U-SQL scripts) and HDInsight (services like Hadoop, Spark and Hive).

This post shows you how to execute an Azure Logic App inside ADF V2.

1) Add and configure activity
Create a new pipeline or edit an existing one. Select "General" and choose the Web activity. Give it a suitable name and go to Settings. Fill in the URL, which corresponds to the one inside the HTTP trigger of the Azure Logic App you created earlier:

Azure Logic App - URL in HTTP Trigger

Select the "POST" API Method. Now add a Header and enter the following:
  • KEY: Content-Type
  • VALUE: application/json
When you are finished, click Publish All.

Azure Data Factory V2 - Configure Web Activity

NOTE:
Nowadays the Body is also mandatory. Enter the following: @concat('{','}')
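
To see what this Web activity effectively does, or to test the Logic App without ADF, you can reproduce the call yourself. A minimal sketch in Python; the URL is a placeholder for the one from your own HTTP trigger:

import urllib.request

# Placeholder: paste the URL from the Logic App HTTP trigger here
LOGIC_APP_URL = "https://prod-00.westeurope.logic.azure.com/workflows/..."

# The same minimal call the Web activity makes: a POST with the
# Content-Type header and an empty JSON body, just like @concat('{','}')
request = urllib.request.Request(
    LOGIC_APP_URL,
    data=b"{}",
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)  # 202 (Accepted) means the Logic App was triggered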

2) Run pipeline
After you have published your pipeline, go to Trigger and select Trigger (Now). You can also run the pipeline without publishing it, using Debug. In this mode you will see the result of the pipeline run at the bottom, under Output.

NOTE:
If you do not publish your pipeline, you will get the following error when you try to use Trigger (Now):

Pipeline Error - Use Trigger (now) without publishing

NOTE 2:
If you do not publish your pipeline, you will get the following warning when you want to access the monitor screen:

Pipeline Warning - Go to Monitor without publishing

3) Result
Once you have triggered the pipeline, go to Monitor in the menu on the left. By default it will open the Pipeline Runs overview, but you can also select the Integration Runtimes or Trigger Runs overview at the top.

You can also watch the Runs history of the Logic App:


View Result - Logic App run history

Summary
This post explains how you can manage other ETL services, next to SSIS, in your Data Warehouse using one orchestrator. In this case we execute an Azure Logic App using Azure Data Factory (V2).

Click here to see how you can also execute an SSIS package using Azure Logic Apps.

Thursday 31 May 2018

Snack: Executing SSIS packages in Azure Logic App

Case
Can I use a Logic App to execute an SSIS package located in the Azure Integration Runtime environment (ADF V2)? Logic Apps has several triggers that could be interesting as a starting point for executing an SSIS package, for example when a new file is added to a Blob Storage container, a DropBox, a OneDrive or an (S)FTP folder. How do I do that?
Logic App

Solution
If you want to trigger something to start an SSIS package instead of scheduling it in ADF, then a Logic App can be very handy. Of course you can accomplish the same in SSIS with some custom code or perhaps a third-party component, but a Logic App is much easier and probably cheaper as well.

For this example we will start an existing package when a new file is added to a certain Azure Blob Storage container. The Blob Storage Container and the SSIS package in the Integration Runtime environment already exist.

1) Logic App
Create a new Logic App by clicking on the + sign (Create new resource) on the Azure portal. It is located under Integration. Give it a descriptive name like "ExecuteSsisWhenBlobFileIsCreated". After the creation of the Logic App, choose the Blank Logic App template to start with.
Create new Logic App

2) Azure Blob Storage trigger
For this example we will be using a trigger on an Azure Blob Storage container: "When a blob is added or modified (properties only) (Preview)". Search for Blob and you will find the right trigger. Create a connection to the right Azure Blob Storage account (if you already created one within the same resource group, it will be reused). Select the correct container and set the interval to your own needs.
Logic App Blob Storage Trigger

3) SQL Server Execute Query
The next step is to create the SQL code that executes our package(s). You can easily generate this SQL in SSMS: go to your package in the catalog, right click it and choose Execute... Now set all options like Logging Level, Environment and 32/64bit. After setting all options, hit the Script button instead of the OK button. This gives you the code you want to use. You can fine-tune it with some code to check whether the package finished successfully.
Generating code in SSMS

The code below was generated and fine-tuned. Copy it (or use your own code) to use in the next step.
-- Variables for execution and error message
DECLARE @execution_id BIGINT, @err_msg NVARCHAR(150)

-- Create execution and fill @execution_id variable
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'Package.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'SSISJoost', @project_name=N'MyAzureProject', @use32bitruntime=False, @reference_id=Null, @useanyworker=True, @runinscaleout=True

-- Set logging level: 0=None, 1=Basic, 2=Performance, 3=Verbose
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=1

-- Set synchronized option: 0=ASYNCHRONOUS, 1=SYNCHRONIZED
-- ASYNCHRONOUS: don't wait for the result
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'SYNCHRONIZED', @parameter_value=1

-- Execute the package with the parameters from above
EXEC [SSISDB].[catalog].[start_execution] @execution_id, @retry_count=0

-- Check if the package executed successfully (only for SYNCHRONIZED execution)
IF (SELECT [status] FROM [SSISDB].[catalog].[executions] WHERE execution_id=@execution_id) <> 7
BEGIN
    SET @err_msg = N'Your package execution did not succeed for execution ID: ' + CAST(@execution_id AS NVARCHAR(20))
    RAISERROR(@err_msg, 15, 1)
END


Now back to the Logic App. Add a new action called SQL Server - Execute query (not the stored procedure action) and create a connection to the SSISDB where your packages are located. Paste the code that executes your package in the query field of this new action.
Action SQL Server Execute Query

Note: When creating your SSISDB in ADF, make sure the option 'Allow Azure services to access' is turned on.

4) Testing
Save your Logic App and add a new file to the selected Blob Storage Container. Then watch the Runs history of the Logic App and the Execution Report in the Integration Services Catalog to view the result.

Summary
This post explains how to execute an SSIS package with a trigger in a Logic App instead of scheduling it in Azure Data Factory. If you are using the Stored Procedure Activity in ADF to execute SSIS packages, you can reuse this code. In a next post we will show an alternative to the Logic App.

Note: steps to turn your Integration Runtime off or on can be added with an Azure Automation action.

Saturday 31 March 2018

Use Logic Apps for extracting Email Attachments

Case
Sometimes it happens that source files are delivered via email. In that case you could extract these files with, for example, a Script Task in SQL Server Integration Services (SSIS) or a PowerShell script, but this requires some serious programming in C#, VB.NET or PowerShell. Are there other and easier ways to get email attachments without programming skills?

Azure Logic Apps - Codeless and serverless

Solution
As we slowly move from on-premises Data Warehouses (DWH) to Azure, we can use other Azure services to solve this: Azure Logic Apps. With Logic Apps you can build automatic workflows without writing code in C# or other programming languages, for example to extract social media data like Twitter tweets or to send Azure Data Factory V2 notifications. For a lot of BI specialists, writing code can be a hurdle, so this service offers a way out. We will store the email attachments in Blob Storage. After that, you can load these files into your DWH using SSIS, for example.

Starting point of this post is an existing storage account and blob container.

1) Create new Logic App
Go to your Azure portal and type "Logic Apps" in the search bar Search resources, services and docs. Locate Logic App under Services and click on Add. Give it a suitable name like "ExtractEmailAttachments" and then choose the Subscription, Resource Group and Location. For the Resource Group and this Logic App we use West Europe, since we are from the Netherlands.

Azure Portal - Create Logic App

2) Add trigger
When editing the Logic App, we first need to pick a trigger: the event that starts this Logic App. Because we will call it from Azure Data Factory, we pick the HTTP trigger When a HTTP request is received.


Logic Apps Designer - Add HTTP trigger

NOTE:
When you open the Logic App for the first time, you can choose from several (common) triggers. You can also choose existing Logic Apps templates for known applications or purposes, to use as an example or starting point.

3) Get emails
The next step is to retrieve the emails. Click on New step and Add an action. Choose the connector "Office 365 Outlook", search for "Get emails" and select Office 365 Outlook - Get emails. The first time you use this action, you need to log in with your Office 365 account. Now set up this action by choosing the Outlook folder, selecting only unread messages including the attachments, and specifying what the email subject should contain (like in the Outlook client). Click on Save in the upper left corner when you are finished.

Logic Apps Designer - Add action Get Emails

4) Filter emails
In this scenario we receive various source files per email each day, but we first want to retrieve attachments from emails with a specific subject and store those in a separate blob storage container. For other email subjects we can add more conditions and save those files in their own blob storage containers. That's why we loop through the inbox (or another Outlook folder) and filter per subject. Add a new step and choose Add a condition to filter on a specific source file. A For each loop will automatically appear, because we receive multiple emails in the previous step. Give the condition a suitable name, because you will have to add more conditions to separate the different source files (we only show one condition in this post).

Logic Apps Designer - Add condition

5) Store data in blobs
Now that we have filtered the emails per subject, we must store the attachment (which contains the data) itself. Add an action on the 'if true' side and choose "Azure Blob Storage". Inside this category, choose Azure Blob Storage - Create blob. The first time you use this action, you need to create a connection: choose the storage account and give the connection a suitable name. Now you can set up this action by defining the Folder path, Blob name and Blob content. As said earlier, we retrieve new source files every day; that's why the blob name contains the load date. This is the expression:
concat('DWH01_Sales_',
formatDatetime(utcNow(),'yyyy'), '-', formatDatetime(utcNow(),'MM'), '-',
formatDatetime(utcNow(),'dd'),'.csv')
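
If you are unsure what this expression produces, the same logic in Python looks like this (a sketch; utcNow() corresponds to the current UTC time):

from datetime import datetime, timezone

# A rough Python equivalent of the Logic Apps expression above
now = datetime.now(timezone.utc)
blob_name = "DWH01_Sales_" + now.strftime("%Y-%m-%d") + ".csv"
print(blob_name)  # for example: DWH01_Sales_2018-03-31.csv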

Logic Apps Designer - Create blob

NOTE:
There is no separate action to create a new folder in the Blob Storage container, but it will be created automatically when you save the first file in the container.

6) Send email when succeeded
Every time the data is stored in a new blob, we will confirm this by sending an email. In this case, we will send the email to the same account as in step 3 (Get emails). You can of course also send it to developers or administrators in your organization. In that case you must create a new connection by clicking Change connection, because otherwise this action will automatically use the same Office 365 connection as before. Insert a new step, select "Office 365 Outlook" and choose the action Office 365 Outlook - Send an email. We create the following Body:

Dear User,

The run of DWH01_Sales has completed successfully for March 31, 2018.

Kind regards,
Azure Logic Apps

Therefore we need to use the following expression:
concat('The run of DWH01_Sales has completed successfully for '
, formatDatetime(utcNow(),'MMMM'), ' ', formatDatetime(utcNow(),'dd'), ', '
, formatDatetime(utcNow(),'yyyy'), '.')

Fill in the Subject and select "To" to send the email to the account used in the Office 365 connection.
Logic Apps Designer - Send email

7) Mark email as read
After we have sent the confirmation email, we want to mark the processed emails (Daily Schedule) as read. Insert a new step, select "Office 365 Outlook" and choose the action Office 365 Outlook - Mark as read. Click on Message Id and choose "Message Id", based on step 3 (Get emails). You have to click on "See more" to make it appear in the list.

Logic Apps Designer - Mark as read

8) Move email to archive
Finally, we will move the emails to an archive folder. We created separate mail folders in Outlook for each subject/source file. Insert a new step, select "Office 365 Outlook" and choose the action Office 365 Outlook - Move email. Choose the Message Id and select the specific sub folder in the archive.


Logic Apps Designer - Move email

Result
Now let's see if it all works. Make sure you have sent the email "DWH01_Sales DAILY SCHEDULE" and that it is unread. Click on Run in the Logic Apps Designer. Now wait for the confirmation email... it works!


Result - Received succeeded email

The source file is also stored in a new blob. See below the result in Azure Storage Explorer.

Result - File stored as new blob

Summary
In this post we showed you how to build a Logic App that ingests email attachments as source files. All without programming skills, and easy to maintain as part of an ETL process.

Click here to see how to execute this Logic App with Azure Data Factory V2 as part of a DWH solution in Azure.

This is why we started the Logic App with an HTTP trigger that can be called from other applications. If you do not want to integrate the Logic App and use it as a separate solution, you should start with a different trigger, for example a Recurrence trigger or When a new email arrives.

Thursday 1 March 2018

Add email notification in Azure Data Factory V2

Case
I am running SSIS packages in Azure Data Factory (ADF V2), but I want to get an email notification when my package execution fails. It seems that ADF V2 doesn't have a built-in email notification option. How can I be notified without checking the built-in pipeline monitor in ADF?
Email notification on failure

Solution
For this solution we will be using a Logic App to send an email and trigger it if an error occurs in ADF. The starting point of this blog post is a working pipeline that executes an SSIS package using a stored procedure.
Data Factory loves Logic App

So, the solution consists of two parts: Logic App (email) and ADF (error handling). The communication between these two Azure parts is done with a JSON message via an HTTP POST request. The JSON message contains the name of the Data Factory and of the pipeline that failed, an error message and an email address. You could of course hardcode the email address in the Logic App, but this way you can reuse the Logic App for various pipelines or data factories and notify different people.
{
    "properties": {
        "DataFactoryName": {
            "type": "string"
        },
        "PipelineName": {
            "type": "string"
        },
        "ErrorMessage": {
            "type": "string"
        },
        "EmailTo": {
            "type": "string"
        }
    },
    "type": "object"
}


a) Logic App
We first start with creating the Logic App. We need it before creating the error handler in ADF.

a1) Create new Logic App
Click on Create a resource and locate Logic App under Enterprise Integration. Pick a descriptive name like "ADF-Notifications" and then choose the Subscription, Resource Group and Location. For the Resource Group and this Logic App we use West Europe, since we are from the Netherlands.
Create new Logic App

a2) HTTP Trigger
When editing the Logic App we first need to pick a trigger: the event that starts this Logic App. Pick the HTTP trigger When a HTTP request is received and then click on edit to specify the parameters. Paste the JSON message from above in the textbox. In the next step we can use these parameters to set up the email.
Add HTTP trigger

a3) Send an email
Add a new step and choose Add an action. Search for "Send an email" and scroll down to Office 365 Outlook - Send an email. The first time you use this action you need to log in with your Office 365 account. Now you can set up the email with fixed texts mixed with parameters from the JSON message of the previous step. When you are satisfied with the email setup, click on Save in the upper left corner.
Add action to Send an email

a4) Copy URL from HTTP trigger
The Logic App is ready. Click on the HTTP trigger and copy the URL. We need this in ADF to trigger the Logic App.
Copy the URL for ADF

b) Data Factory
Next we will add a new activity in the existing ADF pipeline to trigger the new Logic App.

b1) Add Parameter
To specify the email address of the recipient we will use a pipeline parameter. Go to your existing pipeline (do not select any of the activities in it) and go to the Parameters page. Click on New and add a new String parameter called EmailTo. Add the email address of the recipient in the default value field. You can override the default value when editing the trigger (after save).
Add pipeline parameter

b2) Add Web activity
Next expand the General activities and drag a Web activity onto the canvas. Make sure to give it a suitable name like Error Notification. Add an Activity Dependency (similar to the Precedence Constraints in SSIS) between the Stored Procedure activity and the Web activity. By right clicking it you can change it to Failure.
Add Web activity

b3) Web activity settings
Select the newly added Web activity and go to the Settings page. In the URL field you must paste the URL from step a4, and as method you need to select Post.
The next step is to add a new header: it is called Content-Type and its value is application/json. As body you need to add the following JSON message, but make sure to change the name of the Stored Procedure activity; ours is called Execute Package. The first two items retrieve the Data Factory name and the pipeline name. The last one gets the value of the parameter created in step b1.
{
    "DataFactoryName":
        "@{pipeline().DataFactory}",
    "PipelineName":
        "@{pipeline().Pipeline}",
    "ErrorMessage":
        "@{activity('Execute Package').error.message}",
    "EmailTo":
        "@{pipeline().parameters.EmailTo}"
}
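
Before testing the whole pipeline you can also test the Logic App on its own by posting the message by hand, with literal test values instead of the @{...} expressions that ADF resolves at runtime. A minimal sketch in Python; the URL and all values are placeholders:

import json
import urllib.request

# Placeholder: the URL copied from the HTTP trigger in step a4
LOGIC_APP_URL = "https://prod-00.westeurope.logic.azure.com/workflows/..."

# Same shape as the Web activity body above, but with test values
payload = {
    "DataFactoryName": "MyDataFactory",
    "PipelineName": "MyPipeline",
    "ErrorMessage": "Test error message",
    "EmailTo": "recipient@example.com",
}

request = urllib.request.Request(
    LOGIC_APP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)  # 202 (Accepted) means an email is on its way
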
Add URL and json to Web activity

b4) Testing
Now it is time to test the pipeline. Make sure something in the package fails, for example by changing a server name or password in the SSIS environment. Or you could just pause your Integration Runtime and run the trigger. Now wait for the email to arrive.
Email notification received

The solution has one downside! Because you are handling the error with an Activity Dependency, the entire pipeline succeeds despite the failing SSIS stored procedure. Check the image below: the last 4 jobs did fail, but show the status 'Succeeded', even though there is an error message.
Failed or Succeeded?

b5) Add fail
If you want the correct status when the SSIS stored procedure fails, then copy and paste the existing Stored Procedure activity, rename it to, for example, 'Fail' and replace the SQL code with the code below. Then connect the Web activity to this new activity.
--T-SQL Code
Declare @err_msg NVARCHAR(150)
SET @err_msg=N'Error occurred, email was sent'
RAISERROR(@err_msg,15,1)
Add fail

Now we have a Failed status for a failing pipeline. Please leave a comment if you have a better or easier solution for this.
Status Failed

Summary
In this post we showed you how to use a Logic App to send you an email notification in case of a failing pipeline in Azure Data Factory. This saves you a daily login to the Azure portal to check the pipelines monitor. Feel free to adjust the JSON message to your own needs. You could also add an additional notification for successful jobs.

Update Dec 1: instead of Office 365 you can now also use SendGrid.

Monday 15 May 2017

Azure - On-premises data gateways

Case
If you want to use on-premises data in Azure or Power BI you need to install a so-called gateway. A quick search resulted in four possible gateway downloads! Which one do you need? It's so confusing!

Microsoft Gateways for Azure and Power BI

Solution
There are two different gateways, but with several names, versions and purposes:
  • Data Management Gateway
    This gateway is used for Azure Data Factory and Azure Machine Learning, but it was also used for Power BI. However, Power BI now uses the On-premises data gateway! I think / hope that this gateway will be deprecated and replaced by the On-premises data gateway in the near future.
    Download (32 and 64bit): https://www.microsoft.com/en-us/download/details.aspx?id=39717
  • On-premises data gateway
    This gateway comes in three flavors, but you can only install one per server.

Use one source for multiple online services
Since you can only install one gateway per server, the On-premises data gateway has some challenges if you want to use your on-premises source for multiple online services like Power BI and AAS, or for two AAS servers. In that case you install the first gateway on the source server itself and a second gateway on a separate on-premises server that can reach the first source server. A bit expensive to have a server running for a data gateway only, but for now this is the only solution.
One source and two online services: two servers with their own gateway

Note: details per gateway will be posted in separate posts (AAS)

Thursday 17 November 2016

Extracting tweets with Azure App Service - Logic App

Case
I want to extract tweets from Twitter to my data warehouse without writing code. Is that possible?
Extract #SSIS Tweets

Solution
There are of course several options with traditional ETL tools like SSIS, using custom .NET code, a custom Twitter task or even the Twitter API called from R. But Microsoft also recently introduced Microsoft Flow and Microsoft Azure Logic Apps, which can connect to various apps and services (like Facebook, Google Drive, GitHub, Twitter, SharePoint, etc.) to collect data, synchronize files or get notifications of certain events, without writing any code! The basics of Flow are free and Logic Apps is the professional version in Azure with, of course, some more options.

Solutions:
A) Microsoft Flow
B) Microsoft Azure Logic Apps

1) Create a new Logic App

First go to portal.azure.com, log in and create a new Logic App (located under Enterprise Integration). Our Logic App is called SSISTweets (you have to choose a different name) and we use West Europe as the location of the Resource Group, since we live in the Netherlands. When you click on the Create button the Logic App will be created and you can start editing. You can choose between various default templates or you can create a Blank LogicApp.
Creating a new Logic App

If you have also tried Microsoft Flow, you will probably notice that there are fewer templates to choose from. We will choose the Blank LogicApp for this example.

2) Twitter source
In the blank LogicApp you first need to search for a trigger. In this case we want to search for tweets on Twitter, so the trigger is 'When a new tweet is posted'.
Blank LogicApp searching for Twitter trigger

When you select the Twitter trigger you need to set up the Twitter connection and the search text. For this example I will log in with my Twitter account and search for tweets with #SSIS. You can also change the frequency and interval.
Connecting to Twitter and setting up search

3) Filtering annoying job tweets
Of course we want to filter out those annoying job tweets. Therefore we need to add a condition. In the condition pane you can select fields from the Twitter 'source' to filter on. In this example we filter out tweets that contain the word 'job' (probably not the perfect filter).
Add 'job' filter

4) Add SQL Destination
Now we need to store the tweets in an (Azure) SQL Server database. For this example we created a simple table with only nvarchar(255) fields:
CREATE TABLE [dbo].[Tweets](
    [Name] [nvarchar](255) NULL,
    [TweetText] [nvarchar](255) NULL,
    [TweetTime] [nvarchar](255) NULL
)

Add a new step and search for SQL Server - Insert row. Then enter the credentials of your database, select the table (it must be an existing table) and map the Twitter fields to the appropriate table columns.
SQL Destination - Insert row

5) Save LogicApp
After the last step has been added click on Save to save your work and then on Done. Now your newly created flow is ready to run.
Saving the LogicApp

Now click on the run button and watch the table!
Run LogicApp

The result

Conclusion
Azure Logic Apps is very easy to use and you don't need programming skills. There are dozens of apps and services to connect to. If you only need one Logic App and a 15-minute refresh is enough, then you could also consider using Microsoft Flow, which is free of charge.