Saturday, 13 June 2020

Process Analysis Service with Data Factory only

There are several methods to process Azure Analysis Services models, such as Logic Apps, Azure Automation Runbooks and even SSIS, but is there an Azure Data Factory-only solution where we use nothing but the pipeline activities of ADF?
Process Azure Analysis Services

Yes, you can use the Web Activity to call the REST API of Azure Analysis Services (AAS), but that requires you to give ADF permissions in AAS via its Managed Service Identity (MSI).

1) Create ADF service principal
In the next step we need a user that we can add as a Server Administrator of AAS. Since the managed identity of ADF will not show up when we search for a user account, we have to construct one ourselves. This 'user' is called a service principal.
  • Go to ADF in the Azure portal (not the Author & Monitor environment)
  • In the left menu click on Properties which you can find under General
  • Copy the 'Managed Identity Application ID' and the 'Managed Identity Tenant' properties to a notepad and construct the following string for the next step:
    app:<Application ID>@<Tenant> (replace the <xxx> values with the copied properties)
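The string construction itself is trivial, but easy to get wrong. A quick sketch with placeholder GUIDs (use the two values you copied from the ADF Properties page):

```python
# Placeholder GUIDs; substitute the values copied from the ADF Properties page.
app_id = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"   # Managed Identity Application ID
tenant = "11111111-2222-3333-4444-555555555555"   # Managed Identity Tenant
principal = f"app:{app_id}@{tenant}"
print(principal)
```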

Create app user from ADF for AAS

2) Add user as Server Administrator
Now we need to connect to your Azure Analysis Services via SQL Server Management Studio (SSMS) to add the user from the previous step as a Server Administrator. This cannot be done via the Azure portal.
  • Login to your AAS with SSMS
  • Right click your server and choose Properties
  • Go to the Security pane
  • Click on the Add... button
  • Add the service principal from the previous step via the Manual Entry textbox and click on the Add button
  • Click on OK to close the property window

Add Server Administrator via Manual Entry

After this step the 'user' will also appear in the Azure portal, but you cannot add it via the portal.
Analysis Services Admins

3) Add Web Activity
In your ADF pipeline you need to add a Web Activity to call the REST API of Analysis Services. The first step is to determine the REST API URL. Replace the <xxx> values in the string below with the region, server name and model name of your Analysis Services. The REST API method we will be using is 'refreshes':

    https://<region>.asazure.windows.net/servers/<servername>/models/<modelname>/refreshes

The second step is to create a JSON message for the REST API that gives the process order to AAS. To fully process the entire model you can use this message:
    {
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": 2,
        "RetryCount": 2,
        "Objects": []
    }
Or you can process particular tables or partitions within the model with a message like this:
    {
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": 2,
        "RetryCount": 2,
        "Objects": [
            {
                "table": "DimProduct",
                "partition": "CurrentYear"
            },
            {
                "table": "DimDepartment"
            }
        ]
    }
See the documentation for all the parameters that you can use.
  • Add the Web activity to your pipeline
  • Give it a descriptive name like Process Model
  • Go to the Settings tab
  • Use the REST API URL from above in the URL property
  • Choose POST as the Method
  • Add the JSON message from above in the Body property
  • Under Advanced choose MSI as the Authentication method
  • Add 'https://*.asazure.windows.net' in the Resource property (note this URL is different for suspending and resuming AAS)
Web Activity calling the AAS Rest API

Then Debug the pipeline to check the processing result.
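As a sanity check outside ADF, the URL and body from the steps above can be sketched in Python. The region, server and model names below are placeholders:

```python
import json

# Placeholder values; substitute your own AAS region, server and model names.
REGION = "westeurope"
SERVER = "myaasserver"
MODEL = "AdventureWorks"

def refresh_url(region: str, server: str, model: str) -> str:
    # Note the 'servers' segment before the server name; omitting it
    # yields the 'NotFound' error mentioned in the comments below.
    return (f"https://{region}.asazure.windows.net"
            f"/servers/{server}/models/{model}/refreshes")

def full_refresh_body(max_parallelism: int = 2, retry_count: int = 2) -> str:
    # Same JSON message as in step 3: a full process of the whole model.
    return json.dumps({
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": max_parallelism,
        "RetryCount": retry_count,
        "Objects": [],
    })

url = refresh_url(REGION, SERVER, MODEL)
body = full_refresh_body()
print(url)
```

In the Web Activity you paste the equivalent URL and body directly; this sketch only helps verify the string construction before debugging the pipeline.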

4) Retrieve refreshes
By only changing the Method from POST to GET (the Body property will disappear) you can retrieve information about the processing status and use that information in subsequent pipeline activities.
Retrieve process status via GET
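A minimal sketch of reading that GET response. The response shape assumed here (a JSON array of refresh entries, each with a status field) is based on the refreshes API; treat the field names as assumptions to verify against your own output:

```python
import json

def latest_refresh_status(response_text: str) -> str:
    # The GET .../refreshes response is assumed to be a JSON array of
    # refresh entries; we take the first (most recent) one.
    refreshes = json.loads(response_text)
    return refreshes[0]["status"] if refreshes else "unknown"

# Made-up example payload with a hypothetical refreshId.
sample = '[{"refreshId": "8b6f2c0e", "status": "inProgress"}]'
print(latest_refresh_status(sample))
```

In ADF itself you would read the same field from the Web Activity output with a dynamic expression instead of Python.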

In this post you learned how to process your Analysis Services models with only Azure Data Factory. No other services are needed, which makes maintenance a little easier. In a next post we will also show you how to pause or resume your Analysis Services with the REST API. With a few extra steps you can also use this method to refresh a Power BI dataset, but we will show that in a future post.

Update: firewall turned on?


  1. Thanks for this post. I tried it but am getting error 2108: "message": "Invoking Web Activity failed with HttpStatusCode - 'NotFound'.",
    "failureType": "UserError".

    1. I got that error when the URL was incorrect (a spelling error in the model or server name)

    2. Do you have a space in the name of your model? We experienced the same issue but just replaced the space with %20 in the URL and it was solved.

  2. Thanks all, it worked! I had missed "servers" before the server name.

  3. Hi Joost, thanks for this post, really good !!

    I have a question: is it possible to create dynamic partition name parameters? For example, I have partitions with names DK_202001 and FR_201901, but I want to run all the partitions for DK and FR. And I want to process the tabular model dynamically with parameters: for example, today just for DK, tomorrow for FR.

    Once again thanks and kind regards!

    1. I'm not seeing a list of available partitions in the REST API. But if you can create a collection of the partitions that you want to process, then you could use some dynamic expressions to generate the body message.
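One way to sketch that idea: keep your own list of partitions (the REST API does not list them) and generate the "Objects" part of the body from a country prefix. The partition names come from the question; the table name "Sales" is a made-up placeholder:

```python
import json

# Hypothetical inventory of partitions, maintained by you, since the
# REST API does not expose a list of available partitions.
PARTITIONS = ["DK_202001", "DK_202002", "FR_201901"]

def objects_for(prefix: str, table: str = "Sales") -> list:
    # Select only the partitions whose name starts with the country prefix.
    return [{"table": table, "partition": p}
            for p in PARTITIONS if p.startswith(prefix)]

def process_body(prefix: str) -> str:
    # Same message shape as in step 3, with a dynamic Objects array.
    return json.dumps({
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": 2,
        "RetryCount": 2,
        "Objects": objects_for(prefix),
    })

print(process_body("DK"))
```

In ADF you would build the same array with pipeline parameters and dynamic expressions instead of Python.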

  4. I got an error like this:

    Response status code: ''. More details: Exception message: 'An error occurred while sending the request.'.
    No response from the endpoint. Possible causes: network connectivity, DNS failure, server certificate validation or timeout.",
    "failureType": "UserError",
    "target": "Process Model",

    I followed the same steps. Thank you

    1. Hi, I resolved this by manually going to SSMS and refreshing the credentials, and the problem was solved. But when I try to deploy with existing credentials I still get this error. Is there any way we can automatically/permanently store the credentials in the AAS model?

  5. I am getting an error while executing the Web Activity: "Client with IP Address '*.*.*.*' is not allowed to access the server. To enable access, use the Firewall settings in Azure Management Portal." This IP address changes and is not fixed. Also I cannot see any ADF IP address to add in the AAS firewall settings.

    1. You could whitelist ALL Azure Data Factories in your region. Not 100% safe, but better than turning the firewall off.

      Another option could be to use an Azure Automation Runbook (PowerShell or Python code) that first retrieves its own IP address to add it to the firewall and then removes it after processing.


    3. See latest blogpost

  6. Hi, is it possible to do incremental processing using this approach?

  7. Joost, thanks so much, this post was really clear and useful! (Unlike some Microsoft documentation!) Within 5 minutes I had a fully working refresh without any problems!

