Showing posts with label BUG. Show all posts

Wednesday 22 March 2023

DevOps - Get-SpnAccessToken is obsolete

Case
I'm deploying my SQL Server database via DevOps with the SqlAzureDacpacDeployment@1 task in YAML, but it is giving me a warning: ##[warning]The command 'Get-SpnAccessToken' is obsolete. Use Get-AccessTokenMSAL instead. This will be removed. It is still working, but the warning message is not very reassuring. 
The command 'Get-SpnAccessToken' is obsolete. Use Get-AccessTokenMSAL instead. This will be removed.

Solution
This warning message appeared somewhere in late 2022 and at the moment there is no newer version of the DacPac deployment task available. When searching for this message it appears that other tasks like AzureFileCopy@5 have the same issue. The word MSAL (Microsoft Authentication Library) in the message points to a newer way of acquiring security tokens.

To get more info you could run the pipeline in debug mode by enabling system diagnostics.
Enable system diagnostics

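If you prefer not to tick that checkbox for every manual run, a pipeline variable gives you the same verbose logging straight from the YAML (a minimal sketch):

# YAML equivalent of 'Enable system diagnostics': verbose logging for all jobs in this run
variables:
  system.debug: 'true'
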
Then you will see a lot of extra messages, and right above the warning you will see a message about USE_MSAL (empty) and that its default value is false.
USE_MSAL

It is just a warning and Microsoft will probably solve it some day. If you want to get rid of it you can set an environment variable called USE_MSAL to true within your pipeline. When set to true the task will use MSAL instead of ADAL to obtain the authentication tokens from the Microsoft Identity Platform. The easiest way to do this is by writing one line of PowerShell code in a PowerShell task: ##vso[task.setvariable variable=USE_MSAL]true 

###################################
# USE_MSAL to avoid warning
###################################
- powershell: |
    Write-Host "Setting USE_MSAL to true to force using MSAL instead of ADAL to obtain the authentication tokens."
    Write-Host "##vso[task.setvariable variable=USE_MSAL]true"
  displayName: '3 Set USE_MSAL to true'

###################################
# Deploy DacPac
###################################             
- task: SqlAzureDacpacDeployment@1
  displayName: '4 Deploy DacPac' 
  inputs:
    azureSubscription: '${{ parameters.ServiceConnection }}'
    AuthenticationType: 'servicePrincipal'
    ServerName: '${{ parameters.SqlServerName }}.database.windows.net'
    DatabaseName: '${{ parameters.SqlDatabaseName }}' 
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: '$(Pipeline.Workspace)/SQL_Dacpac/SQL/${{ parameters.SqlProjectName }}/bin/debug/${{ parameters.SqlProjectName }}.dacpac'
    PublishProfile: '$(Pipeline.Workspace)/SQL_Dacpac/SQL/${{ parameters.SqlProjectName }}/${{ parameters.SqlProjectName }}.publish.xml'
    IpDetectionMethod: 'AutoDetect'

After this the warning will not appear anymore and your database will still get deployed. The extra step takes about a second to run.
Extra PowerShell Task


No more obsolete warnings

Conclusion
In this post you learned how to get rid of the annoying "The command 'Get-SpnAccessToken' is obsolete" warning by setting one environment variable to true. You should probably check in a few weeks/months whether this workaround is still necessary or whether a SqlAzureDacpacDeployment@2 version has been released.




Sunday 2 October 2022

Synapse - error: missing required argument 'factoryId'

Case
I want to deploy a Synapse workspace via DevOps and the Synapse workspace deployment add-on, but it is giving me an error: Stderr: error: missing required argument 'factoryId'. How do I solve this error?

error: missing required argument 'factoryId'

2022-10-02T19:05:21.6763177Z ##[section]Starting: Synapseworkspacedeployment
2022-10-02T19:05:21.6900329Z ==============================================================================
2022-10-02T19:05:21.6900630Z Task         : Synapse workspace deployment
2022-10-02T19:05:21.6900882Z Description  : Deployment task for synapse workspace v2
2022-10-02T19:05:21.6901097Z Version      : 2.3.0
2022-10-02T19:05:21.6901303Z Author       : Microsoft Corporation
2022-10-02T19:05:21.6901526Z Help         : Displays the name of your extension v2
2022-10-02T19:05:21.6901791Z ==============================================================================
2022-10-02T19:05:22.5141212Z Bundle source :  https://web.azuresynapse.net/assets/cmd-api/main.js
2022-10-02T19:05:22.5165738Z Downloading asset file
2022-10-02T19:05:23.5975682Z Asset file downloaded at :  D:\a\1\s\downloads\main.js
2022-10-02T19:05:23.5986866Z Starting export operation
2022-10-02T19:05:23.5989932Z Executing shell command
2022-10-02T19:05:23.5991887Z Command :  node D:\a\1\s\downloads\main.js export "D:\a\1\SynapseArtifact\" dwhtst ExportedArtifacts
2022-10-02T19:05:25.3052935Z Stderr:  error: missing required argument 'factoryId'
2022-10-02T19:05:25.3054315Z 
2022-10-02T19:05:25.3225669Z Shell execution failed.
2022-10-02T19:05:25.3227048Z An error occurred during execution: Shell execution failed.
2022-10-02T19:05:25.3262506Z ##[error]Encountered with exception:Shell execution failed.
2022-10-02T19:05:25.3355687Z ##[section]Finishing: Synapseworkspacedeployment
Solution
This error points to a mistake in the ArtifactsFolder property of the Synapse workspace deployment@2 task. If you point it to the wrong folder, or even just end the path with a forward slash (!), you get this not very descriptive error: Stderr: error: missing required argument 'factoryId'. If you get this error then add the treeview step below to your pipeline to double check the folder: it should point to the folder that contains the publish_config.json file, without a trailing forward slash.
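For example, with the artifact folder used in the pipeline below (your artifact name may of course differ), the difference is just that trailing slash:

# Fails with: Stderr: error: missing required argument 'factoryId'
# ArtifactsFolder: '$(Pipeline.Workspace)/SynapseArtifact/'

# Works: same folder, without the trailing forward slash
ArtifactsFolder: '$(Pipeline.Workspace)/SynapseArtifact'
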
###################################
# Show treeview of agent
###################################
- powershell: |
    Write-Output "Folder and file treeview of Pipeline_Workspace folder:"
    tree "$(Pipeline.Workspace)" /F
  displayName: 'Show treeview of Pipeline_Workspace folder'

###################################
# validateDeploy
###################################
- task: Synapse workspace deployment@2
  inputs:
    operation: validateDeploy
    ArtifactsFolder: '$(Pipeline.Workspace)/SynapseArtifact'
    azureSubscription: DevOps
    ResourceGroupName: dhwacc
    TargetWorkspaceName: rg_dwhacc
    DeleteArtifactsNotInTemplate: true

Conclusion
Double check the artifact folder and don't add a forward slash at the end of it. The forward slash bug(?) occurred in version 2.3.0 (9/2/2022).

Friday 30 September 2022

Synapse - 'node' is not recognized as a command

Case
I want to deploy a Synapse workspace via DevOps and the Synapse workspace deployment add-on, but it is giving me an error: Stderr: 'node' is not recognized as an internal or external command, operable program or batch file. How do I solve this error?

Node not known in DevOps

Solution
Just like the deployment of Data Factory, this add-on uses Node.js to do the actual deployment. If you are using a self-hosted agent then you need to install Node.js on your DevOps agent (VM), or use the Node.js Tool Installer task before the Synapse workspace deployment task.

NodeTool@0

###################################
# Install Node.js on agent
###################################
- task: NodeTool@0
  displayName: '3 Install Node.js'
  inputs:
    versionSpec: '16.x'
    checkLatest: true  
Conclusion
In this short post you learned how to overcome the unrecognized node command in your DevOps deployment pipeline. A simple manual installation of Node.js or an automated installation via your CI/CD pipeline will do the trick.

Also see our posts about setting up Synapse and DevOps and creating the YAML pipeline.

Friday 13 May 2022

DevOps: SQL Server - NETFramework v4.5 not found

Case
I have a DevOps pipeline to build and deploy my Azure SQL Server database, but it is giving a .NET Framework error stating that it can't find the framework version. It did work before, so how can I solve it?
error MSB3644: The reference assemblies for .NETFramework,Version=v4.5 were not found.

The entire error message in Azure DevOps:
##[section]Starting: 1. Creating Artifact
==============================================================================
Task         : MSBuild
Description  : Build with MSBuild
Version      : 1.199.0
Author       : Microsoft Corporation
Help         : https://docs.microsoft.com/azure/devops/pipelines/tasks/build/msbuild
==============================================================================
##[command]"D:\a\_tasks\MSBuild_c6c4c611-aa2e-4a33-b606-5eaba2196824\1.199.0\ps_modules\MSBuildHelpers\vswhere.exe" -version [17.0,18.0) -latest -format json
##[command]"C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\msbuild.exe" "D:\a\1\s\SQL\myDWH\myDWH.sqlproj" /nologo /nr:false /dl:CentralLogger,"D:\a\_tasks\MSBuild_c6c4c611-aa2e-4a33-b606-5eaba2196824\1.199.0\ps_modules\MSBuildHelpers\Microsoft.TeamFoundation.DistributedTask.MSBuild.Logger.dll";"RootDetailId=|SolutionDir=D:\a\1\s\SQL\myDWH|enableOrphanedProjectsLogs=true"*ForwardingLogger,"D:\a\_tasks\MSBuild_c6c4c611-aa2e-4a33-b606-5eaba2196824\1.199.0\ps_modules\MSBuildHelpers\Microsoft.TeamFoundation.DistributedTask.MSBuild.Logger.dll"  /p:_MSDeployUserAgent="VSTS_3aa34741-51f2-4a22-9768-a5deca3bfa4e_build_17_0"
Build started 5/13/2022 2:36:44 PM.
##[error]C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\Microsoft.Common.CurrentVersion.targets(1221,5): Error MSB3644: The reference assemblies for .NETFramework,Version=v4.5 were not found. To resolve this, install the Developer Pack (SDK/Targeting Pack) for this framework version or retarget your application. You can download .NET Framework Developer Packs at https://aka.ms/msbuild/developerpacks
Project "D:\a\1\s\SQL\myDWH\myDWH.sqlproj" on node 1 (default targets).
C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\Microsoft.Common.CurrentVersion.targets(1221,5): error MSB3644: The reference assemblies for .NETFramework,Version=v4.5 were not found. To resolve this, install the Developer Pack (SDK/Targeting Pack) for this framework version or retarget your application. You can download .NET Framework Developer Packs at https://aka.ms/msbuild/developerpacks [D:\a\1\s\SQL\myDWH\myDWH.sqlproj]
_CleanRecordFileWrites:
  Creating directory "obj\Debug\".
Done Building Project "D:\a\1\s\SQL\myDWH\myDWH.sqlproj" (default targets) -- FAILED.

Build FAILED.

"D:\a\1\s\SQL\myDWH\myDWH.sqlproj" (default target) (1) ->
(GetReferenceAssemblyPaths target) -> 
  C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\Microsoft.Common.CurrentVersion.targets(1221,5): error MSB3644: The reference assemblies for .NETFramework,Version=v4.5 were not found. To resolve this, install the Developer Pack (SDK/Targeting Pack) for this framework version or retarget your application. You can download .NET Framework Developer Packs at https://aka.ms/msbuild/developerpacks [D:\a\1\s\SQL\myDWH\myDWH.sqlproj]

    0 Warning(s)
    1 Error(s)

Time Elapsed 00:00:01.97
##[error]Process 'msbuild.exe' exited with code '1'.
##[section]Finishing: 1. Creating Artifact
Solution
This error probably occurs because you use a DevOps agent of the type windows-latest, which recently changed to the newer windows-2022 image, and .NET Framework 4.5 has been out of support since April 26, 2022.

The solution is simple:
  • Go to Visual Studio and open your database project
  • Find your project in the Solution Explorer pane
  • Right click your project and go to the properties
  • Go to the tab SQLCLR and find Target framework
  • Change the .NET Framework to a higher version (4.7.2 or 4.8)
  • Now save your project and make sure the change goes to your repository so that your build pipeline can do its work correctly
Change .NET Framework version in SQLCLR pane

Another option is to downgrade your DevOps agent from 'windows-latest' to 'windows-2019'. An even simpler change, but probably only a temporary one that postpones the real fix.
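That downgrade is a one-line change in the YAML, assuming the pool is defined at pipeline level:

# Pin the Microsoft-hosted agent image instead of using windows-latest
pool:
  vmImage: 'windows-2019'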

Conclusion
In this post you learned how to solve the .NET Framework not found error in two ways: a temporary quick win by changing the agent type and a little more 'permanent' change in the Visual Studio project (until that version also goes out of support).

Sunday 19 September 2021

ADF Build - missing publish_config.json

Case
I'm using the new and improved ARM export via Npm to generate an ARM template for my Data Factory so I can deploy it to the next environment, but the Validate step and the Validate and Generate ARM template step both throw an error saying that the publish_config.json file can't be found. This file isn't mentioned in the steps from the documentation. How do I add this file and what content should be in it?
Unable to read file: publish_config.json

ERROR === LocalFileClientService: Unable to read file: D:\a\1\publish_config.json, error: {"stack":"Error: ENOENT: no such file or directory, open 'D:\\a\\1\\publish_config.json'","message":"ENOENT: no such file or directory, open 'D:\\a\\1\\publish_config.json'","errno":-4058,"code":"ENOENT","syscall":"open","path":"D:\\a\\1\\publish_config.json"}
ERROR === PublishConfigService: _getLatestPublishConfig - Unable to process publish config file, error: {"stack":"Error: ENOENT: no such file or directory, open 'D:\\a\\1\\publish_config.json'","message":"ENOENT: no such file or directory, open 'D:\\a\\1\\publish_config.json'","errno":-4058,"code":"ENOENT","syscall":"open","path":"D:\\a\\1\\publish_config.json"}
Solution
While it indeed looks like a real error, it doesn't stop the DevOps pipeline. The old method of publishing ADF changes to another ADF did create this file automatically in the adf_publish branch when you hit the Publish button in the Data Factory GUI. So it probably isn't used any more, but we still want to get rid of annoying errors!

You can solve this by manually adding the missing file:

1) Add new file to repository
To solve it go to the Azure DevOps repository and locate the folder where ADF stores the pipeline, dataset and factory files (in subfolders). Click on the +New button and create a file called publish_config.json.
Add new file to repository (in root of ADF)

2) Add JSON content
The content of the new file should be the name of your publishing branch (as configured when you set up Git for ADF) in the following JSON format:
{"publishBranch": "factory/adf_publish"}

Add the publishing branch in the following format

3) The result
Now the new file is available for the Npm task in the pipeline. Run that DevOps pipeline again and you will notice that the error message won't appear in the logs.
publish_config.json is now available for the pipeline

Conclusion
In this post you learned how to avoid the error message about the missing publish_config.json file. It is not very satisfying that it is still unknown why this file was missing and whether it is still used by the process. Please add a comment below if you find more details.

In a next post we will describe the entire Data Factory ARM deployment where you don't need to hit that annoying Publish button within the Data Factory GUI. Everything (CI and CD) will be a YAML pipeline.

Thanks to colleague Roelof Jonkers for helping.

Saturday 18 September 2021

ADF Build - missing arm-template-parameters-definition.json

Case
I'm using the new and improved ARM export via Npm to generate an ARM template for my Data Factory so I can deploy it to the next environment, but the Validate step and the Validate and Generate ARM template step both throw an error saying that the arm-template-parameters-definition.json file can't be found. This file isn't mentioned in the steps from the documentation. How do I add this file and what content should be in it?
Unable to read file: arm-template-parameters-definition.json

ERROR === LocalFileClientService: Unable to read file: D:\a\1\arm-template-parameters-definition.json, error: {"stack":"Error: ENOENT: no such file or directory, open 'D:\\a\\1\\arm-template-parameters-definition.json'","message":"ENOENT: no such file or directory, open 'D:\\a\\1\\arm-template-parameters-definition.json'","errno":-4058,"code":"ENOENT","syscall":"open","path":"D:\\a\\1\\arm-template-parameters-definition.json"}
  
WARNING === ArmTemplateUtils: _getUserParameterDefinitionJson - Unable to load custom param file from repo, will use default file. Error: {"stack":"Error: ENOENT: no such file or directory, open 'D:\\a\\1\\arm-template-parameters-definition.json'","message":"ENOENT: no such file or directory, open 'D:\\a\\1\\arm-template-parameters-definition.json'","errno":-4058,"code":"ENOENT","syscall":"open","path":"D:\\a\\1\\arm-template-parameters-definition.json"}

Solution
This step is indeed not mentioned in that new documentation, but it can be found if you know where to look. The messages state that it is an error, but also that it will continue using a default file. Very annoying, but not blocking for your pipeline. To solve it we need to follow these steps:

1) Edit Parameter configuration
Go to your development ADF and open Azure Data Factory Studio. In the left menu click on the Manage icon (a toolbox) and then click on ARM template under Source Control. Now you will see the option 'Edit parameter configuration'. Click on it.
Edit parameter configuration

2) Save Parameter configuration
Now a new JSON file will be opened (that you can adjust to your needs, but more on that in a later post) and in the Name box above you will see 'arm-template-parameters-definition.json'. Click on the OK button and go to the Azure DevOps repository.
arm-template-parameters-definition.json

3) The result
In the Azure DevOps Repository you will now find a new file in the root of the ADF folder where the subfolders like pipeline and dataset are also located. Run the DevOps pipeline again and you will notice that the error and warning are gone.
The new file has been added to the repository by ADF

Note that you only have to do this for the development Data Factory (not for test, acceptance or production) and that the ARM template parameter configuration is only available for Git-enabled data factories.

Conclusion
In this post you learned how to solve the arm-template-parameters-definition.json not found error/warning. The next step is to learn more about this option and its use cases. Most often it will be used to add extra parameters for options that aren't parameterized. This will be explained in a next post.

In another following post we will describe the entire Data Factory ARM deployment where you don't need to hit that annoying Publish button within the Data Factory GUI. Everything (CI and CD) will be a YAML pipeline.

Thanks to colleague Roelof Jonkers for helping.

Friday 17 September 2021

ADF Release - ResourceGroupNotFound

Case
I'm using the Pre and Post deployment PowerShell script from Microsoft within my ADF DevOps deployment, but it gives an error that it cannot find my resource group, although I'm sure I gave the Service Principal enough access to this resource group (and I checked for typos). What is wrong and how can I solve this?
ResourceGroupNotFound

##[error]HTTP Status Code: NotFound
Error Code: ResourceGroupNotFound
Error Message: Resource group 'RG_ADF_PRD' could not be found.

Solution
If the Service Principal indeed has enough permissions on the Azure resource group, then your Service Principal probably has access to more than one Azure subscription. The PowerShell function Get-AzDataFactoryV2Pipeline within this script can only get Data Factories from the active subscription, and only one subscription can be active. If your resource group is not in that active subscription then it will not be found.
Get-AzDataFactoryV2Pipeline -ResourceGroupName "RG_ADF_PRD" -DataFactoryName "DataFactory2-PRD"
WARNING: TenantId '4e8e12ea-d5b6-40f1-9988-4b09705c2595' contains more than one active subscription. 
First one will be selected for further use. To select another subscription, use Set-AzContext.

The best solution is to create a Service Principal for each subscription where you want to deploy ADF and then give each Service Principal only access to one Azure subscription.

The alternative is to add one extra parameter to the PowerShell script for the subscription ID (or name) and then use the PowerShell function Set-AzContext to activate the correct Azure subscription.
Set-AzContext -Subscription $Subscription

In your YAML script you need to add the extra parameter and then pass the subscription ID (or name) either hardcoded or, much better, as a variable from the DevOps Variable Group (Library). If you are using the old Release pipelines then hit the three dots behind the Script Arguments textbox to add the extra parameter.
Add parameter to YAML

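Below is a sketch of what that could look like in a YAML pipeline. It assumes the AzurePowerShell task is used, that Microsoft's sample script is stored in the artifact as PrePostDeploymentScript.ps1 and has been extended with the extra -Subscription parameter described above, and that the subscription ID comes from a variable group as $(SubscriptionId); the paths, the ServiceConnection parameter and the other argument names (taken from Microsoft's sample) may differ in your setup.

###################################
# Pre-deployment script (stop ADF triggers)
###################################
- task: AzurePowerShell@5
  displayName: 'Run pre-deployment script'
  inputs:
    azureSubscription: '${{ parameters.ServiceConnection }}'  # assumed parameter name
    azurePowerShellVersion: 'LatestVersion'
    ScriptType: 'FilePath'
    ScriptPath: '$(Pipeline.Workspace)/ArmTemplates/PrePostDeploymentScript.ps1'  # assumed artifact location
    ScriptArguments: >-
      -armTemplate '$(Pipeline.Workspace)/ArmTemplates/ARMTemplateForFactory.json'
      -ResourceGroupName 'RG_ADF_PRD'
      -DataFactoryName 'DataFactory2-PRD'
      -predeployment $true
      -deleteDeployment $false
      -Subscription '$(SubscriptionId)'
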
Conclusion
In this post you learned how to fix the ResourceGroupNotFound error during the Pre and Post deployment script execution. The best/safest solution is to minimize access for each Service Principal; the workaround is to add two lines of code to Microsoft's example script.

In a next post we will describe the entire Data Factory ARM deployment where you don't need to hit that annoying Publish button within the Data Factory GUI. Everything (CI and CD) will be a YAML pipeline.

Thanks to colleague Roelof Jonkers for helping.

Friday 27 April 2018

Runbook with ADF: Method not found

Case
A little over two weeks ago we created a new Azure Automation account with a PowerShell runbook to pause and resume the ADF Integration Runtime, but got a strange error when running the script:
Method not found

Get-AzureRmDataFactoryV2IntegrationRuntime: Method not found: 'Newtonsoft.Json.Serialization.IAtrributeProvider Newtonsoft.Json.Serialization.JsonProperty.get_Attribute()'.

Of course the script did work before so what had changed since the last time?

Solution
One of the steps to get the pause and resume script working is to add assemblies for Azure Data Factory (AzureRM.DataFactoryV2) and Azure Profile (AzureRM.profile). Both were updated on April 10: the ADF assembly from 0.5.2 to 0.5.3 and Profile from 4.5.0 to 4.6.0.

Apparently there is a bug: The updated AzureRM.profile assembly requires a newer version of Newtonsoft.Json which is not yet available on Azure Automation. This means the script will work on your local machine, but not yet in Azure Automation. Microsoft is working on an update.

Meanwhile you can use the following temporary workaround: use the previous version of both assemblies. First remove the two new assemblies from your Azure Automation account and then go to AzureRM.profile. Scroll down to the Version History list and click on the previous version. Then hit the Deploy to Azure Automation button.
Deploy to Azure Automation

You will be redirected to the Azure portal where you must select the Automation account to deploy the previous version of the assembly. Select the correct Automation account and click on the OK button.
Select Automation account

Repeat this for AzureRM.DataFactoryV2 and then you are good to go.

Summary
There is a bug on which Microsoft is already working. In the meantime just downgrade to the previous versions or wait for an update.

Tuesday 28 March 2017

SSRS Snack: Excel Rendering Extension Error

Case
I am trying to export my SSRS report to Excel, but I'm getting an error. How can I solve it?
Excel Rendering Extension: Unknown image format image/x-png

Solution
You are using one or more PNG pictures in your reports and SSRS doesn't know how to render these because they have an unknown MIME type: image/x-png.

1) Solution Explorer
Go to your SSRS project in Visual Studio and search for PNG images in your Solution Explorer.
Solution Explorer

2) Properties
Now go to the properties of this image file (F4) and locate the property MIME Type.
image/x-png

3) Change MIME Type
Change the MIME Type from image/x-png to image/png and repeat this for all PNG images.
image/png

4) Deploy and test
Now deploy your changed project and reopen the report to test the Excel export.
Export to Excel