Monday, 5 July 2021

SQL Snack: Create readonly DB user in Azure SQL DB

Case
I want to create a SQL Server user in my Azure SQL Database that can only read tables and views from a specific schema, so I can use it, for example, to import data into Power BI.
Azure SQL Database user

Solution
Since Azure SQL DB has no GUI for this task, we have to write some (simple) T-SQL code. So log in to your Azure SQL DB with your favorite SQL editor, for example Azure Data Studio or the good old SQL Server Management Studio.

1) Create Login
The first step is to create a Login with a password. This is done on the master database.
-- Create Login on Master DB
CREATE LOGIN PowerBI
    WITH PASSWORD = 'B0bd4f211122cf9a03f64c81ba0f0d8f!';
Create Login on Master DB

2) Create User
The next step is to create the actual user connected to the Login from step 1. The username can differ from the Login name, but in most cases they will match. This step must be executed on the database where you want to give the user access.
-- Create user on DB
CREATE USER PowerBI
    FOR LOGIN PowerBI;
Create user for Login on DB (not master)

However, you can also execute it on the master database as well (with an extra default schema). This will save you from filling in the database name when you create the connection, which is especially annoying in SSMS where you have to set it on the Connection Properties tab.
-- Create user on Master DB (note the extra default schema)
CREATE USER PowerBI
    FOR LOGIN PowerBI
    WITH DEFAULT_SCHEMA=[dbo];
Create user for Login on Master DB as well

Connect to specific DB

3) Grant access to Schema
The last step is to grant SELECT access on a specific database schema, instead of using fixed database roles such as db_datareader.
-- Allow only SELECT queries on the DB schema 'DWH'
GRANT SELECT ON SCHEMA :: DWH 
    TO PowerBI
    WITH GRANT OPTION;
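Note that WITH GRANT OPTION also allows the PowerBI user to grant SELECT on this schema to others; leave it out if you want a strictly read-only account. To verify the permissions you can impersonate the new user and run a quick test. Below is a minimal sketch that assumes a view named DWH.SalesReport exists in your database; replace it with one of your own objects.
-- Test the permissions by impersonating the new user
EXECUTE AS USER = 'PowerBI';

-- Allowed: reading from the DWH schema
SELECT TOP 10 * FROM DWH.SalesReport;

-- Not allowed: writing or reading outside the DWH schema (these will fail)
-- INSERT INTO DWH.SalesReport DEFAULT VALUES;
-- SELECT TOP 10 * FROM dbo.SomeOtherTable;

-- Switch back to your own account
REVERT;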

Conclusion
In this post you learned how to add a SQL Server user that can only execute SELECT queries on tables and views in a specific database schema. Super handy when you have created some views for your dataset in Power BI, but you don't want to expose all tables and views in Power BI.

Wednesday, 30 June 2021

Introduction to Azure DevOps for Developers

Case
I want to store, build and release my code after development. How does this, and more, come together in Azure DevOps?

Solution

First it is important to agree on a way of working, regardless of the tools that can help, especially when you are part of a team. Consider, for example, the use of version control (and if so, which branching strategy), test and deployment moments, code reviews and the registration of activities.

For this solution, we distinguish the following activities for supporting and implementing a way of working:
  1. Configure Git
  2. Define a branching strategy
  3. Define user stories 
  4. Create and approve code reviews
  5. Automatically build the code
  6. Automatically release the code
  7. Support testing
  8. Capture release to production
How can we translate this into the use of the services in Azure DevOps? 
Well, let's have a look at the picture below.


Azure DevOps - Using Pipelines, Repos, Boards and Test Plans

In this blog post we will go deeper into each of these activities and show which service can be used for it.

Repos
This service enables you to manage, monitor, build, commit, verify and push your project code including all the other information about your project.

1) Configure Git
Configure Git for version control and code collaboration. This includes everything from creating a repository and determining the collaboration branch to configuring branch policies, such as a minimum number of reviewers or requiring that code comments are resolved.

2) Define a branching strategy
Define a branching strategy including "collaboration" and "feature" branches. This is a more detailed step than only determining the collaboration branch; it also covers, for instance, how to handle "hotfixes". More information about the strategy around Git branching can be found here.
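As an illustration, a typical flow for a new feature could look like the sketch below; the branch names are just examples, not a prescribed convention.
# Create a feature branch from the collaboration branch
git checkout main
git pull
git checkout -b feature/1234-add-sales-report

# Commit your work and push the feature branch
git add .
git commit -m "Add sales report"
git push -u origin feature/1234-add-sales-report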

4) Create and approve code reviews
Create and approve code reviews using Pull Requests. This ensures a four-eyes principle. More information about Pull Requests can be found here.
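Pull Requests are usually created in the Azure DevOps portal, but as a sketch: with the Azure DevOps CLI extension (az extension add --name azure-devops) you could also create one from the command line. The repository, branch names and title below are placeholders.
# Create a Pull Request from the feature branch to the collaboration branch
az repos pr create --repository MyRepo --source-branch feature/1234-add-sales-report --target-branch main --title "Add sales report"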

8) Capture release to production
Create a Tag inside the Git repository after a successful release to production. This helps you to check the current or past production version at any time. It can happen, for example, that a stakeholder has other work to do and, even after testing, comes back with questions about functionality that was released in a previous sprint while a new release has already taken place.
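Creating and pushing such a tag takes only two Git commands; the version number below is just an example.
# Tag the released commit and push the tag to Azure Repos
git tag -a v1.2.0 -m "Release to production - sprint 12"
git push origin v1.2.0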

Boards
This service manages your software projects and makes use of well-known frameworks like "Scrum" and "Kanban", which are supported by default.

3) Define user stories
Define user stories to describe the activities and link those user stories to the code. You link user stories to features and epics to group work.

Pipelines
This service will bring together Continuous Integration (CI) and Continuous Delivery (CD) of your code and application using pipelines.

5) Automatically build the code
One of the most important things is working code. In a technical way, we say that your code should "compile" or "build" successfully. In addition to this, you can also write unit tests and add those to your build pipeline. With this, you do not have to worry about running them manually or forgetting a test before your code is merged into the collaboration branch.

In a separate blog post we will show you how to set up a basic build.
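To give a first impression, a minimal YAML build pipeline could look something like the sketch below. The solution name and the .NET tasks are placeholders; adjust them to your own technology stack.
# azure-pipelines.yml - minimal build sketch (placeholder names)
trigger:
  branches:
    include:
      - main            # run the build when the collaboration branch changes

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: dotnet build MySolution.sln --configuration Release
    displayName: 'Build solution'

  - script: dotnet test MySolution.sln --configuration Release
    displayName: 'Run unit tests'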

6) Automatically release the code
After a successful build, you want to release your code to the next environment. With Releases you can set up a pipeline to automate this process.

In a separate blog post we will show you how to set up a basic release.

7) Support testing
Besides unit testing, which can be part of both build and release pipelines, you can also add functional testing, for example a Business Acceptance Test (BAT) when you are releasing to the acceptance environment. You can run the test cases that you have stored in Azure Test Plans manually, or automate them as part of the release. More information about Test Plans can be found here.

Conclusion
In this post you saw a high-level overview of how you can use Azure DevOps as part of your development process and how it can help to deliver your software (in an automated way), including testing. With all those different services and options it is a lot, and it will take some time and an initial investment, but it pays off in the long run. That is why it is important, apart from Azure DevOps, to pay attention to which way of working suits your team best.

In upcoming blog posts we will go deeper into specific services in Azure DevOps, such as the pipelines that contain a build and a release.

Sunday, 6 June 2021

Analyze Azure Data Factory logs - part 1: setup

Case
Azure Data Factory has a complete monitor that logs all details, but besides a simple filter it has no customization options, and we don't want to add old-fashioned custom logging with stored procedures to each pipeline. Is there a better alternative?
Azure Data Factory & Azure Log Analytics

Solution
You can use an Azure Log Analytics Workspace to store and analyze the logs of ADF. In the Azure Marketplace you can find some standard dashboards to monitor and manage your Data Factory, but you can also write custom queries with the Kusto Query Language (KQL). And even better: use those custom queries in a Power BI report to create a (semi) live dashboard on your ADF logs.

In the first part of this series of blog posts we will focus on setting up the Log Analytics Workspace and adding the standard dashboards from the Azure Marketplace. Prerequisite: an Azure Data Factory with a couple of pipelines that can run to generate some log data.
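As a small preview of such a custom query, the sketch below lists the failed pipeline runs of the last 24 hours. It assumes the logs are sent to the default AzureDiagnostics table; the exact table and column names can differ if you chose resource-specific tables.
// Failed ADF pipeline runs in the last 24 hours (sketch)
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DATAFACTORY"
| where Category == "PipelineRuns"
| where status_s == "Failed"
| where TimeGenerated > ago(24h)
| project TimeGenerated, Resource, pipelineName_s, status_s
| order by TimeGenerated desc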

1) Create Log Analytics Workspace
For this example we will use a free trial workspace where the retention is limited to 7 days and daily data ingestion is capped at 500 MB. Perfect for a first introduction to Log Analytics.
  • Go to the Azure Portal and create a new Log Analytics workspace with a descriptive name and in the same region as your Data Factory.
  • On the Pricing tier tab choose Free and then hit the Review + Create button.
Create Log Analytics Workspace for ADF

2) Add Azure Data Factory Analytics from the Azure Marketplace
Next step is to add Azure Data Factory Analytics from the Azure Marketplace. Note that at the time of writing it is still in preview.
  • Go to the URL and click on the Get It Now button.
  • Select your Log Analytics Workspace and click on Create.
Add ADF Analytics from Azure Market Place

3) Connect ADF to Log Analytics Workspace
Now we need to tell your Data Factory to send its logs to the new Log Analytics Workspace.
  • Go to the ADF Overview page in the Azure Portal and click on Diagnostic settings in the left menu
  • Click on + Add diagnostic settings
  • Give the diagnostic setting a name because you could create multiple ones: ADFDiagnostics
  • Now first choose what you want to log. The first three are for ADF Activities, Pipelines and Triggers. The second group of items that start with Sandbox are the debug runs and the third group of items are for SSIS. For this example we will only select the first three.
  • Secondly choose which metrics you want to log: AllMetrics
  • And thirdly choose the destinations: Send to Log Analytics Workspace
  • Now select your workspace and click on Save.
Connect ADF to the Analytics Workspace
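If you prefer scripting over clicking, roughly the same diagnostic setting can also be created with the Azure CLI. The sketch below uses placeholder resource IDs; double-check the category names against the Diagnostic settings page of your own Data Factory.
# Send ADF pipeline, activity and trigger logs to the Log Analytics workspace (sketch)
az monitor diagnostic-settings create \
  --name ADFDiagnostics \
  --resource "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.DataFactory/factories/<adfName>" \
  --workspace "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspaceName>" \
  --logs '[{"category":"PipelineRuns","enabled":true},{"category":"ActivityRuns","enabled":true},{"category":"TriggerRuns","enabled":true}]' \
  --metrics '[{"category":"AllMetrics","enabled":true}]'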

4) See standard dashboards
Now let's see the standard ADF dashboard in Azure Log Analytics. First make sure you run a couple of ADF pipelines so that there is actually something to see on the dashboard, because old runs won't be available. (Note: it could take a couple of minutes before the first data appears in Log Analytics.)

In Log Analytics the dashboard is called a workbook. You can either go directly to all the workbooks and then search for "AzureDataFactoryAnalytics", or first go to Solutions, then to the ADF solution and after that to its workbooks. The second way takes a few more clicks, but the list of workbooks will be smaller.
  • Go to Azure Log Analytics and click on Workbooks in the left menu
  • In the list of public templates scroll down and click on AzureDataFactoryAnalytics
  • When opened scroll down to see all visuals.
  • After the first look you could hit the edit button and start customizing.
AzureDataFactoryAnalytics

ADF dashboard in Log Analytics

Conclusion
In this post you learned how to set up Log Analytics for ADF. Now you have the default workbook (template) which you can adjust to your own needs, and you can also see which KQL queries Microsoft wrote to create those visuals. In the next blog post we will explain the basics of KQL and show you how you can use those queries in Power BI to create an ADF dashboard. This could save you from building custom log mechanisms, because most information is already available in the ADF logs.

Monday, 31 May 2021

Power Apps Snack: Add an Attachments control

Case
I have a SQL Server table with a varbinary field that I want to populate via a Power Apps form, but there seems to be no control for it! In the Media menu of the Power Apps editor you can find a control to add an image, but it is limited to picture files such as jpg, png, gif and bmp. Is there a control to add other types of files, such as PDF files?
The 'hidden' attachments control

Solution
Yes, there is a 'hidden' attachments control that we can use via a little workaround (thanks to colleague Dustin Felipa). The attachments control is only available for SharePoint lists, so the trick is to create a temporary screen and form connected to a SharePoint list. After that you can copy and paste the attachments control to your own screen (and then remove the SharePoint screen and connection).

There is a small downside: we didn't get it to work within a form with the SubmitForm function yet, so you have to use the Patch function instead.

1) Create a new screen
Within your app, where you need the attachments control, create a new empty screen. This will allow us to copy the attachments control from this dummy screen to the real screen later on.
Add temporary new screen

2) Create new data source
Create a new connection to a SharePoint site and then choose a list within this site. It must be a list and not a document library, otherwise the control won't appear in the next step.
SharePoint list

3) Create form on SharePoint list
On the new screen add an edit form and use the new SharePoint list as the data source. Now you can add fields to the edit form. Make sure to select at least the field with the paper clip icon in front of it.
Add attachment field

Now you have a form with a working attachments control. Note that it is a collection of multiple attachments, but you can set the max number of attachments to 1.
Working attachment control

4) Copy and paste
Now copy the control (not the entire datacard) and paste it on your own screen. I used an empty screen for this example. Now you can hit the play button to test the control.
Playing with the attachment control

5) Code
As mentioned before, I didn't get it to work within a form in combination with the SubmitForm function, but you can use the Patch function. Since the attachments are a collection, you either need to pick the first item with First or use a ForAll loop to repeat the Patch for every attachment.

// Get first: create one record with only the first attachment
Patch(
    '[dbo].[myAttachments]',
    Defaults('[dbo].[myAttachments]'),
    {
        Attachment: First(AttachmentBox.Attachments).Value,
        Filename: First(AttachmentBox.Attachments).Name
    }
);

// Get all: create a record for every attachment
ForAll(
    AttachmentBox.Attachments,
    Patch(
        '[dbo].[myAttachments]',
        Defaults('[dbo].[myAttachments]'),
        {
            Attachment: Value,
            Filename: Name
        }
    )
)

Now the files are in my table where the Attachment column has the Varbinary(max) datatype.
Now the files are in my attachments table

Conclusion
In this post you learned how to add an attachment control to your app. I hope/suspect that Microsoft will make it available for more data sources because this workaround is a little cumbersome. In a next post we will show you how to use this same control to send emails with attachments from within Power Apps. This requires an additional rename step.

Friday, 2 April 2021

ADF Snack - hide activity input and output from log

Case
I'm retrieving sensitive secrets from Azure Key Vault in an Azure Data Factory pipeline and using those values to call another service. Both the output and the input with the secret value are shown in the Data Factory log. Is there a way to prevent logging sensitive inputs or outputs in ADF?
Secret unintentionally shown in log

Solution
The solution is very simple. Almost all activities have a Secure input and a Secure output option on the General tab. When you check the Secure input box none of the input will show up in the logs, and when you check the Secure output box none of the output will be logged. This is very useful for not logging passwords and other secrets used in your pipeline.
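Behind the scenes these two checkboxes simply end up in the JSON definition of the activity. A simplified sketch of what that could look like for a Web activity that retrieves a secret from Key Vault (the names and URL are placeholders):
{
    "name": "Get Secret from Key Vault",
    "type": "WebActivity",
    "policy": {
        "secureInput": true,
        "secureOutput": true
    },
    "typeProperties": {
        "url": "https://myvault.vault.azure.net/secrets/mysecret?api-version=7.0",
        "method": "GET",
        "authentication": {
            "type": "MSI",
            "resource": "https://vault.azure.net"
        }
    }
}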

Now we run the same pipeline again with the Secure output option turned on, and the secret is no longer visible in the log.
No more output

And you can of course still use the output in a subsequent activity with an expression; just be sure to hide the input of that activity as well. Note that some activities, like Set variable, don't have the options to hide input and output.
You can still use hidden output
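For example, a subsequent activity could reference the secret returned by the Key Vault Web activity with an expression like this (the activity name is a placeholder):
@activity('Get Secret from Key Vault').output.value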

Summary
In this post you learned how to hide sensitive input and output from activities so that they don't show up in the Azure Data Factory logging. The Key Vault example used in this post is explained here.