
Tuesday 9 June 2020

Create and deploy a C# Azure Function

Case
I want to create an Azure Function with C# code. How do I create and deploy one in Azure (and use it in Azure Data Factory)?
Write C# in Visual Studio Code to create an Azure Function

Solution
In this blogpost we will create and deploy a very simple 'hello world' Azure Function with an HTTP trigger, which you can extend to your own needs. After that we can execute it from an Azure Data Factory pipeline with an Azure Function activity.

As an alternative you could also create an Azure Function with a Blob Storage trigger that executes when a new file arrives, but we would rather use that same trigger type to start an Azure Data Factory pipeline that then starts this Function followed by other pipeline activities. This way there is one place that does the triggering/orchestration.
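
For reference, a Blob Storage triggered function only differs in its trigger attribute. Below is a minimal sketch, assuming a container named 'input' and the default AzureWebJobsStorage connection setting (both names are just for illustration); it uses the same usings as the HTTP example further down:

// Minimal sketch of a Blob Storage trigger (container name 'input' and
// the connection setting name are assumptions for illustration)
[FunctionName("myBlobFunction")]
public static void Run(
    [BlobTrigger("input/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name,
    ILogger log)
{
    // Executes once for every new or updated blob in the 'input' container
    log.LogInformation($"Blob trigger processed blob: {name} ({myBlob.Length} bytes)");
}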

1) Create new Azure Function project
Please first follow the steps of our previous post on how to prepare Visual Studio Code for creating Azure Functions with C#. After that, open Visual Studio Code and perform the steps below to create your first hello world Function.
  • In Visual Studio Code click on the Azure icon in the left menu.
  • In the newly opened pane click on the folder with the lightning icon to create a new project. (An additional function can be added to the project later on with the lightning-plus icon)
  • Select the folder of the new project (or use the Browse... option)
  • Next select C# as coding language
  • Select HTTP trigger as the template for this example
  • Enter the Function name. This is the name of the function within the project (that can contain multiple functions). Example: myCSharpFunction
  • Provide a namespace: Bitools.Function
  • For this test example use anonymous as Authorization level
  • The project has been created, but there could be an additional action in step 2
Create new Azure Function project

2) Unresolved dependencies
This extra step seems to be a bug in the Azure Functions extension for C# in Visual Studio Code. After the project has been generated, Visual Studio Code will show the following error in the lower right corner, which seems to refer to some missing references.
There are unresolved dependencies.
Please execute the restore command to continue.

If you don't get this error then Microsoft has probably solved the bug. When you do get it, the only thing you have to do is click the Restore button. After that, some extra files will be added to the obj folder of your project (see the previous post to compare extension versions).
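
Behind the scenes this restores the NuGet packages of the project. If the notification already disappeared, you could also run the restore manually from the integrated terminal:
dotnet restore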
Execute the restore command

Spot the differences

3) Code in myCSharpFunction.cs
The file 'myCSharpFunction.cs' contains your C# code. The name could be different if you gave your function a different name. Below you see the standard generated code with some extra comment lines for those who are new to C#. For this example we do not extend the code.
// This section lists the namespaces that this function uses frequently,
// saving the programmer from specifying a fully qualified name every time
// a method contained within them is used
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace Bitools.Function
{
    public static class myCSharpFunction
    {
        // Main function and entry point of this Azure Function
        [FunctionName("myCSharpFunction")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            // Log information
            log.LogInformation("C# HTTP trigger function processed a request.");

            // Retrieve parameter 'name' from querystring
            string name = req.Query["name"];

            // Also try to retrieve the same parameter from the request body
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            // If not found in the querystring then use the request body
            name = name ?? data?.name;

            // If name is still empty return a message that a name parameter
            // is expected, else respond with a greeting
            string responseMessage = string.IsNullOrEmpty(name)
                ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
                : $"Hello, {name}. This HTTP triggered function executed successfully.";

            return new OkObjectResult(responseMessage);
        }
    }
}

4) Debug locally
Now we are going to test the Azure Function locally on our Windows device. There are multiple ways to start debugging; pressing F5 is probably the easiest. See the animated gif for more options.
  • In the Run menu on the top of the screen you will find the Start Debugging option. 
  • The terminal pane at the bottom will show a lot of details. Wait a few seconds for it to finish and then CTRL+click on the green URL.
  • A new browser window will open showing a message that it expects a name parameter.
  • In the browser add a querystring after the URL: ?name=Joost (or your own name of course). Now it will respond with a greeting.
  • Close the browser and then hit the disconnect icon on top to stop debugging.
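
By default the Azure Functions Core Tools host your function locally on port 7071, so the URL including the querystring will look something like this (the last part depends on the function name you chose in step 1):
http://localhost:7071/api/myCSharpFunction?name=Joost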
Debugging your function locally

5) Create Azure Function in Azure Portal
Before you can deploy your newly created function you first need to create an Azure Function in the Azure portal.
  • Go to the Azure Portal and click on Create a resource
  • Search for function and select Function App
  • Click on the Create button
  • On the Basics tab you find the most important settings
  • Select your Subscription and Resource Group
  • Enter a unique Function App name
  • Select .NET Core as Runtime stack
  • Select 3.1 as Version 
  • Select the Region (probably the same as your Resource Group)
  • Optionally go to the Hosting tab for extra settings
  • Choose a new or existing Storage account
  • Change the Plan type (default: Serverless)
  • Optionally go to the Monitoring tab for extra settings
  • Disable or enable Application insights and change its name
  • Click the Review + create button
  • Review the settings and click on the Create button
Create new Azure Function (app) on Azure portal

Note 1: you cannot create an Azure Function with a Windows worker (.NET Core) if there is already a Linux worker (Python) in that same resource group and with the same region.

Note 2: you could also perform these steps within Visual Studio Code during deployment.
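
Note 3: if you prefer scripting over clicking, a Function App could also be created with the Azure CLI. A minimal sketch, assuming the resource group 'myRG' and storage account 'mystorage' already exist:
az functionapp create --resource-group myRG --name myFunctionApp --storage-account mystorage --consumption-plan-location westeurope --runtime dotnet --functions-version 3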

6) Deploy Azure Function to Azure Portal
Now that we have an (empty) Azure Functions app in the Azure portal we can deploy our newly created Azure Function to this resource.
  • In Visual Studio Code click on the Azure icon in the left menu.
  • In the newly opened pane click on the blue arrow (deploy) icon
  • In the drop down select your Azure Functions App from the previous step
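
As an alternative to the deploy button you could also publish from the command line with the Azure Functions Core Tools (assuming the app name from the previous step):
func azure functionapp publish myFunctionApp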
Deploy Azure Functions from Visual Studio Code

7) Testing in portal
Now that we have deployed our project to Azure Functions we can test it in the Azure Portal. For this example we will use the POST method.
  • Go to the Azure Portal and then open your Azure Functions App
  • In the left menu click on Functions
  • In the list of functions click on your function (only one in this example)
  • In the left menu click on Code + Test
  • Click on the test button (top center)
  • Change the HTTP method to POST
  • Select one of the keys
  • Enter a JSON message in the body: {"name":"Joost"} (the name property is case-sensitive)
  • Click on the Run button and see the result
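
Alternatively you can test the deployed function outside the portal, for example with curl (the URL and function key below are placeholders; with the anonymous authorization level from step 1 the code parameter could even be omitted):
curl -X POST "https://myfunctionapp.azurewebsites.net/api/myCSharpFunction?code=<function key>" -H "Content-Type: application/json" -d "{\"name\":\"Joost\"}"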
Testing in the Azure Portal

8) Executing from Azure Data Factory
Now if you want to execute this new Azure Function in Azure Data Factory with the Azure Function activity, you can follow the steps in this previous post. However, without code changes it will return an error stating that the response is invalid: 3603 - Response Content is not a valid JObject.
3603 - Response Content is not a valid JObject

At the moment it is returning a so-called JArray, but the activity expects a JObject (J = JSON). Any return type other than a JObject will throw the error above. To overcome this we need a minor code change: replacing the return construction at the end of the function.
// This section lists the namespaces that this function uses frequently,
// saving the programmer from specifying a fully qualified name every time
// a method contained within them is used
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace Bitools.Function
{
    public static class myCSharpFunction
    {
        // Main function and entry point of this Azure Function
        [FunctionName("myCSharpFunction")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            // Log information
            log.LogInformation("C# HTTP trigger function processed a request.");

            // Retrieve parameter 'name' from querystring
            string name = req.Query["name"];

            // Also try to retrieve the same parameter from the request body
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            // If name is still empty return a bad request that a name
            // parameter is expected, else respond with a greeting
            return name != null
                ? (ActionResult)new OkObjectResult(new {message = "Hello " + name})
                : new BadRequestObjectResult("Pass a name in the query string or in the request body for a personalized response.");

            /* 
            string responseMessage = string.IsNullOrEmpty(name)
                ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
                : $"Hello, {name}. This HTTP triggered function executed successfully.";

            return new OkObjectResult(responseMessage);
            */
        }
    }
}
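
With this change the function returns a JObject such as {"message":"Hello Joost"}, which the Azure Function activity in Azure Data Factory accepts.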

Below are a couple of screenshots on how to configure and test this in Azure Data Factory. You might want to store the Function key in Azure Key Vault to avoid keys in your ETL/ELT code.
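
For reference, the Body property of the Azure Function activity can contain the same JSON message as in step 7 (the name property is case-sensitive), with the Method set to POST:
{"name":"Joost"}
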
Set up the Azure Function Activity in Azure Data Factory

After configuring the Azure Function activity you can hit the debug button and see the result. This output could then be used as input for subsequent pipeline activities.
Successfully executing and getting the response

Conclusion
First a big thank you to colleague Walter ter Maten for always helping me out with the C# stuff. In this blog post you learned how to create, test and deploy your first (very basic) Azure Function App with C# code. Then we also showed you how to execute it from Azure Data Factory. In a couple of follow-up posts we will show you how to build some useful functions for DWH projects and some technical stuff like adding Azure Key Vault to the game. Also check out the Python version of this blogpost.


Thursday 21 May 2020

Setup Visual Studio code for Azure Functions

Case
I want to create Azure Functions on my Windows device, but which tools and extensions do I need to install?
Write Python or C# in Visual Studio Code to create Azure Functions

Solution
In this blogpost we will show you which tools you need to install to create an Azure Function with either .NET or Python code. The screenshots show the current versions at the time of writing, but you may just want to take the latest stable version when downloading. In the upcoming Azure Functions posts we will create some basic Hello World functions to show the basics of creating and deploying your first Function. After that we will show some more useful functions for Data Warehouse developers, for example to convert Excel or XML files to a format that is easier to read for Azure Data Factory or Synapse PolyBase: CSV.

1) Download and install Visual Studio Code
For this blog post we will be using Visual Studio Code instead of the regular Visual Studio. Where this 'regular' Visual Studio is a so-called Integrated Development Environment (IDE), the newer Visual Studio Code is more of a lightweight source code editor. Ideal for some coding with PowerShell, C# or Python. Use the link below to download Visual Studio Code and then install it.
https://code.visualstudio.com/download
Installing Visual Studio Code

2) Install extensions for Python
If you want to use Python for your Azure Functions you need to install Python for Windows and the Python extension for Visual Studio Code. First download and install Python for Windows 64-bit. The default is a 32-bit version, but you can also find the 64-bit version slightly down the page (search for Windows x86-64 executable installer).
https://www.python.org/downloads/windows/
Install Python for Windows 64-bit

Then install the Python extension for Visual Studio Code. When clicking on the install button on the website it will ask to open it with Visual Studio Code. Within Visual Studio Code you have to click on the install button again. After installation it will ask you to point to the previously installed Python interpreter.
https://marketplace.visualstudio.com/items?itemName=ms-python.python
Install Python extension for Visual Studio Code

Note: You can also use the Extensions icon in the left menu of Visual Studio Code to search for this specific extension.

There is one last installation required: the pylint linter. Visual Studio Code will ask for it when you create your first Azure Function with Python code and it shows the message: Linter pylint is not installed.
Install Linter pylint for Visual Studio Code

Installing Linter pylint within Visual Studio Code

3) Install extensions for C#
If you want to create Azure Functions with C# then you first need to install the .NET Core SDK. Make sure to install the version to Build apps (Run apps is not sufficient). The minimum version is .NET Framework 4.7.2 or .NET Core 2.2, but try the most recent version depending on your needs and the runtime version of Azure Functions. If you forget this step you will receive an error while trying to create a C# Function.
Error when the .NET Core SDK has not been installed

Furthermore you should install the C# extension from Microsoft. This extension is not mandatory, but it will be recommended when creating your first C# Azure Function.
Install C# extension for Visual Studio Code

Note: You can also use the Extensions icon in the left menu of Visual Studio Code to search for this specific extension.

4) Install extensions for Azure Functions
The next extension is Azure Functions for Visual Studio Code. When clicking on the install button on the website it will ask to open it in Visual Studio Code. Within Visual Studio Code you have to click on the install button again.
https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions
Azure Functions for Visual Studio Code

Note: You can also use the Extensions icon in the left menu of Visual Studio Code to search for this specific extension.

5) Install Azure Functions Core Tools
To be able to debug the Azure Function code locally we need the Azure Functions Core Tools. To install those we first need the Node Package Manager (NPM), which is included with Node.js (more detailed info here).
https://nodejs.org/en/download/
Install NodeJS with NPM

The last step of this installation is to open a Command Prompt (or PowerShell prompt) in Administrator mode to install the Azure Functions Core Tools. With the command npm -v you can check your npm version. Now use the following command for the installation (more detailed info here):
npm i -g azure-functions-core-tools@3 --unsafe-perm true
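
Afterwards you can verify the installation of the Azure Functions Core Tools with:
func --version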
Install Azure Functions Core Tools via command prompt

Conclusion
In this introduction post you read which tools and extensions to install to create Azure Functions. Quite a lot of installations, but manageable when following the steps above. We focused on the most popular languages (in the DWH scene) C# and Python, but there are way more languages to choose from, like Java(Script) or PowerShell. Each with its own extensions.

As mentioned before the next post about Azure Functions will be about deploying your first simple function with Python or C#. After that we will focus on the more functional Azure Functions solutions, but with a focus on the DWH scene. Also bringing Azure Key Vault to the game is a must for Azure Functions.

Tuesday 21 February 2017

Use BIML and csv files to create tables

Case
The case is about importing flat files (CSVs) without the necessity of metadata. Because BIML always checks whether the tables are accessible before creating the packages, the first step is to create the tables with BIML and the second step is to create the SSIS packages for transporting the data.

  1. Creating tables in the database 
  2. Packages to fill this database

Because of the size of the solution I've created two separate solutions: one for creating the tables and a second for creating the SSIS packages. You can click on the link to go to the other solution (which I will publish later this month).


Solution - Creating tables with BIML
In this solution we create a BIML script that is going to create the tables, with all columns defined as strings.
We have to create two BIML scripts: the first one defines the table definitions and the second one creates the actual package.

1) CSV files
For this example we are using two CSV files (age and sickleave) which are comma separated and have column names on the first row. These column names will be used in the create table statement, as shown in the example below.
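
For example, the first lines of Age.csv could look like this (the data rows are made up for illustration; the first row contains the column names):
AgeID,AgeFrom,AgeTo,AgeCategoryEmployee,AgeCategoryClient
1,0,17,Youth,Youth
2,18,66,Adult,Adult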

The drop folder

Content of the CSV file

2) Table definitions
The first BIML script is called "1_TableDefinitions.biml".
In this script we define the path where the CSV files are located, an array with the names of the CSV files and also some strings which we are going to use further on in the code.
We use two foreach loops: the first one loops through the array with files and the second one loops through the actual file (to extract the column names).

Normally (without the loop) the code should look like this:

<Tables>
  <Table Name="Man_Age" SchemaName="dummy_STG.dbo">
    <Columns>
      <Column DataType="Int32" IdentityIncrement="1" IdentitySeed="1" Name="AgeID" />
      <Column DataType="Int32" Name="AgeFrom" />
      <Column Name="AgeTo" />
      <Column DataType="String" Length="255" Name="AgeCategoryEmployee" />
    </Columns>
  </Table>
</Tables>

By default BIML uses Int32 as the datatype; in this case we use a string.
Now we add the loop in place and the complete code looks like this:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">

<#  
    string Prefix="Man";
    
    // the location of the CSV files
    string path = @"D:\Drop\Man";
    // Put all the filenames with the extension csv in a string array
    string[] myFiles = Directory.GetFiles(path, "*.csv");
    // string that will be filled with the filename
    string fileName;
    // string array for column names extracted from the CSV
    string[] myColumns;
#>
        <Connections>
            <OleDbConnection 
            Name="OLEDB_STG_<#=Prefix#>" 
            ConnectionString="Data Source=APPL43;Initial Catalog=dummy_STG;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;">
            </OleDbConnection>
        </Connections>
        <Databases>
             <Database ConnectionName="OLEDB_STG_<#=Prefix#>" Name="dummy_STG"/>
        </Databases>
       <Schemas>
              <Schema Name="dbo" DatabaseName="dummy_STG" Owner="dbo"/>
       </Schemas>
       <Tables>
            <!-- loop through the array of files -->
            <#  foreach(string filePath in myFiles) 
                {
               // extract the filename from the path to use as the table name
               fileName = Path.GetFileNameWithoutExtension(filePath);
            
            #>
            <Table Name="<#=Prefix#>_<#=fileName#>" SchemaName="dummy_STG.dbo">
                <Columns>
                   <!-- loop through the file looking for the column names -->
                    <#
                    // read the first row of the csv to extract the column names
                    // and split on comma to create an array
                    StreamReader myFile = new StreamReader(filePath);

                    myColumns = myFile.ReadLine().Split(',');
                    // close file after reading first line
                    myFile.Close();

                    // Loop through column array
                    foreach(string myColumn in myColumns) 
                    {  
                    #>
                      <Column Name="<#=myColumn#>" DataType="String" Length="255"></Column>
                 <#  }   #>
                </Columns>    
            </Table>
            <# }#>
       </Tables>
</Biml>
 
<#@ template language="C#" hostspecific="true"#>
<#@ import namespace="System.IO"#>


3) CreateTables
Secondly we are going to create the BIML script called "2_CreateTables.biml", which creates the actual package that contains the create statements to generate the tables.
BIML has a method to create SQL tables: "RootNode.Tables.First().GetTableSql();"
We use this method to create the SQL create statement of each table.

The code looks like this:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="CreateTables" AutoCreateConfigurationsType="None" ConstraintMode="Linear">
            <Tasks>
                <# 
                // Loop through the table definitions of the first BIML script
                foreach(var table in RootNode.Tables) {#>
                <ExecuteSQL Name="SQL - Drop_Create <#=table.Name#>" ConnectionName="<#=table.Connection.Name#>">
                    <DirectInput>
                        <#=table.GetTableSql()#>    
                    </DirectInput>
                </ExecuteSQL>
                <# } #>
            </Tasks>
        </Package>
    </Packages>
</Biml>

<!--Includes/Imports for C#-->
<#@ template language="C#" hostspecific="true"#>
<#@ import namespace="System.Data"#>
<#@ import namespace="System.Data.SqlClient"#>


We’ve created 2 bimls, 1_TableDefinitions.biml and 2_CreateTables.biml. Now comes the important part (I’m using Biml Express) for generating the package. First we click on 1_TableDefinitions and secondly on and 2_CreateTables, if you have selected the 2 biml scripts  you click with your right mouse on 1_TableDefinitions.biml and generate SSIS packages. If you do this otherwise, you will get an empty SSIS package. .

Generate SSIS package

Below you can see the result of your BIML scripts: a package with an Execute SQL Task for each table you need to create.
Visual Studio

The actual create statement looks like this:

SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO
-------------------------------------------------------------------
IF EXISTS (SELECT * from sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Man_Age]') AND type IN (N'U'))
DROP TABLE [dbo].[Man_Age]
GO

CREATE TABLE [dbo].[Man_Age]
(
-- Columns Definition
 [AgeID] nvarchar(255) NOT NULL 
, [AgeFrom] nvarchar(255) NOT NULL 
, [AgeTo] nvarchar(255) NOT NULL 
, [AgeCategoryEmployee] nvarchar(255) NOT NULL 
, [AgeCategoryClient] nvarchar(255) NOT NULL 

-- Constraints

)
ON "default"
WITH (DATA_COMPRESSION = NONE)
GO
-------------------------------------------------------------------


Summary
We created two BIML scripts: one for creating the table definitions and one which creates the actual packages. The result in Management Studio looks like this.

Management Studio result

In the next blog I'm going to explain how to create the SSIS packages that transport the data from the CSV files.

Sunday 24 July 2016

IoT Adventure: 4 - Sending sensor data to IoT Hub

Case
I want to send sensor data from my .NET application to Azure. How do I do that?

Solution
To send sensor data from our .NET application we will use JSON messages and send them to the Azure IoT Hub. After that, other Azure parts like Stream Analytics can 'subscribe' to the IoT Hub to actually do something with the sensor data. But that will be described in a next blogpost.
The first step is to set up an IoT Hub to receive messages. The second step is to adjust our .NET application to send those messages.

1) Create Resource Group
To create an IoT Hub we first need a Resource Group to store the data from the IoT Hub. You can skip this step by creating a Resource Group within the IoT Hub form or if you already have one in your subscription.
Go to New, Management and then Resource Group. The most important option is the Resource group location. You can't combine Azure items from different resource groups. Because we live in the Netherlands, West Europe is the most obvious location for us to choose. That way the data stays in the Netherlands.
Azure Portal - Adding Resource Group

2) IoT Hub
Now the actual IoT Hub. Go to New, Internet of Things and then IoT Hub. For this example we are using the F1 scale tier which is free of charge, but has a limit of 8,000 messages a day. So sending each second isn't possible (that would be 86,400 messages a day), but once a minute (1,440 a day) is no problem. The size of the messages is limited as well: 0.5 KB for F1 and 4 KB for the other editions.
Azure Portal - Adding IoT Hub

3) Consumer groups
We need to create consumer groups. We will create one for the hot path called 'PowerBi' and one for the cold path called 'AzureDB'. Stream Analytics will use these consumer groups to run the queries. Using multiple consumer groups makes it possible for several consumer applications to read data from this IoT Hub independently. If you would use only one consumer group then one consumer application would retain the lease and the others would lose the connection.
Go to the newly created IoT Hub. Click on 'All Settings', 'Messaging' and then scroll down to the consumer groups. Add two groups: PowerBi and AzureDB and click on save.
Azure Portal - Adding Consumer Groups

4) IoT Hub Connection String
The next step is to create the Connection String which we need in our code to send JSON messages to this IoT Hub. First click on the Key icon on the IoT Hub dashboard. Then in the Shared access policies window click on 'iothubowner'. Now copy the 'Connection string - primary key' for the next step.
Azure Portal - Connection string - primary key

5) Device Explorer - Connection String
With the Device Explorer tool we are going to convert the connection string from the previous step to something we can use in our code.
  • Start up Device Explorer (see prerequisites) and paste the connection string on the Configuration tab under IoT Hub Connection String and click on Update.
  • Next go to the Management tab and click on Create. Enter the unique name (or id) of your device and click on the Create button. A new line will be added in the Device Grid.
  • Right click it and choose 'Copy connection string for selected device'. This is the string we need in our code.
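
The copied device connection string has the following format, which we will use as the azureConnectionString constant in the code of the next step:
HostName=<your iot hub>.azure-devices.net;DeviceId=<your device>;SharedAccessKey=<key>
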
6) Adjust .NET code
Go to your .NET project and use this connection string to send messages to the IoT Hub. If you are using our project then here is what you need to change:
Our Sensory project

Conclusion
Not a very exciting and visible step, but very necessary for your IoT project. Moneywise the chosen tier is very important, but the consumer groups are also essential if you have multiple consuming applications. Also read Azure Event Hub vs IoT Hub to check the differences between those two hubs.

Sunday 17 July 2016

IoT Adventure: 3 - Create Visual Studio Project for sensors

Case
I have installed all the prerequisites, now I want to start a new Visual Studio project. Which project do I choose and which assemblies do I need to reference for an IoT project?

Solution
Below you find the basic code you need to send sensor data to the Azure IoT Hub. The specific sensor code differs per sensor and will be posted in separate posts. Make sure to first install all prerequisites.

1) Blank App (Universal Windows)
Open Visual Studio 2015 and create a new project. We will be using C# as language. Under C#, Windows you will find the 'Blank App (Universal Windows)' project. Supply a ProjectName and Solution Name (also notice the checkbox for Source Control). Our project (and solution) is called Sensory.
Blank App (Universal Windows)

When you create such a project it will ask which version of Windows you want to use. Click on the help link if you're not sure.
Windows version

2) Source Control
The next window will be about source control because of the checkbox in the previous step. We will be using Team Foundation Server (TFS) for source control. This is recommended especially when you work in a team. You can either use your local TFS machine or use TFS online at visualstudio.com. It's also possible to use third party source control software.
TFS online

3) Reference to Windows IoT Extensions for the UWP
A new blank project has been loaded. First we need a reference for IoT Extensions. Go to the Solution Explorer and right click on references and choose "Add Reference...". Then under Universal Windows / Extensions, locate "Windows IoT Extensions for the UWP". Sometimes you will find multiple versions. Make sure to choose the right one. We need version 10.0.10586.0.
Windows IoT Extensions for the UWP

4) Add NuGet Packages for Azure
Because we want to connect to Azure, we need to add a NuGet for Microsoft.Azure.Devices.Client.PCL. Right click the project (not the solution) and choose "Manage NuGet Packages". Then go to Browse and search for "Microsoft.Azure.Devices.Client.PCL" and click on the Install button. When you're ready, a new reference appears in the Solution Explorer.
Add NuGet for Azure

5) SensorMessage Class
For this project we have created a SensorMessage class to store the sensor values from one measuring moment. Every x seconds/minutes we measure, for example, the temperature and the illumination in a room. We store this in a SensorMessage object, so that we are able to create a JSON message with these values and send this message to the Azure IoT Hub.

Right click the project and choose 'Add', 'New Item...' and in the new window choose Class. Give it the name SensorMessage. Copy and paste the code below into the new class. If you used a different project name then change the namespace from Sensory to your namespace name.
Add new class file

//C# Code
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Text;

namespace Sensory
{
    [DataContract]
    public class SensorMessage
    {
        #region Properties with get and/or set
        [DataMember]
        private string sensorName;
        public string SensorName
        {
            get { return sensorName; }
        }

        [DataMember]
        private DateTime measurementTime;
        public DateTime MeasurementTime
        {
            get { return measurementTime; } 
        }

        [DataMember]
        private decimal temperature;
        public decimal Temperature
        {
            get { return temperature; } 
            set { temperature = value; }
        }

        [DataMember]
        private decimal humidity;
        public decimal Humidity
        {
            get { return humidity; }
            set { humidity = value; }
        }

        [DataMember]
        private decimal pressure;
        public decimal Pressure
        {
            get { return pressure; }
            set { pressure = value; }
        }

        [DataMember]
        private decimal altitude;
        public decimal Altitude
        {
            get { return altitude; }
            set { altitude = value; }
        }

        [DataMember]
        private decimal decibel;
        public decimal Decibel
        {
            get { return decibel; }
            set { decibel = value; }
        }

        [DataMember]
        private int doorOpen;
        public int DoorOpen
        {
            get { return doorOpen; }
            set { doorOpen = value; }
        }

        [DataMember]
        private int motion;
        public int Motion
        {
            get { return motion; }
            set { motion = value; }
        }

        [DataMember]
        private int vibration;
        public int Vibration
        {
            get { return vibration; }
            set { vibration = value; }
        }

        [DataMember]
        private decimal illumination;
        public decimal Illumination
        {
            get { return illumination; }
            set { illumination = value; }
        }
        #endregion

        #region Constructor method
        /// <summary>
        /// Creates SensorMessage object with default values.
        /// Code should look something like:
        /// SensorMessage sensorMessage = new SensorMessage("mySensor");
        /// </summary>
        /// <param name="SensorName">The name or unique code of your sensor</param>
        public SensorMessage(string SensorName)
        {
            sensorName = SensorName;
            // measurementTime = DateTime.Now;
            // Fix for timezone / offset troubles
            // after serializing to JSON. Without
            // this it shows UTC time with offset
            measurementTime = DateTime.UtcNow.AddHours(TimeZone.CurrentTimeZone.GetUtcOffset(DateTime.Now).Hours);
            temperature = 0;
            humidity = 0;
            pressure = 0;
            altitude = 0;
            decibel = 0;
            doorOpen = 0;
            motion = 0;
            vibration = 0;
            illumination = 0;
        }
        #endregion

        #region ToJson method
        /// <summary>
        /// Extension method to convert the object to JSON format.
        /// It uses all properties with [DataMember] above it.
        /// </summary>
        /// <returns>The JSON message</returns>
        public string ToJson()
        {
            DataContractJsonSerializer ser = new DataContractJsonSerializer(typeof(SensorMessage));
            MemoryStream ms = new MemoryStream();
            ser.WriteObject(ms, this);
            string json = Encoding.UTF8.GetString(ms.ToArray(), 0, (int)ms.Length);

            return json;
        }
        #endregion
    }
}


5b) ToJson
The ToJson method in this class automatically returns a JSON message with all class properties that have [DataMember] above them. Also make sure to add [DataContract] above the class. It will generate the following message that can be read by the IoT Hub:
{"altitude":2.94,"decibel":7.43,"doorOpen":0,"humidity":93.97,"illumination":85.00, "measurementTime":"\/Date(1468490582115)\/","motion":0,"pressure":98.46, "sensorName":"Joost","temperature":89.49,"vibration":1}

Tip: try to keep the message as small as possible by using short column names or codes instead of large text values.

6) Main code
Locate the MainPage.xaml(.cs) file in the Solution Explorer and open the code behind it (the C# code). We have added some usings and methods which you need to add to your file. After that you can customize the code to your sensors/situation. In the following posts we will show you the sensor specific code for each sensor we will use in our IoT project.
MainPage.xaml.cs

//C# Code
using System;
using System.Threading;                 // Added for Thread.Sleep in the test methods
using Microsoft.Azure.Devices.Client;   // Added
using Windows.Devices.Gpio;             // Added
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

namespace Sensory
{
    /// <summary>
    /// Main page
    /// </summary>
    public sealed partial class MainPage : Page
    {
        #region Constants for config
        // GPIO PIN CONFIG
        const int PIN_DOORSENSOR = 4;

        // Azure IoT Hub connection string
        // Will be explained in next blogpost
        const string azureConnectionString = "HostName=xxxxxx.azure-devices.net;DeviceId=XXXXXXXX;SharedAccessKey=y1ANJwjTueyzCEBEAliy7MkOQHW5dOWiu6w79HcfvVc=";
        #endregion

        #region MainPage
        /// <summary>
        /// Method that executes on startup
        /// </summary>
        public MainPage()
        {
            this.InitializeComponent();

            // Initialize sensors
            InititializeDoorSensor();

            // Add timer event to collect and send
            // data each x seconds or minutes
            DispatcherTimer sensorTimer;
            sensorTimer = new DispatcherTimer();
            sensorTimer.Interval = TimeSpan.FromSeconds(60);
            sensorTimer.Tick += GetSensorData_Tick;
            sensorTimer.Start();
        }
        #endregion

        #region Collect sensor data
        // Method that is executed every x seconds
        // and collects the data from the sensors.
        // For this example it generates random data
        private void GetSensorData_Tick(object sender, object e)
        {
            // Create sensorMessage object and fill all
            // measurements (with random data for now)
            SensorMessage sensorMessage = new SensorMessage("mySensor");
            sensorMessage.Illumination = randomDecimal();
            sensorMessage.Temperature = randomDecimal();
            sensorMessage.Humidity = randomDecimal();
            sensorMessage.Pressure = randomDecimal();
            sensorMessage.Altitude = randomDecimal();
            sensorMessage.Decibel = randomDecimal();
            sensorMessage.DoorOpen = randomInt(0.5);
            sensorMessage.Motion = randomInt(0.3);
            sensorMessage.Vibration = randomInt(0.1);

            // Send the data to the Azure IoT Hub
            SendMessage(sensorMessage.ToJson());
        }
        #endregion

        #region Initialize sensor
        // GPIO objects
        private GpioPin doorSensorPin;
        // Setup doorsensor
        private void InititializeDoorSensor()
        {
            // add sensor code
        }
        #endregion

        #region Send data to Azure IoT Hub
        // Object for sending data to azure
        private DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(azureConnectionString);
        /// <summary>
        /// Sends a JSON message to the Azure IoT Hub
        /// </summary>
        /// <param name="message">The JSON message</param>
        public async void SendMessage(string message)
        {
            // Send message to an IoT Hub using IoT Hub SDK 
            try
            {
                var content = new Microsoft.Azure.Devices.Client.Message(System.Text.Encoding.UTF8.GetBytes(message));
                await deviceClient.SendEventAsync(content);

#if DEBUG   // DEBUG INFO
                System.Diagnostics.Debug.WriteLine("Message Sent: {0}", message, null);
#endif
            }
            catch (Exception e)
            {
#if DEBUG   // DEBUG INFO
                System.Diagnostics.Debug.WriteLine("Exception when sending message:" + e.Message);
#endif
            }
        }
        #endregion

        #region Random Methods for test
        private int randomInt(double chance)
        {
            // Make sure each call returns a random number
            Thread.Sleep(1);

            Random test = new Random(DateTime.Now.Millisecond);

            if (test.NextDouble() >= chance)
            {
                return 1;
            }
            else
            {
                return 0;
            }
        }

        private decimal randomDecimal()
        {
            // Make sure each call returns a random number
            Thread.Sleep(1);

            Random test = new Random(DateTime.Now.Millisecond);
            return Math.Round(Convert.ToDecimal(test.NextDouble() * 100), 2);
        }
        #endregion
    }
}

6b) Random
The two random methods are temporary methods that return random data to test the application without the sensor specific code. This allows us to continue in parallel with the Azure IoT Hub, Stream Analytics and PowerBI.

7) Active Solution Platform
For building this project we need to switch to ARM (Acorn RISC Machine) instead of x86/x64. This is the processor architecture used on machines like the Raspberry Pi.
Switching Solution Platform to ARM

8) Deploying to Raspberry Pi
When your project is ready you can deploy it to your Raspberry Pi for debugging and testing. First you have to get the IP address of the Raspberry Pi device. The Windows 10 IoT Core Dashboard application is a good way to find your device(s). You will get the best result using a wired connection; when using WiFi the list below will sometimes stay empty.
Windows 10 IoT Core Dashboard

You can also use Visual Studio to find the device. Go to the properties of your project, then go to the Debug page and click on the Find button. On this page you can select the name or IP address of your device for deploying your code.
Find your IoT device

Now you can use the green start button to deploy and debug your code on the remote device, but the first time you need to select 'Remote Machine'. The first deployment could take a couple of minutes!
Start debugging

Now watch the output window in Visual Studio to see if your program is working. It's good to add a lot of Debug.WriteLine rows in the beginning to see whether everything works.
Debug output

Conclusion
A lot of work to start with, but once the basics work you can start customizing the code for your sensors. You will need some patience because finding your Raspberry Pi sometimes needs a couple of attempts (wired is more often successful) and the first deployment takes a couple of minutes. Don't worry if it takes 3 minutes.
In the next few blogposts we will show the specific sensor code and show you how to create the Azure IoT Hub connection string.