Considering Software as an IoT Device

An Azure IoT Hub can accept just about any type of data from a device.

There is support for:

  • Sending device-to-cloud messages
  • Invoking direct methods on a device
  • Uploading files from a device
  • Managing device identities
  • Scheduling jobs on single or multiple devices

IoT Hub provides a set of built-in endpoints. Custom endpoints can also be created.

IoT Hub currently supports the following Azure services as additional endpoints:

  • Azure Storage containers
  • Event Hubs
  • Service Bus Queues
  • Service Bus Topics

Architecture

If we look through the documentation on the Azure Architecture Center, we can see a list of Architectural Styles.

If we were to design an IoT solution, we would want to follow best practices. We can do this by adopting the Azure architectural style of event-driven architecture, which is central to IoT solutions.

Merging event-driven architecture with microservices lets us separate out the IoT business services.
These services include:

  • Provisioning
  • Management
  • Software Updating
  • Security
  • Logging and Notifications
  • Analytics

Creating our services

To create these services, we start by selecting our Compute Options.

App Services

The use of Azure Functions is becoming commonplace. They are an excellent replacement for API apps, and they can be published to Azure API Management.

We are able to create a Serverless API, or use Durable Functions that allow us to create workflows and maintain state in a serverless environment.

Logic Apps provide us with the capability of building automated scalable workflows.

Data Store

Having a single data store is usually not the best approach. Instead, it’s often better to store different types of data in different data stores, each focused towards a specific workload or usage pattern. These stores include Key/value stores, Document databases, Graph databases, Column-family databases, Data Analytics, Search Engine databases, Time Series databases, Object storage, and Shared files.

The same may hold true for other architectural styles. In our event-driven architecture, it is ideal to store all data related to IoT devices in the IoT Hub. This data includes results from all events within the Logic Apps, Function Apps, and Durable Functions.


Which brings us back to our topic… Considering Software as an IoT Device

Since Azure IoT Hub supports the TransportType.Http1 transport, we can use the Microsoft.Azure.Devices.Client library to send event data to our IoT Hub from any type of software. We also have the capability of receiving configuration data from the IoT Hub.

The following is the source code for our SendEvent Function App.

SendEvent Function App

#region Information

//  
//  MIT License
//  
//  Copyright (c) 2018  Howard Edidin
//  
//  Permission is hereby granted, free of charge, to any person obtaining a copy
//  of this software and associated documentation files (the "Software"), to deal
//  in the Software without restriction, including without limitation the rights
//  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
//  copies of the Software, and to permit persons to whom the Software is
//  furnished to do so, subject to the following conditions:
//  
//  The above copyright notice and this permission notice shall be included in all
//  copies or substantial portions of the Software.
//  
//  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
//  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
//  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
//  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
//  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
//  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
//  SOFTWARE.

#endregion

#region

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data.Services.Client;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;
using TransportType = Microsoft.Azure.Devices.Client.TransportType;

#endregion

namespace IoTHubClient
{
    public static class SendEvent
    {
        private static readonly string IotHubUri = ConfigurationManager.AppSettings["hubEndpoint"];

        [FunctionName("SendEventToHub")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = "device/{id}/{key:guid}")]
            HttpRequestMessage req, string id, Guid key, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");


            // Get request body
            dynamic data = await req.Content.ReadAsAsync<object>();

            var deviceId = id;
            var deviceKey = key.ToString();

            if (string.IsNullOrEmpty(deviceKey) || string.IsNullOrEmpty(deviceId))
                return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a deviceid and deviceKey in the Url");

            var telemetry = new Dictionary<Guid, object>();


            foreach (var item in data.telemetryData)
            {
                var telemetryData = new TelemetryData
                {
                    MetricId = item.metricId,
                    MetricValue = item.metricValue,
                    MericDateTime = item.metricDateTime,
                    MetricValueType = item.metricValueType
                };

                telemetry.Add(Guid.NewGuid(), telemetryData);
            }


            var deviceData = new DeviceData
            {
                DeviceId = deviceId,
                DeviceName = data.deviceName,
                DeviceVersion = data.deviceVersion,
                DeviceOperation = data.deviceOperation,
                DeviceType = data.deviceType,
                DeviceStatus = data.deviceStatus,
                DeviceLocation = data.deviceLocation,
                AzureRegion = data.azureRegion,
                SubscriptionId = data.subscriptionId,
                ResourceGroup = data.resourceGroup,
                EffectiveDateTime = new DateTimeOffset(DateTime.Now),
                TelemetryData = telemetry
            };


            var json = JsonConvert.SerializeObject(deviceData);

            // Use UTF-8 so non-ASCII characters in the JSON payload survive serialization
            var message = new Message(Encoding.UTF8.GetBytes(json));


            try
            {
                var client = DeviceClient.Create(IotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey(deviceId, deviceKey),
                    TransportType.Http1);

                await client.SendEventAsync(message);

                return req.CreateResponse(HttpStatusCode.OK);
            }
            catch (DataServiceClientException e)
            {
                var resp = new HttpResponseMessage
                {
                    StatusCode = (HttpStatusCode) e.StatusCode,
                    Content = new StringContent(e.Message)
                };
                return resp;
            }
        }
    }


    public class DeviceData
    {
        public string DeviceId { get; set; }

        public string DeviceName { get; set; }

        public string DeviceVersion { get; set; }

        public string DeviceType { get; set; }

        public string DeviceOperation { get; set; }

        public string DeviceStatus { get; set; }

        public DeviceLocation DeviceLocation { get; set; }

        public string AzureRegion { get; set; }

        public string ResourceGroup { get; set; }

        public string SubscriptionId { get; set; }

        public DateTimeOffset EffectiveDateTime { get; set; }

        public Dictionary<Guid, object> TelemetryData { get; set; }
    }

    public class TelemetryData
    {
        public string MetricId { get; set; }

        public string MetricValueType { get; set; }

        public string MetricValue { get; set; }

        public DateTime MericDateTime { get; set; }
    }

    public enum DeviceLocation
    {
        Cloud,
        Container,
        OnPremise
    }
}

Software Device Properties

The following values are required in the URL path:

Route = "device/{id}/{key:guid}"

Name  Description
id    Device Id (string)
key   Device Key (GUID)

The following are the properties to be sent in the Post Body
Name                          Description
deviceName                    Device name
deviceVersion                 Device version number
deviceType                    Type of device
deviceOperation               Operation name or type
deviceStatus                  Default: Active
deviceLocation                Cloud, Container, or OnPremise
subscriptionId                Azure Subscription Id
resourceGroup                 Azure Resource group
azureRegion                   Azure Region
telemetryData                 Array
telemetryData.metricId        Array item id
telemetryData.metricValueType Array item value type
telemetryData.metricValue     Array item value
telemetryData.metricDateTime  Array item date/time
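Putting the route and body together: a call might POST to device/mydevice01/3f2504e0-4f89-11d3-9a0c-0305e82c3301 with a body like the following. All values here are illustrative, not taken from a real deployment:

```json
{
  "deviceName": "InvoiceService",
  "deviceVersion": "1.0.3",
  "deviceType": "FunctionApp",
  "deviceOperation": "SendEvent",
  "deviceStatus": "Active",
  "deviceLocation": "Cloud",
  "subscriptionId": "00000000-0000-0000-0000-000000000000",
  "resourceGroup": "iot-demo-rg",
  "azureRegion": "North Central US",
  "telemetryData": [
    {
      "metricId": "requestLatencyMs",
      "metricValueType": "Integer",
      "metricValue": "125",
      "metricDateTime": "2018-01-15T20:13:35Z"
    }
  ]
}
```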

Summary

  • We can easily add the capability of sending messages and events to our Function and Logic Apps.
  • Optionally, we can send the data to an Event Grid.
  • We have a single data store for all our IoT events.
  • We can identify performance issues within our services.
  • Having a single data store makes it easier to perform Analytics.
  • We can use an Azure Function App to send device-to-cloud messages. In this case, our Function App also takes on the role of a device.

Azure Cosmos DB Performance – Throughput


Recently, I was designing a Cosmos DB SQL API solution for a client. During testing, we were constantly getting HTTP status code 429 (Too Many Requests) errors. We had to manually increase the throughput for the collection, which got to be tedious.

In production, this was unacceptable.

We needed to automate the setting of the Cosmos DB Collection Throughput.

Architecture

The solution architecture I built looks like the diagram below and utilizes the Throttling Cloud Design Pattern.

Architecture

Steps

  1. Create an Azure Cosmos DB alert for HTTP status code 429 (Too Many Requests), which indicates the collection has exceeded its provisioned throughput limit.
  2. When the condition is met, an alert is sent to a Function App.
  3. The Function App increases the Request Units for the collection by 100.
  4. The Cosmos DB collection Offer (Request Units) is updated.

Throughput SLA

“Throughput Failed Requests” are requests which are throttled by the Azure Cosmos DB collection, resulting in an error code, before Consumed RUs have exceeded the Provisioned RUs for a partition in the collection for a given second.

NOTE: The metric we want to check for is Status Code 429 (Too Many Requests). The collection has exceeded the provisioned throughput limit.

Create a new Azure Cosmos DB Alert

We need to create an Alert for our Azure Cosmos DB Database as shown in the following figure.

New Alert

The following are the configuration settings.

  1. Select our Resource.
  2. Name for our Alert Rule.
  3. Description of our Alert Rule.
  4. Select Throttled Requests for the Metric.
  5. The condition, threshold, and period that determine when the alert activates.
  6. Condition set to greater than.
  7. Threshold Count set to 1.
  8. For Period we will start with Over the last hour.
  9. Check whether the service administrator and co-administrators are emailed when the alert fires.

NOTE: We will configure our WebHook setting later.

Create our Function App

We will create our Function App using the Azure Portal. Optionally, we could do this in Visual Studio 2017. I chose the Portal because I planned to add additional functionality.

INFO: We could download the Function App code with a Visual Studio project file. This allows us to modify the source code, use Visual Studio Team Services (VSTS) for source control, and incorporate a CI/CD pipeline.

The Azure Function App template I used is a Generic JSON Webhook. Application Insights was added.

The following is the code from the ProcessEvent Function App.

#r "Microsoft.Azure.Documents.Client"

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using Newtonsoft.Json;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info($"Webhook was triggered!");

    var endPoint = ConfigurationManager.AppSettings["Endpoint"];
    var authKey = ConfigurationManager.AppSettings["AuthKey"];
    var databaseId = ConfigurationManager.AppSettings["databaseName"];
   
    // Track each collection's previous throughput for the response message
    var list = new Dictionary<string, int>();
    var newThroughput = 0;

    // Read the alert payload. It is not used below; the request simply triggers the function.
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);

    DocumentClient client = new DocumentClient(
        new Uri(endPoint),
                authKey,
                new ConnectionPolicy
                {
                    ConnectionMode = ConnectionMode.Direct,
                    ConnectionProtocol = Protocol.Tcp
                });

    var colls = await client.ReadDocumentCollectionFeedAsync(UriFactory.CreateDatabaseUri(databaseId));
    
    foreach (var coll in colls)
    {
        Microsoft.Azure.Documents.Offer offer = client.CreateOfferQuery().Where(r => r.ResourceLink == coll.SelfLink)
                    .AsEnumerable().SingleOrDefault();

        //Get the current Throughput for the collection
        var result = ((OfferV2) offer).Content.OfferThroughput;

        // Increase the Throughput by 100
        newThroughput = result + 100;

        // Modify the offer
        offer = new OfferV2(offer, newThroughput);
        
        
        list.Add(coll.Id, result);

        await client.ReplaceOfferAsync(offer);
    }
   
    // format the response content
    var results = string.Join("; ", list.Select(x => x.Key + "=" + x.Value));
   
    var res = new HttpResponseMessage(HttpStatusCode.OK)
    {
        
        Content = new StringContent($"Collection(s): {results} Request Units increased to {newThroughput}")
    };

    return res;   
   
}

NOTE: We are not using the HttpRequestMessage data in our Function App; it only serves to trigger the function when an event is received.

The following is an example of an Offer, as displayed in the debugger:

{
  "id": "gCK7",
  "_rid": "gCK7",
  "_self": "offers/gCK7/",
  "_etag": "\"00002b01-0000-0000-0000-5a5d0b6f0000\"",
  "offerVersion": "V2",
  "resource": "dbs/jdUKAA==/colls/jdUKAIUe1QA=/",
  "offerType": "Invalid",
  "offerResourceId": "jdUKAIUe1QA=",
  "content": {
    "offerThroughput": 10600,
    "offerIsRUPerMinuteThroughputEnabled": false
  },
  "_ts": 1516047215
}

The runtime type is Microsoft.Azure.Documents.OfferV2.

Next we need to set our Azure Cosmos DB connection settings

The Cosmos DB connection settings need to be added to the Function App's Application Settings:
* Endpoint
* AuthKey
* databaseName

The following figure shows these settings.

Application Settings
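If we were instead running the function locally, these same settings would typically live in local.settings.json. The following is a sketch; the values are placeholders, not real credentials:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-connection-string>",
    "Endpoint": "https://<account-name>.documents.azure.com:443/",
    "AuthKey": "<cosmos-db-auth-key>",
    "databaseName": "<database-name>"
  }
}
```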


Next we will test our Function App as shown in the following figure.

Our response is: Collection(s): resources=1000; leases=1000 Request Units increased to 1100

We can see that the request units for the resources and leases collections have been increased to 1100.

When we run it again, the request units increase by 100.

Modify Event Rule

Finally, we need to modify our event rule.

  1. We first need to copy our Function App URL, as shown in the following figure.

Function Url

2. We then edit our Event Rule by pasting the Function App URL into the Webhook text box, as shown below.

Edit Rule

3.  Save the Event Rule.

Summary

  • Azure Event Rules can be used to send events to a Function App.
  • The built-in Microsoft.Azure.Documents.Offer functionality in the Azure Cosmos DB SQL API makes it easy to get and set the OfferThroughput.
  • We increased the Request Units for all collections in a database.
  • We were able to provide the client with an automated process.

Next Steps

  • Extend functionality by using Logic Apps
  • Include Unlimited Collections
  • Add ability to lower Throughput
  • Use the Event data to get the collection Id.

Installing both BizTalk 2016 and Logic Apps mapper/schema editor SDK on the same machine


I installed the Logic Apps mapper/schema editor SDK on a BizTalk 2016 VM.

After I loaded a BizTalk project to make some changes, I discovered that the BizTalk Project Template was overwritten.

I thought this was a bug and reported it.

I found out that having both on the same machine is currently unsupported. 😮

To fix the issue, I uninstalled BizTalk on the machine I was using for Logic App development.

On the BizTalk machine, I uninstalled both the Logic App designer and the Design Tools.

The XML Validator failed to validate


I was creating a demo using the Logic App Adapter for BizTalk 2016.

I was sending a message from the Logic App to BizTalk.

I used the JSON Schema wizard with the default setting, Root, for the root element.

When testing, I received the following error:

There was a failure executing the receive pipeline: "CRMDemo.ReceivePipeline1, CRMDemo, Version=1.0.0.0, Culture=neutral, PublicKeyToken=17c282d81ae800e3" Source: "XML validator" Receive Port: "CRMReceivePort" URI: "/GetCRMAccount/Service1.svc" Reason: The XML Validator failed to validate. Details: The element 'Root' in namespace 'http://CRMDemo.JSONSchema1' has invalid child element 'id'. List of possible elements expected: 'body'..

I looked at the message body from the suspended message:

{"id":"94bf6bf8-4a96-e611-80ea-c4346bdc7221","name":"Silly Goose two","phone":"748-852-1256"}

There wasn’t a body node. 💡

I then checked the source from the Logic App:

{
    "Send_message": {
            "inputs": {
                "body": {
                    "id": "@{triggerBody()?['accountid']}",
                    "name": "@{triggerBody()?['name']}",
                    "phone": "@{triggerBody()?['telephone1']}"
                },
                "host": {
                    "api": {
                        "runtimeUrl": "https://logic-apis-northcentralus.azure-apim.net/apim/biztalk"
                    },
                    "connection": {
                        "name": "@parameters('$connections')['biztalk_1']['connectionId']"
                    }
                },
                "method": "post",
                "path": "/Send",
                "queries": {
                    "receiveLocationAddress": "http://40.86.103.129/GetCRMAccount/Service1.svc"
                }
            },
            "runAfter": {},
            "type": "ApiConnection"
    }

}

My BizTalk Schema is shown below.

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="http://CRMDemo.JSONSchema1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Root">
    <xs:complexType>
      <xs:sequence>
        <xs:element minOccurs="0" name="id" type="xs:string" />
        <xs:element minOccurs="0" name="name" type="xs:string" />
        <xs:element minOccurs="0" name="phone" type="xs:string" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

Was it a bug, or did I do something wrong?

Finally, it came to me. 😳

I needed to use body as the name of the root element in my schema. That solved the problem.
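Applying that fix to the schema shown earlier, the root element is simply renamed from Root to body; everything else stays the same:

```xml
<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="http://CRMDemo.JSONSchema1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="body">
    <xs:complexType>
      <xs:sequence>
        <xs:element minOccurs="0" name="id" type="xs:string" />
        <xs:element minOccurs="0" name="name" type="xs:string" />
        <xs:element minOccurs="0" name="phone" type="xs:string" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```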

Manual Trigger fields not showing up in the designer


Have you ever run into an issue where the trigger fields are not available in the Logic App Designer view?

Check to see if your Request Body JSON Schema has required fields.

Only required fields will be available in the designer. This is by design.

TIP: Use jsonschema.net to generate your schema.

JsonSchema.net is a tool that automatically generates a JSON schema from JSON, according to the IETF JSON Schema Internet Draft, Version 4. The schema is generated in three formats: editable, code view, and string. To get started, go to the homepage and start writing some JSON in the textarea on the left-hand side.
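For example, with a hypothetical request schema like the one below, only id and name would appear in the designer, because phone is not listed under required:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "id": { "type": "string" },
    "name": { "type": "string" },
    "phone": { "type": "string" }
  },
  "required": [ "id", "name" ]
}
```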

Using Azure Functions with Logic Apps – Part 1


There are times where you need to do data type conversions in a Logic App.

I recently ran into an issue where I was syncing records between CRM Online (CRMOL) and Salesforce. The records coming from CRMOL had NULL values. When converted to JSON, each NULL value becomes the string "NULL".

I could use the Logic App Replace function, but when you have to evaluate 30 to 50 fields, it becomes a tedious chore.

I decided to create a Function App. I used a C# webhook so I could pass in the response coming out of CRMOL and return the parsed record to be mapped to Salesforce.

The code is very simple. I loop through each field and replace the “NULL” values with an empty string.
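Until then, here is a minimal sketch of the idea using Newtonsoft.Json. The class and method names are illustrative, not the actual Part 2 code:

```csharp
using Newtonsoft.Json.Linq;

public static class RecordCleaner
{
    // Replace any "NULL" string values in a flat JSON record
    // with an empty string, so they map cleanly to Salesforce.
    public static JObject CleanNulls(JObject record)
    {
        foreach (var prop in record.Properties())
        {
            if (prop.Value.Type == JTokenType.String &&
                (string) prop.Value == "NULL")
            {
                prop.Value = string.Empty;
            }
        }

        return record;
    }
}
```

A webhook function would parse the incoming body with JObject.Parse, call CleanNulls, and return the cleaned record in the response.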

In Part 2, I will show you the code and other utility functions that can be used in Logic Apps.