Gain better IoT insights with Dynamics 365 integration

This is a new post in the IoT Hub series. Previously, we saw how to administer our devices and how to send messages from the device and from the cloud. Now that we have all this data flowing through our systems, it is time to help our users actually work with it.
Going back to the scenario we set up in the first post of the series: we are receiving telemetry readings from our ships and getting alerts in case of high temperature. In the samples we have been using console apps for the communication between the systems, but in a real-life scenario you will probably want a better and easier interface. In the shipping business, Dynamics CRM is already widely used, so it would benefit the business to use this product for their IoT solutions as well. Luckily they can, by using Microsoft Dynamics 365 for Field Service in combination with the Connected Field Service solution.

Use Connected Field Service for your end-to-end IoT solution

Setting Up Connected Field Service

To start working with the Connected Field Service solution, we will first set up a 30-day trial of Dynamics 365 for Field Service. Remember that you will need an organizational Microsoft account to sign up for Dynamics 365. If you do not have one, you can create a <your-tenant>.onmicrosoft.com account in your Azure Active Directory for this purpose. If you already have your own Dynamics 365 environment, you can skip to installing the Connected Field Service solution.

Create Dynamics 365 Environment

We will start by setting up Dynamics 365. In this post, we will be using a trial account, but if you already have an account you can of course use that one instead. Go to the Dynamics 365 trial registration site, and click Sign in to log in with an organizational account.

Sign in to Dynamics 365
Use an organizational account to sign in

Once signed in, confirm that you want to sign up for the free trial.

Confirm the free trial
Your trial has been accepted

Now that we have created our free trial, we will have to assign licenses to our users. Open the Subscriptions blade under Billing and choose to assign licenses to your users.

Assign licenses to users

You will get an overview of all your users. Select the users for which you want to assign the licenses, and click Edit product licenses.

Choose users to assign licenses

Add the licenses to the users we just selected.

Add licenses to users

Choose the trial license we just created. This will also add the connected Office 365 licenses.

Assign Dynamics 365 trial licenses

Now that we have assigned the Dynamics 365 licenses, we can finish our setup. Go to Admin Centers in the menu, and select Dynamics 365.

Go to Dynamics 365 admin center

As we are interested in field service, select this scenario and complete the setup. The Field Service scenario customizes our Dynamics 365 instance to include components like technician scheduling, inventory management, work orders and more, which a shipping company would use to keep track of repairs, maintenance, etc.

Select Field service

Once we have completed our Dynamics 365 setup, it will be shown in the browser. The address of the page will be in the format <yourtenant>.crm4.dynamics.com. You can also change this endpoint in your Dynamics 365 admin center.

Your Dynamics 365 environment

Security

To allow us to install the Connected Field Service solution, we will need to add ourselves to the CRM admins. To do this, within your Dynamics 365 portal (in my case https://eldertiotcrmdemoeldert.crm4.dynamics.com/) go to the Settings Tab and open security.

Open Dynamics 365 Security

Now open the users, select your user account, and click on Promote To Admin.

Promote your user to local admin

Install Connected Field Service Solution

Now that we have Dynamics 365 set up, it’s time to add the Connected Field Service solution, which we will use to manage and interact with our devices from Dynamics 365. Start by going to Dynamics 365 in the menu bar.

Open Dynamics 365

This will lead us to our Dynamics home, where we can install new apps. Click on Find more apps to open the app store.

Open the app store

Search for Connected Field Service, and click on Get it now to add it to our environment.

Add Connected Field Service solution

Agree with the permissions, and make sure you are signed in with the correct user. The user must have a license, and permissions to install this solution.

Accept permissions

Follow the wizard for the solution and make sure you install it into the correct environment.

Select your Dynamics 365 environment
Accept deployment

On the next pages accept the service agreement and the privacy statement. Make sure you deploy to the correct Dynamics 365 Organization.

Select correct organization

Now we will have to specify the Azure resources where we want to deploy our artefacts like IoT Hub, Stream Analytics etc. If you do not see a subscription, make sure your user has the correct permissions in your Azure environment to create and retrieve artefacts and subscriptions.

Select Azure subscription and resources

The wizard will now start deploying all the Azure artefacts, and will update CRM with new screens and components. You can follow this by refreshing the screen, or coming back to the website. Once this is finished, you will need to click the Authorize button, which will set up the connection between your Azure and Dynamics 365.

After deployment click on Authorize

This will open the Azure portal on the API connection. Click the message This connection is not authenticated to authorize the connection.

Click to authenticate
Authorize the connection

The Azure Solution

Now let’s go to the Azure portal, and see what has been installed. Open the resource group which we created in the wizard.

Resource group for our connected field service solution

As you can see, we have a lot of new resources. I will explain the most important ones here, and their purpose in the Connected Field Service solution. After deployment, all resources are set up for the data from the sample application which was deployed with the solution; if you want to use your own devices, you will need to update these. This is also the place to start building your own solution, as your requirements might differ from what you get out of the box. As all these resources can be modified from the portal (except for the API Apps), customizing this solution to your own needs is very easy.

IoT Hub

The IoT Hub which has been created is used for the device management and communication. It uses device to cloud messaging to receive telemetry from our devices, and cloud to device messaging to send commands to our devices. When working with your own devices, you should update them to connect with this IoT Hub.
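
If you are building your own device application, connecting to this IoT Hub works just like in the earlier posts of this series. Below is a minimal sketch using the Azure IoT device SDK; the connection string, device ID and payload shape are assumptions, so align the payload with what the deployed Stream Analytics queries expect.

using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Newtonsoft.Json;

public class SimulatedThermostat
{
        // Device connection string for the IoT Hub created by the wizard (placeholder values)
        private const string ConnectionString = "HostName=<your-hub>.azure-devices.net;DeviceId=Thermostat01;SharedAccessKey=<key>";

        /// <summary>
        /// Send a single temperature reading as a device-to-cloud message.
        /// </summary>
        public static async Task SendTemperatureAsync(double temperature)
        {
                var deviceClient = DeviceClient.CreateFromConnectionString(ConnectionString);

                // Serialize a simple telemetry reading
                var json = JsonConvert.SerializeObject(new { DeviceId = "Thermostat01", Temperature = temperature });

                // Send the reading to IoT Hub
                await deviceClient.SendEventAsync(new Message(Encoding.UTF8.GetBytes(json)));
        }
}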

Service Bus

Four Service Bus queues have been created, which are used for holding messages between systems.

Stream Analytics

There are several Stream Analytics jobs, which are used to process the data coming in from IoT Hub. When working with your own devices, you should update these jobs to process your own data.

  • Alerts; This job reads data from IoT Hub, and references it against device rules in a blob. If the job detects it needs to send an alert to Dynamics 365, in this case a high temperature, it will write this into a Service Bus queue.
  • PowerBI; This job reads all incoming telemetry data, and sends the maximum temperature per minute to PowerBI.

API Apps

Custom API Apps have been created, which will be used to translate between messages from IoT Hub and Dynamics 365.

Logic Apps

There are two Logic Apps, which serve as a communications channel between Dynamics 365 and IoT Hub. The Logic Apps use queues, API Apps and the Dynamics 365 connector to send and receive messages between these systems.

Setting Up PowerBI

PowerBI will be used to generate charts from the telemetry readings. To use this, we first need to import the reports. Start by downloading the reports, and make sure you have a PowerBI account; it is recommended to use the same user as for Dynamics 365. Open the downloaded reports file using PowerBI Desktop. The report will open with errors, because it was created with a sample SQL database and user. Update the query with your SQL database and user, and then publish the report to PowerBI.

Open the downloaded report

Once opened, click on Edit Queries to change the connection to your database.

Select Edit Queries
Open Advanced Editor

Replace the source SQL server and database with the resources provisioned in your Azure resource group. The database server and database name can be found through the Azure portal.

Update Azure SQL Server and database names

Enter the credentials of your database user when requested.

Enter login credentials

If you get an error saying your client IP is not allowed to connect, use the Azure portal to add your client IP to the firewall on your Azure SQL Server.

Not allowed to connect
Add client IP to firewall settings

Once done, click on Close & Apply to update the report file.

Close and apply changes

Now we will publish the report to PowerBI, so we can use it from Dynamics 365. Click on the Publish button to start, and make sure to save your changes.

Publish report to PowerBI

Sign in to your PowerBI account and wait for your report to be published.

Sign in to PowerBI

Once published, open the link to provide your credentials.

Publishing succeeded

Follow the link to edit your credentials, and update the credentials with your database user login.

Sign in with database user

Now pin the tiles to a dashboard, creating one if it does not yet exist.

Pin tiles to dashboard

Managing Devices

In this post we will be using the simulator which has been deployed along with the solution. If you want to use your own (simulated) devices, be sure to update the connections and data for the deployed services. Go to your Dynamics 365 environment, open the Field Service menu, and select Customer Assets.

Open Customer Assets

To add a new device, we will create a new asset. This asset will then be linked to a device in IoT Hub.

Create new asset

Fill in the details of the asset. It is important to note that we need to set a Device ID here; this will be the ID under which the device is registered in IoT Hub. When done, click on Save.

Set asset details

Once the asset has been saved, you will note a new command in the command bar called Register Devices. This will register the new device in IoT Hub, and link it with our asset in Dynamics 365. Click this now.

Register the device in IoT Hub

The device will now be registered in IoT Hub. Once this is done, the registration status will be updated to Registered. We can now start interacting with our device.

Device has been registered in IoT Hub

Receive Telemetry

Open the thermostat simulator, which was part of the deployment of the Connected Field Service solution. You can do this by going back to the deployment website and clicking Open Simulator.

Open the simulator

This will open a new website where we can simulate a thermostat. Start by selecting the device we just created from Dynamics 365.

Select device

Once the device has been selected, we will start seeing messages being sent. These are sent to IoT Hub, flow into PowerBI, and generate alerts if the temperature gets too high. Increase the temperature to trigger some alerts.

Generate high temperature

Now go back to Dynamics 365, and open the Field Service dashboard.

Open Field Service dashboard

On the dashboard we will now see a new IoT Alert. You can open this alert to see its details, and for example create a work order for it. In our scenario with the shipping company, this would allow us to recognize anomalies on the ships' engines in near real time, and immediately take action, such as arranging repairs.

Alerts are shown in Dynamics 365

Connect PowerBI

Now let’s set up Dynamics 365 to include the PowerBI graph in our assets, so we have an overview of our telemetry at all times as well. Go back to the asset we created earlier, and click the PowerBI button in the Connected Device Readings area.

Add PowerBI tile to asset

Choose one of the tiles we previously added to the PowerBI dashboard and click save.

Add PowerBI tile

We will now see the recent device readings in our Dynamics 365 asset. Every asset will show the readings for its registered device, allowing us to keep track of all our devices' readings.

Device readings are now integrated in Dynamics 365

Send Commands

For the final part, we will have a look at how we can send messages from Dynamics 365 to our device. Go back to the asset we created, and click on Create Command.

Click Create Command

Give the command a name, and provide the command. This should be in JSON format, so it can be parsed by the device. As we will be using the simulator, we will just send a demo command, but for your own device this should be a command your device can understand. You can send this command to a particular device or to all your devices. Once you have filled in the fields, click on Send & Close to send the command. The command which we will be sending is as follows.

{"CommandName":"Notification","Parameters":{"Message":"Technician has been dispatched"}}

Create and send your command

Now when we switch over to our simulator, we will see the command coming in.

Commands are coming in

Conclusion

By using Dynamics 365 in combination with the Connected Field Service solution, we give our users an environment with which they are already familiar to administer and communicate with their IoT devices. It allows them to handle alerts, dispatching technicians as soon as needed. By integrating the readings, they are always informed about the status of the devices, and by sending commands back they can work with the devices remotely.

IoT Hub Blog Series

In case you missed the other articles from this IoT Hub series, take a look here.

Blog 1: Device Administration Using Azure IoT Hub
Blog 2: Implementing Device To Cloud Messaging Using IoT Hub
Blog 3: Using IoT Hub for Cloud to Device Messaging

Author: Eldert Grootenboer

Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT, BizTalk Server and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010 and has since expanded into Azure and surrounding technologies as well. Eldert loves working on integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/.

Integrate 2017 USA Day 2 Recap

And so the second day of Integrate 2017 USA is in the books, another day of great sessions and action-packed demos. We started the day with Mayank Sharma and Divya Swarnkar from Microsoft CSE, formerly Microsoft IT, taking us through their journey to the cloud. Microsoft has an astounding number of integrations running for all their internal processes, communicating with 1000+ partners. With 175+ BizTalk servers running on Azure IaaS, processing 170M+ messages per month, they really need an integration platform they can rely on.


Like most companies, Microsoft is also looking into ways to modernize their application landscape, as well as to reduce costs. To accomplish this, they are now using Logic Apps at the heart of all their integrations, with BizTalk as the bridge to their LOB systems. By leveraging API Management they can test their systems in production as well as in their UAT environments, ensuring that all systems work as expected. And by using the geo-replication options the Azure platform provides, they ensure that even in case of a disaster their business will stay up and running.

Adopting a microservices strategy, each Logic App is set up to perform a specific task, and metadata is used to execute or skip specific parts. To me this seems like a great setup, and definitely something to look into when setting up your own integrations.


Manage API lifecycle sunrise to sunset with Azure API Management

In the second session of the day, Matthew Farmer and Anton Babadjanov showed us how we can use API Management to set up an API using a design-first approach. Continuing with the Contoso Fitness scenario, they set up the situation where you need to onboard a partner to an API which has not been built yet. By using API Management we can set up a façade for the API, adding its methods and mock responses, allowing consumers to start working with the API quickly.


Another important subject is how you handle new versions of your API. Thanks to API Management you can now have versions and revisions of your API. Versions allow you to have different implementations of your API living next to each other, publicly available to your consumers, whereas revisions allow you to have a private new version of your API, in which you can develop and test changes. Once you are happy with the changes made in the revision, you can publish it at the click of a button, making the new revision the public API. This is very powerful, as it allows us to safely test our changes, and easily roll back in case of any issues.


Thanks to API Management we have the complete lifecycle of our APIs covered, going from the initial design, through the ALM story, all the way up to updating and deprovisioning.


Azure Logic Apps – Advanced integration patterns

Next up were Jeff Hollan and Derek Li, taking us behind the scenes of Logic Apps. Because of the massive scale these need to run at, there were many new challenges to solve. Logic Apps handles this by reading in the workflow definition and breaking it down into a composition of tasks and dependencies. These tasks are then distributed across various workers, each executing their own piece of the work. This allows for a high degree of parallelism, which is why Logic Apps can scale out indefinitely. It's important to take this knowledge into our own scenarios and think about how it might impact us: at high scale, tasks might not be processed in order, so our receiving systems need to account for this. Also, as Logic Apps provides at-least-once delivery, we should look into idempotency for our systems.


Derek Li showed us different kinds of patterns which can be used with Logic Apps, including parallel processing, exception handling, looping, timeouts, and the ability to control concurrency, which will be coming to the portal in the coming week. Using these patterns, Derek created a Logic App which sent out an approval email and, by adjusting the timeout and setting up exception handling on it, escalated to the approver's manager in case the approval was not processed within the timeout. These kinds of scenarios show how powerful Logic Apps has become, truly allowing for a customized flow.


Bringing Logic Apps into DevOps with Visual Studio and monitoring

After some mingling with like-minded people during the break at Integrate 2017 USA, it was time for another session by Kevin Lam and Jeff Hollan, which is always a pleasure to see. In this session we dove into the story around DevOps and ALM for Logic Apps. These days Logic Apps is a first-class citizen within Visual Studio, allowing us to create and modify them, and to pull in and control existing Logic Apps from Azure.


As the Logic Apps designer creates ARM templates, we can also add these to source control like VSTS. By using the CI/CD possibilities of VSTS, we can then automatically deploy our Logic Apps, allowing for a completely automated deployment process.


Integrating the last mile with Microsoft Flow

As pro-integrators, Microsoft Flow might not be the first tool that comes to mind, but it is actually a very interesting service. It allows us to create lightweight integrations, giving room to the idea of the democratization of integration. There is a plethora of templates available for Flow, allowing users to easily automate tasks, accessing both online and on-premises data. Custom connectors give us the option to expose any system we want. And as the templates are categorized in verticals, users can quickly find the ones which are useful to them.


Flow also has the option to use buttons, which can be physical buttons, from Flic or bttn, or programmatic buttons in the Flow app. This allows on-demand flows to be executed at the click of a button, and shared within your company. For those who want more control over the flows that can be built, and the data that can be accessed, there is the Admin center, which is available with Flow Plan 2.


Looking at the updates which have happened over the last few months, it's clear the team has been working hard, making Flow ready for your enterprise.


And even more great things are about to come, so make sure to keep an eye on this.


Deep dive into BizTalk technologies and tools

Yesterday (on Day 1 of Integrate 2017 USA), we heard the announcement of Feature Pack 2 for BizTalk. For me, one of the coolest features that will be coming is the ability to publish BizTalk endpoints through Azure API Management. This will allow us to easily expose an endpoint, either via SOAP pass-through or even with SOAP to REST, and take advantage of all the possibilities API Management brings us, like monitoring, throttling, authentication, etc. And all this with just a right-click on the port in BizTalk. Pretty amazing.


As we had seen in the first session of today, Microsoft has a huge integration landscape. With that many applications and artifacts, migration to a new BizTalk version can become quite the challenge. To overcome this, Microsoft IT created the BizTalk Server Migration Tool, and published the tool for us as well. The tool takes care of migrating your applications to a new BizTalk environment, taking care of dependencies, services, certificates and everything else.


Looking at the numbers, we can see how much effort this saves, while minimizing the risk of errors. The tool supports migration to BizTalk 2016 from any version from BizTalk 2010 onwards, and is certainly a great asset for anyone looking into migration. So if you are running an older version of BizTalk, remember to migrate in time, to avoid running out of the support timelines we saw yesterday.


What’s there & what’s coming in BizTalk360

Next up, Saravana Kumar, CEO of BizTalk360 and founding father of Integrate, guided us through his top 10 features of BizTalk360. Having worked with the product since its first release, I can only say it has gone through an amazing journey, and has become the de-facto monitoring solution for BizTalk and its surrounding systems. It helps solve the challenges faced by anyone who has administered BizTalk, giving insights into your environment, adding monitoring and notifications, and providing fine-grained security control.


So Saravana's top 10 of BizTalk360 is as follows, and I pretty much agree with all of them.

1. Rich operational dashboards, showing you the health of your environment in one single place

2. Fine grained security and auditing, so you can give your users access to only those things they need, without opening up your complete system

3. Graphical message flow, providing an end to end view of your message flows

4. Azure + BizTalk Server (single management tool), because Azure is becoming very important in most integrations these days

5. Monitoring – complete coverage, allowing us to monitor and, even more importantly, be notified of any issue in your environment

6. Data (no events) monitoring, giving us monitoring on what’s not happening as well, for example expected messages not coming in

7. Auto healing – from failures, to make sure your environment keeps running, automatically coming back up after issues, either from mechanical or human causes

8. Scheduled reporting, which will be coming in the next version, creating reports about your environment on a regular basis

9. Analytics & messaging patterns, giving even more insights into what is happening using graphical charts and widgets

10. Throttling analyser, because anyone who has ever needed to solve a throttling problem knows how difficult this can be, having to keep track of various performance counters; this feature provides a nice graphical overview and historical insights

11. Team knowledgebase, one more bonus feature that should really be mentioned; the knowledgebase is used to link articles to specific errors and faults, making sure this knowledge is readily available in your company

Of course, this is not all; BizTalk360 has a lot more great features, and I recommend everyone to go and check it out.


Give your Bots connectivity, with Azure Logic Apps

Kent Weare, former MVP and now Principal Program Manager at Microsoft on the Flow team, took us on a journey into bots, and into giving them connectivity with Logic Apps and Flow. First, setting the stage: we have all heard about digital transformation, but what is it all about? Digital transformation has become a bit of a buzzword, but the idea behind it is actually quite intriguing: using digital means to provide more value and new business models. The following quote shows this quite nicely.

“Digital transformation is the methodology in which organizations transform, create new business models and culture with digital technologies” – Ray Wang, Constellation Research

An important part here is that the culture in the organization will need to change as well, so go out and become a change agent within your organization.


Next we go on to bots, which can be used to reduce barriers and empower users through conversational apps. With the rise of various messenger applications like WhatsApp and Facebook Messenger, there is a huge market to be reached here. There are many different kinds of bots, but they all have a common way of working, often incorporating cognitive services like Language Understanding Intelligence Service (LUIS) to make the bot more human friendly.


When we want to build our own bots, we have different possibilities here as well, depending on your background and skills. Kent had a great slide on this, making it clear you don’t have to be a pro integrator anymore to make compelling bots.


In his demos, Kent showed different ways to build a bot. The first used the Bot Framework with Logic Apps and Cognitive Services to make a complex, completely tailored bot. For the other two demos, he used Microsoft Flow in combination with Bizzy, a very cool connector which allows us to create a “question-answer bot”, analyzing the input from the user and making decisions on it. Finally, the ability to migrate Flow implementations to Logic Apps was demonstrated, allowing users to start with a simple integration in Flow, but seamlessly migrate it to Logic Apps when more complexity is needed over the lifecycle of the integration.


Empowering the business using Logic Apps

And closing this second day of Integrate 2017 USA, we had Steef-Jan Wiggers, with a view from the business side on Logic Apps. A very interesting session, as instead of just going deep down into the underlying technologies, he actually went and looked for the business value we can add using these technologies, which in the end is what it is all about. Serverless integration is a great way to provide value for your business, lowering costs and allowing for easy and massive scaling.


Steef-Jan went out to several companies who are actually using and implementing Azure, including Phidiax, MyTE, Mexia and ServiceBus360. The general consensus among them is that Logic Apps and the other Azure services are indeed adding value to their business, as they give them the ability to set up powerful new scenarios quickly and easily.


With several great demos and customer cases, Steef-Jan showed how he has already helped many customers add value to their business with these integrations. The integration platform as a service is here to stay, and according to Gartner, iPaaS will actually be the preferred option for new projects by 2019. He also went out, this time to the community leaders and experts, to get their take on Logic Apps. The conclusion: these days Logic Apps is a mature and powerful tool in the iPaaS landscape.


So that was the end of the second day at Integrate 2017 USA, another day full of great sessions, inspiring demos, and amazing presenters. With one more day to go, Integrate 2017 USA is again one of the best events out there.

Check out the recap of Day 1 and Day 3 at Integrate 2017 USA.


Integration Patterns In Azure – Message Router Using Logic Apps

When implementing software, it's always a good idea to follow existing patterns, as these allow us to use proven and reliable techniques. The same applies in integration, where we have been working with integration patterns in technologies like BizTalk, MSMQ, etc. These days we are working more and more with new technologies in Azure, giving us new tools like Service Bus, Logic Apps and, recently, Event Grid. But even though we are working with new tools, these integration patterns are still very useful, and should be followed whenever possible. This post is the first in a series in which I will show how we can implement integration patterns using various services in Azure.

Message Router Pattern

The first pattern is the Message Router, which routes a message to different endpoints depending on a set of conditions evaluated against the contents or metadata of the message. We will implement this pattern with different technologies, focusing on Logic Apps in this post. For this sample we will implement a scenario where we receive orders and, depending on the city where the order should be delivered, route each order to a specific carrier.

Scenario

When using Logic Apps, we can easily route a message to various endpoints based on its contents. Using a Logic App in combination with the Message Router pattern is especially useful when we have the following requirements:

  • Different types of endpoints; the power of Logic Apps lies in the many connectors we get out of the box, allowing us to easily integrate with various systems like SQL, Dynamics CRM, Salesforce, etc.
  • Small number of endpoints; as we will be using a switch in our Logic App, managing these becomes cumbersome when we have many endpoints.

In this sample we write the messages to GitHub Gists, but you could easily replace this with other destinations. We use an HTTP trigger, meaning we receive the message on an HTTP endpoint, where the message format is as follows.

{
    "Address":"Kings Cross 20",
    "City":"New York",
    "Name":"Eldert Grootenboer"
}

We use a switch to determine the endpoint to which we will send our message, based on the city inside the message body, and send the message to that endpoint, in this case using an HTTP action. Of course, we could send the message to any other type of endpoint from the cases inside the switch as well. Finally, we respond with the location of the Gist where the message was placed.
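
In the workflow definition this boils down to a Switch action on the City field of the trigger body. The fragment below is an abbreviated sketch with a single, hypothetical carrier endpoint; the full template, which writes to Gists, is available on the Azure Quickstart Templates site.

"Switch_City": {
    "type": "Switch",
    "expression": "@triggerBody()?['City']",
    "cases": {
        "Case_NewYork": {
            "case": "New York",
            "actions": {
                "Send_To_Carrier": {
                    "type": "Http",
                    "inputs": {
                        "method": "POST",
                        "uri": "https://carrier-newyork.example.com/orders",
                        "body": "@triggerBody()"
                    }
                }
            }
        }
    },
    "default": {
        "actions": {}
    }
}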

Logic App Implementation

You can easily deploy this solution from the Azure Quickstart Templates site, or use the button below to deploy it directly to your own Azure environment.
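
Once deployed, you can test the routing by posting an order to the Logic App's HTTP endpoint. Below is a quick sketch using HttpClient; the callback URL is a placeholder, so copy the real one from the HTTP trigger in the portal.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class MessageRouterTest
{
        public static async Task Main()
        {
                // Callback URL of the Logic App's HTTP trigger (placeholder)
                const string url = "https://<your-logic-app-endpoint>/invoke?api-version=2016-10-01&sig=<signature>";

                // Order message matching the format shown above
                const string order = "{ \"Address\": \"Kings Cross 20\", \"City\": \"New York\", \"Name\": \"Eldert Grootenboer\" }";

                using (var client = new HttpClient())
                {
                        // Post the order; the response contains the location of the Gist that was created
                        var response = await client.PostAsync(url, new StringContent(order, Encoding.UTF8, "application/json"));
                        Console.WriteLine(await response.Content.ReadAsStringAsync());
                }
        }
}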

Custom Subscribers In Azure Event Grid

After having shown how to send custom events to Event Grid in my previous blog post, we will now see how we can create custom subscribers. Event Grid will be integrated with all Azure services, but by allowing us to create our own custom subscribers as well, we can truly route events to any service or application. What's more, we will also see how we can use the API to create subscriptions from our subscriber, allowing us to quickly onboard new services, which can then start listening to the events that are of interest to them. In this sample, we will create an Azure API App which receives the events from my previous blog post and stores them in Azure Table Storage. On startup, the API App will check whether the subscriptions it uses exist, and if not, create them and point them to the various endpoints the API App exposes.

Azure Event Grid

As we will be using Table Storage to store the data from the events, we will start by creating this in a storage account. I am using the storage account I created in this post, but of course you can also set up a new account for this.

Create Table Storage

Add Table Storage

Now we will create the API App which will subscribe to our events. Start by creating a new ASP.NET Web Application project.

Create ASP.NET Web Application project

Now choose the Azure API App template.

Create Azure API App

Connect Storage Account

As we will be connecting to Table Storage, we will add Azure Storage as a connected service on the overview pane.

Add Azure Storage

Choose the storage account in which we created the Table Storage. Optionally you can also create a new Storage Account from here.

Choose Storage Account

Application Settings

We can now start implementing our API App. Let’s start by adding some application settings in the web.config file. These will be needed later on in our application. Once the API App has been deployed we will recreate these settings on the API App’s application settings as well. These are the settings which need to be created.

  • AzureStorageConnectionString
    • This is the connection string for the Storage Account, which can be retrieved in the Azure Portal.
  • SubscriptionID
    • GUID with your subscription ID, can also be retrieved from the Azure portal.
  • AuthorizationToken
    • In this sample I will be using a bearer token to connect to the Azure management API. Instructions on how to get this token can be found in this blogpost by Toon, follow the instructions under Authenticate With Service Principal.
      In a production environment you would implement this using the Azure AD SDK, as the bearer token will expire after a few hours.
  • ResourceGroup
    • The name of the resource group in which the Event Grid Topic has been created.
  • EventGridTopic
    • The name of the Event Grid Topic to which we want to subscribe.
  • ApiAppUrl
    • URL on which the API App can be reached, which will be used in the subscription endpoints. We will know this once the API App has been deployed, at which time we can update it in the API App's application settings. Keep in mind that we need an https endpoint for our Event Grid subscriptions, as http will throw an error when creating the subscription. Luckily API Apps come with an https endpoint out of the box.

Add application settings

Data Classes

We will now create two new classes, which will be used to receive the repair and order events we sent in the previous blog post. The first class represents the data we sent in the Data node of our custom event.

/// <summary>
/// Data which can be sent with various ship events.
/// </summary>
public class ShipEventData
{
        /// <summary>
        /// Name of the ship.
        /// </summary>
        public string Ship { get; set; }
 
        /// <summary>
        /// Type of event.
        /// </summary>
        public string Type { get; set; }
 
        /// <summary>
        /// Device received in the event.
        /// </summary>
        public string Device { get; set; }
 
        /// <summary>
        /// Description received in the event.
        /// </summary>
        public string Description { get; set; }
 
        /// <summary>
        /// Product received in the event.
        /// </summary>
        public string Product { get; set; }
 
        /// <summary>
        /// Amount received in the event.
        /// </summary>
        public int? Amount { get; set; }
}

And the second class is the event we will receive from Event Grid.

/// <summary>
/// Class used to receive ship event values.
/// </summary>
public class ShipEventValue
{
        /// <summary>
        /// Time when event was created.
        /// </summary>
        public string EventTime;
 
        /// <summary>
        /// Data of the event.
        /// </summary>
        public ShipEventData Data;
}

Now let’s implement the Subscription class, which will be used to create the subscriptions we need for our sample when the API App starts.

/// <summary>
/// Defines a subscription with its filters.
/// </summary>
public class Subscription
{
        /// <summary>
        /// Name of the subscription.
        /// </summary>
        public string Name;
 
        /// <summary>
        /// Filter which will look at the start of the subscription's subject.
        /// </summary>
        public string PrefixFilter;
 
        /// <summary>
        /// Filter which will look at the end of the subscription's subject.
        /// </summary>
        public string SuffixFilter;
}

We will also need a class which will be used to insert our data into the Table Storage.

/// <summary>
/// Used to insert ship events to Table Storage.
/// </summary>
public class ShipEventEntity : TableEntity
{
        /// <summary>
        /// Constructor.
        /// </summary>
        public ShipEventEntity(string ship, string dateTime)
        {
                PartitionKey = ship;
                RowKey = dateTime;
        }
 
        /// <summary>
        /// Type of event.
        /// </summary>
        public string Type { get; set; }
 
        /// <summary>
        /// Device received in the event.
        /// </summary>
        public string Device { get; set; }
 
        /// <summary>
        /// Description received in the event.
        /// </summary>
        public string Description { get; set; }
 
        /// <summary>
        /// Product received in the event.
        /// </summary>
        public string Product { get; set; }
 
        /// <summary>
        /// Amount received in the event.
        /// </summary>
        public int? Amount { get; set; }
}

Controller

The controller is used to expose our methods to the outside world. In this case, we will provide four endpoints for different types of subscriptions. Each method will be called by its subscription on different events, and write the data it receives to its own table in Table Storage. In a production implementation, these would probably be four different services, for different parties interested in the events (for example, a specific ship might have to get its orders from supplier A, while another ship gets its orders from supplier B).

We will change the name of the default ValuesController class to SubscribersController to better represent our scenario, and instantiate a CloudStorageAccount used to communicate with our Table Storage.

public class SubscribersController : ApiController
{
        /// <summary>
        /// Storage account used to store to Table Storage.
        /// </summary>
        private readonly CloudStorageAccount _storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("AzureStorageConnectionString"));
}

Add the following method to the class, which will take the data we receive on our endpoints, and store it into Table Storage.

/// <summary>
/// Insert Ship Event into table storage.
/// </summary>
private async Task InsertToTable(IReadOnlyList<ShipEventValue> value, string tableName)
{
        // Check if any events were received
        if (value == null || value.Count == 0)
        {
                return;
        }
 
        // Create the table client
        var tableClient = _storageAccount.CreateCloudTableClient();
 
        // Retrieve a reference to the table
        var table = tableClient.GetTableReference(tableName);
 
        // Create the table if it doesn't exist
        table.CreateIfNotExists();
 
        // Create a new ship event entity
        var shipEventEntity = new ShipEventEntity(value[0].Data.Ship, value[0].EventTime)
        {
                Type = value[0].Data.Type,
                Product = value[0].Data.Product,
                Amount = value[0].Data.Amount,
                Device = value[0].Data.Device,
                Description = value[0].Data.Description
        };
 
        // Create the TableOperation object that inserts the customer entity
        var insertOperation = TableOperation.Insert(shipEventEntity);
 
        // Execute the insert operation
        await table.ExecuteAsync(insertOperation);
}

And the final piece in this class are the methods for the endpoints. Notice the ActionName attributes, which we will use to have various endpoints in our API.

/// <summary>
/// Receives all events.
/// </summary>
[ActionName("All")]
public async Task<StatusCodeResult> PostAll([FromBody] List<ShipEventValue> value)
{
        await InsertToTable(value, "All");
        return new StatusCodeResult(HttpStatusCode.Created, this);
}
 
/// <summary>
/// Receives all types of events for the ship Hydra.
/// </summary>
[ActionName("Hydra")]
public async Task<StatusCodeResult> PostHydra([FromBody] List<ShipEventValue> value)
{
        await InsertToTable(value, "Hydra");
        return new StatusCodeResult(HttpStatusCode.Created, this);
}
 
/// <summary>
/// Receives repairs for all ships.
/// </summary>
[ActionName("Repairs")]
public async Task<StatusCodeResult> PostRepairs([FromBody] List<ShipEventValue> value)
{
        await InsertToTable(value, "Repairs");
        return new StatusCodeResult(HttpStatusCode.Created, this);
}
 
/// <summary>
/// Receives orders for the ship Aeris.
/// </summary>
[ActionName("AerisOrders")]
public async Task<StatusCodeResult> PostAerisOrders([FromBody] List<ShipEventValue> value)
{
        await InsertToTable(value, "AerisOrders");
        return new StatusCodeResult(HttpStatusCode.Created, this);
}

Configure Routes

Now hop on over to the WebApiConfig class, and implement the following code. This will generate the different endpoints for our Controller actions.

public static class WebApiConfig
{
        public static void Register(HttpConfiguration config)
        {
                // Web API routes 
                config.MapHttpAttributeRoutes();
                config.Routes.MapHttpRoute(name: "routes", routeTemplate: "api/{controller}/{action}");
        }
}

Subscriptions Creation

Finally we need to implement the Global.asax class, in which we will create our Event Grid Subscriptions on start up of the API App. This is where we define the subscriptions to be created, including their filters. Event Grid allows us to filter on the subject’s prefix and suffix, as well as the event type.

public class WebApiApplication : HttpApplication
{
        /// <summary>
        /// Subscriptions to be created.
        /// </summary>
        private readonly List<Subscription> _subscriptions = new List<Subscription>
        {
                new Subscription { Name = "All" },
                new Subscription { Name = "Hydra", PrefixFilter = "Hydra" },
                new Subscription { Name = "Repairs", SuffixFilter = "Repair" },
                new Subscription { Name = "AerisOrders", PrefixFilter = "Aeris", SuffixFilter = "Order" }
        };
}

Currently we don't have an SDK available to work with Event Grid, so we will be using an HttpClient to work directly against its API.

/// <summary>
/// Create HTTP client used to communicate with Azure.
/// </summary>
/// <returns></returns>
private static HttpClient CreateHttpClient()
{
        // Create a HTTP client
        var httpClient = new HttpClient();
 
        // Add key in the request headers
        httpClient.DefaultRequestHeaders.Add("Authorization", $"Bearer {CloudConfigurationManager.GetSetting("AuthorizationToken")}");
 
        // Return the HTTP client
        return httpClient;
}

For each subscription, we will need to check whether it already exists. This allows us to add new subscriptions whenever we want.

/// <summary>
/// Check if subscription exists.
/// </summary>
private static async Task<bool> SubscriptionExists(string subscription)
{
        // Check if subscription exists
        var result = await CreateHttpClient()
                .GetAsync(
                        $"https://management.azure.com/subscriptions/{CloudConfigurationManager.GetSetting("SubscriptionID")}/resourceGroups/{CloudConfigurationManager.GetSetting("ResourceGroup")}/providers/Microsoft.EventGrid/topics/{CloudConfigurationManager.GetSetting("EventGridTopic")}/providers/Microsoft.EventGrid/eventSubscriptions/{subscription}?api-version=2017-06-15-preview");
        return result.IsSuccessStatusCode;
}

If the specific subscription does not yet exist, we will create it using the following code.

/// <summary>
/// Create subscription with filters.
/// </summary>
private static async Task CreateSubscription(string subscription, string prefixFilter, string suffixFilter)
{
        // Set up create subscription message
        var createSubscription = new
        {
                properties = new
                {
                        destination = new { endpointType = "webhook", properties = new { endpointUrl = $"{CloudConfigurationManager.GetSetting("ApiAppUrl")}/api/Subscribers/{subscription}" } },
                        filter = new { includedEventTypes = new[] { "shipevent" }, subjectBeginsWith = prefixFilter, subjectEndsWith = suffixFilter, subjectIsCaseSensitive = "false" }
                }
        };
 
        // Create content to be sent
        var json = JsonConvert.SerializeObject(createSubscription);
        var content = new StringContent(json, Encoding.UTF8, "application/json");
 
        // Create subscription
        await CreateHttpClient()
                .PutAsync(
                        $"https://management.azure.com/subscriptions/{CloudConfigurationManager.GetSetting("SubscriptionID")}/resourceGroups/{CloudConfigurationManager.GetSetting("ResourceGroup")}/providers/Microsoft.EventGrid/topics/{CloudConfigurationManager.GetSetting("EventGridTopic")}/providers/Microsoft.EventGrid/eventSubscriptions/{subscription}?api-version=2017-06-15-preview",
                        content);
}

And finally implement the method which will loop over our subscriptions, creating the ones we need. Call this method whenever the application is started.

/// <summary>
/// Entry point of application.
/// </summary>
protected async void Application_Start()
{
        GlobalConfiguration.Configure(WebApiConfig.Register);
        await CreateSubscriptions();
}
 
/// <summary>
/// Create subscriptions that don't exist.
/// </summary>
private async Task CreateSubscriptions()
{
        // Check if subscriptions can be created, this will only be done if the endpoint of this API App has been updated in the settings
        if (CloudConfigurationManager.GetSetting("ApiAppUrl").ToLowerInvariant().Contains("tobereplaced"))
        {
                return;
        }
 
        // Loop through subscriptions
        foreach (var subscription in _subscriptions)
        {
                // Check if subscription already exists
                if (await SubscriptionExists(subscription.Name))
                {
                        continue;
                }
 
                // Create subscription
                await CreateSubscription(subscription.Name, subscription.PrefixFilter, subscription.SuffixFilter);
 
                // Wait for a while, to prevent throttling
                Thread.Sleep(5000);
        }
}

Deployment

Now that our API App has been completed, we can deploy it to Azure. Right-click on the project, and select Publish. This will open the publishing wizard. Create a new API App, and set the properties for where you want it to be deployed.

Publish API App

Once it has been deployed, we need to add the application settings for the API App in the portal.

Add application settings

After the settings have been added, restart the API App. This will now start the creation of the subscriptions, which you should be able to see in the Event Grid blade after a minute or so.

Event Grid subscriptions have been created by the API App

Testing

Now that we have our events set up end to end, we can use the application from this blog post to start generating events. These will then be routed by Event Grid to the endpoints of our subscriptions, which trigger the API App’s different methods. And finally, they will be stored into the various tables in our table storage. The complete code with this blog post can be found here.
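
If you prefer to verify the results from code instead of the portal, you can also read the rows back from Table Storage. Below is a small sketch; it uses the non-generic TableQuery so we don't need a parameterless constructor on ShipEventEntity, and the table name is one of the tables created by the controller.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class TableChecker
{
        /// <summary>
        /// Dump the contents of one of the event tables, e.g. "Repairs".
        /// </summary>
        public static void DumpTable(string connectionString, string tableName)
        {
                // Connect to the table used by the corresponding subscription endpoint
                var table = CloudStorageAccount.Parse(connectionString)
                        .CreateCloudTableClient()
                        .GetTableReference(tableName);

                // PartitionKey holds the ship name, RowKey the event time
                foreach (var entity in table.ExecuteQuery(new TableQuery()))
                {
                        Console.WriteLine($"{entity.PartitionKey} at {entity.RowKey}: {entity.Properties["Type"].StringValue}");
                }
        }
}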

Events stored in Table Storage

Sending Custom Events To Azure Event Grid

In my previous post I showed how we can use the recently announced Event Grid service to integrate Azure services with each other. In this post, I will show how we can send custom events from any application or service into Event Grid, allowing us to integrate any service. My next post will show you in detail how we can subscribe to these custom events.

Integrate any service or application

We will start by creating a new Event Grid Topic in the portal. Go to the Event Grid Topics blade, add a new Topic, and provide the details.

Create Event Grid Topic

We will now create an application which will send custom events into our topic. By setting the subject and event type, we can later subscribe to specific events. I used an application I have built for a shipping company, which they use for their day-to-day work, as inspiration. In this sample, we build a console app which generates events when an order has been placed, as well as when a repair has been requested.

Create Console App

Data Classes

Once created, we will first add the classes which will represent the data of the orders and repairs. Both of these inherit from the ShipEvent class, which holds the common data.

/// <summary>
/// Event sent for a specific ship.
/// </summary>
public class ShipEvent
{
        /// <summary>
        /// Name of the ship.
        /// </summary>
        public string Ship { get; set; }
 
        /// <summary>
        /// Type of event.
        /// </summary>
        public string Type { get; set; }
}

/// <summary>
/// Used to place an order.
/// </summary>
public class Order : ShipEvent
{
        /// <summary>
        /// Name of the product.
        /// </summary>
        public string Product { get; set; }
 
        /// <summary>
        /// Number of items to be ordered.
        /// </summary>
        public int Amount { get; set; }
 
        /// <summary>
        /// Constructor.
        /// </summary>
        public Order()
        {
                Type = "Order";
        }
}

/// <summary>
/// Used to request a repair.
/// </summary>
public class Repair : ShipEvent
{
        /// <summary>
        /// Device which needs to be repaired.
        /// </summary>
        public string Device { get; set; }
 
        /// <summary>
        /// Description of the defect.
        /// </summary>
        public string Description { get; set; }
 
        /// <summary>
        /// Constructor.
        /// </summary>
        public Repair()
        {
                Type = "Repair";
        }
}

Custom Event Class

Now we will create our custom Event class, which follows the structure of the Event Grid events schema. In this sample, we will only send out shipevent types of events, but you could easily expand this with other event types as well. We use the UpdateProperties property to update the Subject, which will later on be used to filter our messages in our subscriptions. Here we also include our Data, which is our payload of the Order or Repair we are sending.

/// <summary>
/// Event to be sent to Event Grid Topic.
/// </summary>
public class Event
{
        /// <summary>
        /// This will be used to update the Subject and Data properties.
        /// </summary>
        public ShipEvent UpdateProperties
        {
                set
                {
                        Subject = $"{value.Ship}/{value.Type}";
                        Data = value;
                }
        }
 
        /// <summary>
        /// Gets the unique identifier for the event.
        /// </summary>
        public string Id { get; }
 
        /// <summary>
        /// Gets the publisher defined path to the event subject.
        /// </summary>
        public string Subject { get; set; }
 
        /// <summary>
        /// Gets the registered event type for this event source.
        /// </summary>
        public string EventType { get; }
 
        /// <summary>
        /// Gets the time the event is generated based on the provider's UTC time.
        /// </summary>
        public string EventTime { get; }
 
        /// <summary>
        /// Gets or sets the event data specific to the resource provider.
        /// </summary>
        public ShipEvent Data { get; set; }
 
        /// <summary>
        /// Constructor.
        /// </summary>
        public Event()
        {
                Id = Guid.NewGuid().ToString();
                EventType = "shipevent";
                EventTime = DateTime.UtcNow.ToString("o");
        }
}
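
To illustrate that extensibility, another event type only needs its own ShipEvent subclass; since the Type value flows into the Subject, subscriptions can filter on it right away. As a sketch, a hypothetical Refuel event (my own example, not part of the original solution) could look like this.

/// <summary>
/// Used to request refueling (hypothetical example of an additional event type).
/// </summary>
public class Refuel : ShipEvent
{
        /// <summary>
        /// Type of fuel to be bunkered.
        /// </summary>
        public string FuelType { get; set; }

        /// <summary>
        /// Constructor.
        /// </summary>
        public Refuel()
        {
                Type = "Refuel";
        }
}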

Settings

In the Program class, add two constants. The first will hold the endpoint of the Event Grid Topic we just created, while the second will hold the key used to connect to the topic.

Grab the Topic endpoint and key

/// <summary>
/// Send events to an Event Grid Topic.
/// </summary>
public class Program
{
        /// <summary>
        /// Endpoint of the Event Grid Topic.
        /// Update this with your own endpoint from the Azure Portal.
        /// </summary>
        private const string TOPIC_ENDPOINT = "https://eventgridcustomeventstopic.westcentralus-1.eventgrid.azure.net/api/events";
 
        /// <summary>
        /// Key of the Event Grid Topic.
        /// Update this with your own key from the Azure Portal.
        /// </summary>
        private const string KEY = "1yroJHgswWDsc3ekc94UoO/nCdClNOwEuqV/HuzaaDM=";
}

Send To Event Grid

Next we will add the method to this class which will be used to send our custom events to the Topic. At this moment we don't have an SDK available yet, but luckily Event Grid does expose a powerful API, which we can call using an HttpClient. When using this API, we need to send an aeg-sas-key header containing the key we retrieved from the portal.

/// <summary>
/// Send events to Event Grid Topic.
/// </summary>
private static async Task SendEventsToTopic(object events)
{
        // Create a HTTP client which we will use to post to the Event Grid Topic
        var httpClient = new HttpClient();
 
        // Add key in the request headers
        httpClient.DefaultRequestHeaders.Add("aeg-sas-key", KEY);
 
        // Event grid expects event data as JSON
        var json = JsonConvert.SerializeObject(events);
 
        // Create request which will be sent to the topic
        var content = new StringContent(json, Encoding.UTF8, "application/json");
 
        // Send request
        Console.WriteLine("Sending event to Event Grid...");
        var result = await httpClient.PostAsync(TOPIC_ENDPOINT, content);
 
        // Show result
        Console.WriteLine($"Event sent with result: {result.ReasonPhrase}");
        Console.WriteLine();
}
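
One remark on the method above: creating a new HttpClient for every call is fine in a sample, but in a long-running application you would typically reuse a single instance to avoid socket exhaustion. A minimal variant (my own adjustment, not part of the original sample) hoists the client into a static field and sets the key header once, after which the first two lines of SendEventsToTopic can be removed.

/// <summary>
/// Single HttpClient instance, reused for all requests.
/// </summary>
private static readonly HttpClient httpClient = new HttpClient();

/// <summary>
/// Static constructor, which sets the authentication header once.
/// </summary>
static Program()
{
        httpClient.DefaultRequestHeaders.Add("aeg-sas-key", KEY);
}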

Main Method

And finally we will implement the Main method. In this method, the user specifies what they want to do, and provides the data for that action. We will then send this data to our Topic.

/// <summary>
/// Main method.
/// </summary>
public static void Main(string[] args)
{
        // Set default values
        var entry = string.Empty;
 
        // Loop until user exits
        while (entry != "e" && entry != "exit")
        {
                // Get entry from user
                Console.WriteLine("Do you want to send an (o)rder, request a (r)epair or (e)xit the application?");
                entry = Console.ReadLine()?.ToLowerInvariant();

                // Exit before asking for any further input
                if (entry == "e" || entry == "exit")
                {
                        continue;
                }

                // Get name of the ship
                Console.WriteLine("What is the name of the ship?");
                var shipName = Console.ReadLine();

                // Build the list of events to be sent
                var events = new List<Event>();
                switch (entry)
                {
                        case "o":
                        case "order":
                                // Get user input
                                Console.WriteLine("What would you like to order?");
                                var product = Console.ReadLine();
                                Console.WriteLine("How many would you like to order?");
                                var amount = Convert.ToInt32(Console.ReadLine());
 
                                // Create order event
                                // Event Grid expects a list of events, even when only one event is sent
                                events.Add(new Event { UpdateProperties = new Order { Ship = shipName, Product = product, Amount = amount } });
                                break;
                        case "r":
                        case "repair":
                                // Get user input
                                Console.WriteLine("Which device would you like to get repaired?");
                                var device = Console.ReadLine();
                                Console.WriteLine("Please provide a description of the issue.");
                                var description = Console.ReadLine();
 
                                // Create repair event
                                // Event Grid expects a list of events, even when only one event is sent
                                events.Add(new Event { UpdateProperties = new Repair { Ship = shipName, Device = device, Description = description } });
                                break;
                        default:
                                Console.Error.WriteLine("Invalid entry received.");
                                continue;
                }
 
                // Send to Event Grid Topic
                SendEventsToTopic(events).Wait();
        }
}

Testing

Now when we run our application, we can start sending events to Event Grid, after which we can use subscriptions to pick them up. More on these subscriptions can be found in my previous post and next post. The complete code for this blog post can be found here.
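
For reference, the request body which SendEventsToTopic posts for a single order should look something like the following (the identifier, timestamp and values are illustrative). Note that the payload is an array, as Event Grid expects a list of events, and that the write-only UpdateProperties property does not show up in the serialized output.

[
        {
                "Id": "f4f92d1f-3a60-4a52-bd70-0d3bd5b9c9a2",
                "Subject": "Hydra/Order",
                "EventType": "shipevent",
                "EventTime": "2017-08-14T12:34:56.7890123Z",
                "Data": {
                        "Product": "Engine oil",
                        "Amount": 25,
                        "Ship": "Hydra",
                        "Type": "Order"
                }
        }
]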

Send events from the application

Integrate 2017 – Another Amazing Event!

Last week, June 26th to 28th, Integrate 2017 was once again held in London. This is the largest integration-focused event, and a great way to have fun with the community, see amazing sessions, and get to meet the product groups. I have been to these events since the beginning, and have seen it grow into one of the best events around.

Last year at Integrate, we were introduced to the vision of Hybrid Integration, a way of seamlessly integrating across the cloud and on-premises. This year, what we saw was a matured vision, with all the bits and pieces falling into place, giving us the tools to build great solutions. I think Microsoft made a good bet on this, as we see more and more of these hybrid integrations being built for our customers, where some of the data or logic remains on-premises, but they do want to leverage the power and flexibility of Azure.


The Microsoft teams as well as the MVPs had very engaging sessions, which truly inspired people to get their hands on all the parts of the hybrid platform.

I always love seeing the sessions, getting the latest information, and being inspired, but the best part of these events to me is the interaction with the community. I think we have one of the best communities around, so it's great to catch up with old friends from around the globe, and of course to make a lot of new friends as well. I recommend that everyone visit one of these events for themselves, and just say hi to people you do not know yet. People are always willing to have a chat, share their inspiration and experiences, and grab a beer afterwards.

And for those who can't wait another year for Integrate, there is good news: Integrate will also be held in Redmond, October 25th to 27th. If you could not come to London, or just can't get enough of Integrate and our amazing community, be sure to come. I'm looking forward to seeing you there, so come and say hi.


As one of the global organizers of the Global Integration Bootcamp, I am proud to say we have also presented next year's version of this global event. We will be holding GIB2018 on March 24th, 2018; if you want to host your own location, just drop us a line. The website will be updated soon, but you can already find our contact details there.

Of course, I am not the only one posting about this year's amazing Integrate; there are already a couple of great recaps out there. Here are some you cannot miss.

BizTalk360
Kent Weare
Steef-Jan
Daniel Toomey
Wagner Silveira
Codit

TUGA IT 2017 – Recap of an amazing event

Last week I was in Lisbon for TUGA IT, one of the greatest events here in Europe. A full day of workshops, followed by two days of sessions in multiple tracks, with attendees and presenters from all around Europe. For those who missed it this year, make sure to be there next time!


On Saturday I did a session on Industrial IoT using Azure IoT Hub. The industrial space is where we will be seeing a huge growth in IoT, and I showed how we can use Azure IoT Hub to manage our devices and do bi-directional communication. Dynamics 365 was used to give a familiar and easy to use interface to work with these devices and visualize the data.

And of course, I was not alone. The other speakers in the integration track are community heroes and my good friends Sandro, Nino, Steef-Jan, Tomasso and Ricardo, who all gave amazing sessions as well. It is great to be able to present side by side with these amazing guys, to learn and discuss.

There were some other great sessions in the other tracks as well, like Karl's session on DevOps, Kris' session on the Bot Framework, and many more. At an event like this, there is so much content being presented that you can't always see every session you would like, but luckily the speakers are always willing to have a discussion with you outside of the sessions as well. And with 8 different tracks running side by side, there's always something interesting going on.

One of the advantages of attending all these conferences is that I get to see a lot of cities as well. This was the second time I was in Lisbon, and Sandro showed us a lot of beautiful spots in this great city. We enjoyed traditional food and drinks, a lot of ice cream, and had a lot of fun together.

Using IoT Hub for Cloud to Device Messaging

In the previous blog posts of this IoT Hub series, we have seen how we can use IoT Hub to administrate our devices, and how to do device to cloud messaging. In this post we will see how we can do cloud to device messaging, something which is much harder when not using Azure IoT Hub. IoT devices will normally be low-power, low-performance devices, like small-footprint devices and purpose-specific devices. This means they are not meant to (and most often won't be able to) run antivirus applications, firewalls, and other types of protection software. We want to minimize the attack surface they expose, meaning we can't open any ports or provide other means of remoting into them. IoT Hub uses Service Bus technologies to make sure no inbound traffic toward the device is needed; instead it uses per-device topics, allowing us to send commands and messages to our devices without making them vulnerable to attacks.

IoT Hub For Cloud To Device Messaging

Send Message To Device

When we want to send one-way notifications or commands to our devices, we can use cloud to device messages. To do this, we will expand on the EngineManagement application we created in our earlier posts, by adding the following controls, which, in our scenario, will allow us to start the fans of the selected engine.

IoT Hub For Cloud To Device Messaging

To be able to communicate with our devices, we will first implement a ServiceClient in our class.

private readonly ServiceClient serviceClient = ServiceClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey"); 

Next we implement the event handler for the Start Fans button. This type of communication targets a specific device by using the DeviceID from the device twin.

private async void ButtonStartFans_Click(object sender, EventArgs e)
{
    var message = new Microsoft.Azure.Devices.Message();
    message.Properties.Add(new KeyValuePair<string, string>("StartFans", "true"));
    message.Ack = DeliveryAcknowledgement.Full; // Used for getting delivery feedback
    await serviceClient.SendAsync(comboBoxSerialNumber.Text, message);
}

Process Message On Device

Once we have sent our message, we will need to process it on our device. For this, we are going to update the client application of our simulated engine (which we also created in the previous blog posts) by adding the following method.

private static async void ReceiveMessageFromCloud(object sender, DoWorkEventArgs e)
{
    // Continuously wait for messages
    while (true)
    {
        var message = await client.ReceiveAsync();

        // Check if message was received
        if (message == null)
        {
            continue;
        }

        try
        {
            if (message.Properties.ContainsKey("StartFans") && message.Properties["StartFans"] == "true")
            {
                // This would start the fans
                Console.WriteLine("Fans started!");

            }

            await client.CompleteAsync(message);
        }
        catch (Exception)
        {
            // Send to deadletter
            await client.RejectAsync(message);
        }
    }
}

We will run this method in the background, so update the Main method and insert the following code after the call which updates the firmware.

// Wait for messages in background
var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveMessageFromCloud;
backgroundWorker.RunWorkerAsync();
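
As a side note: because ReceiveMessageFromCloud is an async void method, the BackgroundWorker considers its work done as soon as the first await yields, while the receive loop itself keeps running on the thread pool. A plain fire-and-forget task (my own alternative, not from the original sample) would therefore behave the same.

// Equivalent alternative without a BackgroundWorker
Task.Run(() => ReceiveMessageFromCloud(null, null));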

Message Feedback

Although cloud to device messages are a one-way communication style, we can request feedback on the delivery of the message, allowing us to invoke retries or start compensation when the message fails to be delivered. To do this, implement the following method in our EngineManagement backend application.

private async void ReceiveFeedback(object sender, DoWorkEventArgs e)
{
    var feedbackReceiver = serviceClient.GetFeedbackReceiver();
    
    while (true)
    {
        var feedbackBatch = await feedbackReceiver.ReceiveAsync();

        // Check if feedback messages were received
        if (feedbackBatch == null)
        {
            continue;
        }

        // Loop through feedback messages
        foreach(var feedback in feedbackBatch.Records)
        {
            if(feedback.StatusCode != FeedbackStatusCode.Success)
            {
                // Handle compensation here
            }
        }

        await feedbackReceiver.CompleteAsync(feedbackBatch);
    }
}

And add the following code to the constructor.

var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveFeedback;
backgroundWorker.RunWorkerAsync();

Call Remote Method

Another feature when sending messages from the cloud to our devices is calling a remote method on the device, which is known as invoking a direct method. This type of communication is used when we want an immediate confirmation of the outcome of the command (unlike setting the desired state and communicating back reported properties, which was explained in the previous two blog posts). Let's update the EngineManagement application by adding the following controls, which will allow us to send an alarm message to the engine, sounding the alarm and displaying a message.

IoT Hub For Cloud To Device Messaging

Now add the following event handler for clicking the Send Alarm button.

private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
    var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };
    methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));

    CloudToDeviceMethodResult response = null;

    try
    {
        response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
    }
    catch (IotHubException)
    {
        // Do nothing
    }

    if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
    {
        MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
    else
    {
        MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
    }
}

And in our simulated device, implement the SoundAlarm remote method which is being called.

private static Task<MethodResponse> SoundAlarm(MethodRequest methodRequest, object userContext)
{
    // On a real engine this would sound the alarm as well as show the message
    Console.ForegroundColor = ConsoleColor.Red;
    Console.WriteLine($"Alarm sounded with message: {JObject.Parse(methodRequest.DataAsJson).GetValue("message").Value<string>()}! Type yes to acknowledge.");
    Console.ForegroundColor = ConsoleColor.White;
    var response = JsonConvert.SerializeObject(new { acknowledged = Console.ReadLine() == "yes" });
    return Task.FromResult(new MethodResponse(Encoding.UTF8.GetBytes(response), 200));
}

And finally, we need to map the SoundAlarm method to the incoming remote method call. To do this, add the following line in the Main method.

client.SetMethodHandlerAsync("SoundAlarm", SoundAlarm, null);

Call Remote Method On Multiple Devices

When invoking direct methods on devices, we can also use jobs to send the command to multiple devices. We can use our custom tags here to broadcast our message to a specific set of devices.
In this case, we will add a filter on the engine type and manufacturer, so we can, for example, send a message to all main engines manufactured by Caterpillar. In our first blog post, we added these properties as tags on the device twin, so we now use these in our filter. Start by adding the following controls to our EngineManagement application.

IoT Hub For Cloud To Device Messaging

Now add a JobClient to the application, which will be used to broadcast and monitor our messages.

private readonly JobClient jobClient = JobClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey");

To broadcast our message, update the event handler for the Send Alarm button to the following.

private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
    var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };

    methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));

    if (checkBoxBroadcast.Checked)
    {
        try
        {
            var jobResponse = await jobClient.ScheduleDeviceMethodAsync(Guid.NewGuid().ToString(), $"tags.engineType = '{comboBoxEngineTypeFilter.Text}' and tags.manufacturer = '{textBoxManufacturerFilter.Text}'", methodInvocation, DateTime.Now, 10);
            
            await MonitorJob(jobResponse.JobId);
        }
        catch (IotHubException)
        {
            // Do nothing
        }
    }
    else
    {
        CloudToDeviceMethodResult response = null;

        try
        {
            response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
        }
        catch (IotHubException)
        {
            // Do nothing
        }

        if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
        {
            MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
        }
        else
        {
            MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
        }
    }
}

And finally, add the MonitorJob method with the following implementation.

public async Task MonitorJob(string jobId)
{
    JobResponse result;

    do
    {
        result = await jobClient.GetJobAsync(jobId);

        // Use a non-blocking delay between polls, as this is an async method
        await Task.Delay(2000);
    }
    while (result.Status != JobStatus.Completed && result.Status != JobStatus.Failed);

    // Check if all devices reported success
    if (result.DeviceJobStatistics.FailedCount > 0)
    {
        MessageBox.Show("Not all engines reported success!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
    }
    else
    {
        MessageBox.Show("All engines reported success.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
}

Conclusion

By using IoT Hub we have a safe and secure way of communicating from the cloud and our backend to devices out in the field. We have seen how we can use cloud to device messages when we want to send one-way messages to our device, and direct methods when we want to be informed of the outcome of our invocation. By using jobs, we can also call out to multiple devices at once, limiting the devices being called by using (custom) properties of the device twin. The code for this post can be found here.

IoT Hub Blog Series

In case you missed the other articles from this IoT Hub series, take a look here.

Blog 1: Device Administration Using Azure IoT Hub
Blog 2: Implementing Device To Cloud Messaging Using IoT Hub
Blog 3: Using IoT Hub for Cloud to Device Messaging

Author: Eldert Grootenboer

Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT, BizTalk Server, and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010, and has since expanded into Azure and surrounding technologies as well. Eldert loves working on integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/.

TUGA IT 2017 – Be There!

Just under two weeks away, TUGA IT will be held once again in beautiful Lisbon. TUGA IT is three days full of sessions, workshops, meeting the experts, and having a great time. After having visited last year as a participant, I am honored to have been selected as one of the speakers this year.

On Saturday I will be giving a session where we will dive into industrial IoT on Azure. The industrial space is where we will be seeing huge growth in IoT, and I will be showing how we can use Azure IoT Hub to manage our devices and do bi-directional communication. I will also be showing how we can use Dynamics 365 to give a familiar and easy-to-use interface to work with these devices and visualize the data. And of course, the Azure stack will be used to extend the solution.

Integration is everywhere, and TUGA IT is no exception. We will have a full integration track on Saturday, where I will be joined by fellow integrators and good friends Steef-Jan, Sandro, Nino, Tomasso and Ricardo to give a day full of integration sessions.

Schedule overview Saturday including Integration track

But there is more than just integration going on. On Thursday there are several workshops, where you can learn about various Microsoft products and get hands-on experience. On Friday there are also multiple tracks, with sessions on many topics like Azure, .NET and more.

Schedule overview Thursday

Schedule overview Friday

As you can see, there is something for everyone at TUGA IT, whether you want to do workshops, attend sessions, or just want to meet MVPs and experts from various countries. Be sure to be there as well; registrations are open, and they only ask a small fee for lunch.
