by Sandro Pereira | Mar 6, 2020 | BizTalk Community Blogs via Syndication
I always pay attention to requests from members of the community, and whenever I can, I update this stencil pack with requested shapes or functionality. This is one of those cases: Josh asked me to add the Azure DevOps offering stencils, in particular Boards, Repos, Pipelines, Test Plans, and Artifacts.
The result was this. I hope you enjoy it.
What’s new in this version?
The main goal of this release was to provide the new icons present in the Azure Portal and update existing ones. In this version, the changes and additions are:
- New shapes: new shapes added to the MIS Developer Stencils;
- SVG files: added new SVG files and made all the filenames uniform;
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack is a Visio package that contains fully resizable Visio shapes (symbols/icons) to help you visually represent on-premises, cloud, or hybrid integration and enterprise architecture scenarios (BizTalk Server, API Management, Logic Apps, Service Bus, Event Hubs…), solution diagrams, and features or systems that use Microsoft Azure and related cloud and on-premises technologies in Visio 2016/2013:
- BizTalk Server
- Microsoft Azure
- Integration
- Integration Service Environments (ISE)
- Logic Apps and Azure App Service in general (API Apps, Web Apps, and Mobile Apps)
- Azure API Management
- Messaging: Event Hubs, Event Grid, Service Bus, …
- Azure IoT and Docker
- AI, Machine Learning, Stream Analytics, Data Factory, Data Pipelines
- SQL Server, DocumentDB, CosmosDB, MySQL, …
- and so on
- Microsoft Power Platform
- Microsoft Flow
- PowerApps
- Power BI
- Office365, SharePoint,…
- DevOps and PowerShell
- Security and Governance
- And much more…
- … and now also technologies not related to Microsoft
The Microsoft Integration Stencils Pack is composed of 27 files:
- Microsoft Integration Stencils
- MIS Additional or Support Stencils
- MIS AI and Machine Learning Stencils
- MIS Apps and Systems Logo Stencils
- MIS Azure Additional or Support Stencils
- MIS Azure Mono Color
- MIS Azure Old Versions
- MIS Azure Others Stencils
- MIS Azure Stencils
- MIS Buildings Stencils
- MIS Databases and Analytics Stencils
- MIS Deprecated Stencils
- MIS Developer Stencils
- MIS Devices Stencils
- MIS Files Stencils
- MIS Generic Stencils
- MIS Infrastructure Stencils
- MIS Integration Fun
- MIS Integration Patterns Stencils
- MIS IoT Devices Stencils
- MIS Office365
- MIS Power BI Stencils
- MIS PowerApps and Flows Stencils
- MIS SAP Stencils
- MIS Security and Governance
- MIS Servers (HEX) Stencils
- MIS Users and Roles Stencils
All of which you can use and resize without losing quality, in particular the new shapes.
Download
You can download the Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio from:
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio
GitHub
or from:
You can download Microsoft Integration Stencils Pack for Visio 2016/2013 from:
Microsoft Integration Stencils Pack for Visio 2016/2013 (10.1 MB)
Microsoft | TechNet Gallery
The post Microsoft Integration and Azure Stencils Pack for Visio: New version available (v6.1.0) appeared first on SANDRO PEREIRA BIZTALK BLOG.
by Sandro Pereira | Jan 19, 2020 | BizTalk Community Blogs via Syndication
Only nine days ago I released the latest minor version of this package; at that time, the goal was to update the Security and Governance stencils to please my dear friend Nino Crudele. It was then that I noticed that Microsoft had once again completely redesigned several of the symbols and, of course, added new services, so I decided it was time not only to refresh the package with the new icons but also to put in some extra work.
What’s new in this version?
With the growing number of stencils in this package, it was becoming hard to find the right shape or representation. So, once again based on feedback I received from the community, I decided to do some rearranging. I still have many things to do in this project in terms of organization, but for now these are the changes in this major release:
- New shapes: The main additions are all the new shapes on the Azure Portal. New shapes added on:
- MIS Azure Stencils: containing all the main Azure Portal Services
- MIS AI and Machine Learning Stencils: all shapes related to AI or Machine Learning scenarios;
- MIS IoT Devices Stencils: all shapes related to IoT scenarios;
- Microsoft Integration Stencils;
- MIS Azure Others Stencils: containing Azure Portal services, features of those services, and other interesting shapes;
- MIS Azure Additional or Support Stencils: other shapes that may be interesting in supporting Azure diagrams and designs or presentations;
- MIS Azure Stencils: Complete update to this category with many new shapes added and many shapes moved to MIS Azure Others Stencils file;
- MIS AI and Machine Learning Stencils: complete update to this category, with many new shapes added and many others updated to their current versions;
- Added categories:
- Microsoft Integration Stencils Old Version Stencils: containing all the oldest versions of stencils (obsolete, deprecated or replaced by a new shape)
- Ordered by name: I recently started to sort shapes in alphabetical order. These are the categories already sorted:
- MIS Azure Stencils;
- MIS Azure Others Stencils;
- MIS AI and Machine Learning Stencils;
- Microsoft Integration Stencils Old Version Stencils;
- MIS Security and Governance;
- Text Annotations: this is a requested feature and I have already started to work on it, but it will take some time to finish. For now, you will see text annotations on:
- MIS AI and Machine Learning Stencils
- SVG files: added new SVG files and made the filenames uniform;
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack is a Visio package that contains fully resizable Visio shapes (symbols/icons) to help you visually represent on-premises, cloud, or hybrid integration and enterprise architecture scenarios (BizTalk Server, API Management, Logic Apps, Service Bus, Event Hubs…), solution diagrams, and features or systems that use Microsoft Azure and related cloud and on-premises technologies in Visio 2016/2013:
- BizTalk Server
- Microsoft Azure
- Integration
- Integration Service Environments (ISE)
- Logic Apps and Azure App Service in general (API Apps, Web Apps, and Mobile Apps)
- Azure API Management
- Messaging: Event Hubs, Event Grid, Service Bus, …
- Azure IoT and Docker
- AI, Machine Learning, Stream Analytics, Data Factory, Data Pipelines
- SQL Server, DocumentDB, CosmosDB, MySQL, …
- and so on
- Microsoft Power Platform
- Microsoft Flow
- PowerApps
- Power BI
- Office365, SharePoint,…
- DevOps and PowerShell
- Security and Governance
- And much more…
- … and now also technologies not related to Microsoft
The Microsoft Integration Stencils Pack is composed of 28 files:
- Microsoft Integration Stencils
- MIS Azure Stencils
- MIS Additional or Support Stencils
- MIS AI and Machine Learning Stencils
- MIS Apps and Systems Logo Stencils
- MIS Azure Additional or Support Stencils
- MIS Azure Mono Color
- MIS Azure Old Versions
- MIS Azure Others Stencils
- MIS Buildings Stencils
- MIS Databases and Analytics Stencils
- MIS Deprecated Stencils
- MIS Developer Stencils
- MIS Devices Stencils
- MIS Files Stencils
- MIS Generic Stencils
- MIS Infrastructure Stencils
- MIS Integration Fun
- MIS Integration Patterns Stencils
- MIS IoT Devices Stencils
- MIS Office365
- MIS Power BI Stencils
- MIS PowerApps and Flows Stencils
- MIS SAP Stencils
- MIS Security and Governance
- MIS Servers (HEX) Stencils
- MIS Users and Roles Stencils
- Microsoft Integration Stencils Old Version Stencils
All of which you can use and resize without losing quality, in particular the new shapes.
Download
You can download the Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio from:
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio
GitHub
or from:
You can download Microsoft Integration Stencils Pack for Visio 2016/2013 from:
Microsoft Integration Stencils Pack for Visio 2016/2013 (10.1 MB)
Microsoft | TechNet Gallery
The post Microsoft Integration and Azure Stencils Pack for Visio: New major version available (v6.0.0) appeared first on SANDRO PEREIRA BIZTALK BLOG.
by Sandro Pereira | Jan 10, 2020 | BizTalk Community Blogs via Syndication
In October, I did a major rearrangement and release of my stencils pack, mainly because Microsoft had redesigned many of the icons in the Azure Portal. But guess what? Microsoft didn't stop there: several of the symbols that had been redesigned already have a new version. So I decided it was time to update my stencils once again, but instead of spending a lot of time and releasing everything at once, as I did last time, this time I will do it in small waves.
What’s new in this version?
The main goal of this release was to provide the new icons present in the Azure Portal and update existing ones. In this version, the changes and additions are:
- New shapes: new shapes added to MIS Security and Governance, MIS Developer Stencils, and MIS IoT Devices Stencils;
- MIS Security and Governance: complete update to this category, with many unique symbols added and many others updated to their current versions;
- SVG files: added new SVG files and made all the filenames uniform;
- Special Highlights: Azure Arc and Machines – Azure Arc
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack is a Visio package that contains fully resizable Visio shapes (symbols/icons) to help you visually represent on-premises, cloud, or hybrid integration and enterprise architecture scenarios (BizTalk Server, API Management, Logic Apps, Service Bus, Event Hubs…), solution diagrams, and features or systems that use Microsoft Azure and related cloud and on-premises technologies in Visio 2016/2013:
- BizTalk Server
- Microsoft Azure
- Integration
- Integration Service Environments (ISE)
- Logic Apps and Azure App Service in general (API Apps, Web Apps, and Mobile Apps)
- Azure API Management
- Messaging: Event Hubs, Event Grid, Service Bus, …
- Azure IoT and Docker
- AI, Machine Learning, Stream Analytics, Data Factory, Data Pipelines
- SQL Server, DocumentDB, CosmosDB, MySQL, …
- and so on
- Microsoft Power Platform
- Microsoft Flow
- PowerApps
- Power BI
- Office365, SharePoint,…
- DevOps and PowerShell
- Security and Governance
- And much more…
- … and now also technologies not related to Microsoft
The Microsoft Integration Stencils Pack is composed of 27 files:
- Microsoft Integration Stencils
- MIS Additional or Support Stencils
- MIS AI and Machine Learning Stencils
- MIS Apps and Systems Logo Stencils
- MIS Azure Additional or Support Stencils
- MIS Azure Mono Color
- MIS Azure Old Versions
- MIS Azure Others Stencils
- MIS Azure Stencils
- MIS Buildings Stencils
- MIS Databases and Analytics Stencils
- MIS Deprecated Stencils
- MIS Developer Stencils
- MIS Devices Stencils
- MIS Files Stencils
- MIS Generic Stencils
- MIS Infrastructure Stencils
- MIS Integration Fun
- MIS Integration Patterns Stencils
- MIS IoT Devices Stencils
- MIS Office365
- MIS Power BI Stencils
- MIS PowerApps and Flows Stencils
- MIS SAP Stencils
- MIS Security and Governance
- MIS Servers (HEX) Stencils
- MIS Users and Roles Stencils
All of which you can use and resize without losing quality, in particular the new shapes.
Download
You can download the Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio from:
Microsoft Integration, Azure, Power Platform, Office 365 and much more Stencils Pack for Visio
GitHub
or from:
You can download Microsoft Integration Stencils Pack for Visio 2016/2013 from:
Microsoft Integration Stencils Pack for Visio 2016/2013 (10.1 MB)
Microsoft | TechNet Gallery
The post Microsoft Integration and Azure Stencils Pack for Visio: New version available (v5.1.0) appeared first on SANDRO PEREIRA BIZTALK BLOG.
by Howard Edidin | Mar 28, 2018 | BizTalk Community Blogs via Syndication
An Azure IoT Hub can store just about any type of data from a Device.
There is support for:
- Sending Device to Cloud messages.
- Invoking direct methods on a device
- Uploading files from a device
- Managing Device Identities
- Scheduling jobs on single or multiple devices
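As an illustration of the device-to-cloud path, the IoT Hub HTTPS endpoint accepts messages authenticated with a shared access signature. The Python sketch below shows how such a SAS token is computed; the hub name, device id, and key are made-up placeholders, not real credentials.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse


def generate_sas_token(resource_uri: str, device_key: str, ttl_seconds: int = 3600) -> str:
    """Build an IoT Hub shared access signature for the given resource URI.

    resource_uri: e.g. "myhub.azure-devices.net/devices/ship-engine-01" (placeholder).
    device_key:   the device's base64-encoded symmetric key.
    """
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote(resource_uri, safe="")
    # The signature is HMAC-SHA256 over "<encoded-uri>\n<expiry>",
    # keyed with the base64-decoded device key.
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(base64.b64decode(device_key), to_sign, hashlib.sha256).digest()
    )
    return (
        "SharedAccessSignature "
        f"sr={encoded_uri}"
        f"&sig={urllib.parse.quote(signature, safe='')}"
        f"&se={expiry}"
    )


# Hypothetical hub/device; the key is a random example value.
token = generate_sas_token(
    "myhub.azure-devices.net/devices/ship-engine-01",
    base64.b64encode(b"example-device-key").decode(),
)
```

A device would then POST its JSON payload to `https://<hub>.azure-devices.net/devices/<id>/messages/events` with this token in the `Authorization` header; the SDKs shown later in this post handle this for you.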
The following is the list of built-in endpoints:

Custom Endpoints can also be created.
IoT Hub currently supports the following Azure services as additional endpoints:
- Azure Storage containers
- Event Hubs
- Service Bus Queues
- Service Bus Topics
Architecture
If we look through the documentation on the Azure Architecture Center, we can see a list of Architectural Styles.
If we were to design an IoT solution, we would want to follow best practices. We can do this by using the Azure architectural style of Event-Driven Architecture; event-driven architectures are central to IoT solutions.
Merging Event-Driven Architecture with Microservices allows us to separate the IoT business services.
These services include:
- Provisioning
- Management
- Software Updating
- Security
- Logging and Notifications
- Analytics
Creating our services
To create these services, we start by selecting our Compute Options.
App Services
The use of Azure Functions is becoming commonplace. They are an excellent replacement for API Applications, and they can be published to Azure API Management.
We are able to create a Serverless API, or use Durable Functions that allow us to create workflows and maintain state in a serverless environment.
Logic Apps provide us with the capability of building automated scalable workflows.
Data Store
Having a single data store is usually not the best approach. Instead, it’s often better to store different types of data in different data stores, each focused towards a specific workload or usage pattern. These stores include Key/value stores, Document databases, Graph databases, Column-family databases, Data Analytics, Search Engine databases, Time Series databases, Object storage, and Shared files.
This may hold true for other Architectural Styles. In our Event-driven Architecture, it is ideal to store all data related to IoT Devices in the IoT Hub. This data includes results from all events within the Logic Apps, Function Apps, and Durable Functions.
Which brings us back to our topic… Considering Software as an IoT Device
Since Azure IoT Hub supports the TransportType.Http1 protocol, we can use the Microsoft.Azure.Devices.Client library to send event data to our IoT Hub from any type of software. We also have the capability of receiving configuration data from the IoT Hub.
The following is the source code for our SendEvent Function App.
SendEvent Function App
#region Information
//
// MIT License
//
// Copyright (c) 2018 Howard Edidin
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
#endregion
#region
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data.Services.Client;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;
using TransportType = Microsoft.Azure.Devices.Client.TransportType;
#endregion
namespace IoTHubClient
{
    public static class SendEvent
    {
        private static readonly string IotHubUri = ConfigurationManager.AppSettings["hubEndpoint"];

        [FunctionName("SendEventToHub")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = "device/{id}/{key:guid}")]
            HttpRequestMessage req, string id, Guid key, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            // Get the request body
            dynamic data = await req.Content.ReadAsAsync<object>();

            var deviceId = id;
            var deviceKey = key.ToString();

            if (string.IsNullOrEmpty(deviceKey) || string.IsNullOrEmpty(deviceId))
                return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a deviceId and deviceKey in the URL");

            // Collect the telemetry items sent in the request body
            var telemetry = new Dictionary<Guid, object>();

            foreach (var item in data.telemetryData)
            {
                var telemetryData = new TelemetryData
                {
                    MetricId = item.metricId,
                    MetricValue = item.metricValue,
                    MetricDateTime = item.metricDateTime,
                    MetricValueType = item.metricValueType
                };

                telemetry.Add(Guid.NewGuid(), telemetryData);
            }

            var deviceData = new DeviceData
            {
                DeviceId = deviceId,
                DeviceName = data.deviceName,
                DeviceVersion = data.deviceVersion,
                DeviceOperation = data.deviceOperation,
                DeviceType = data.deviceType,
                DeviceStatus = data.deviceStatus,
                DeviceLocation = data.deviceLocation,
                SubscriptionId = data.subscriptionId,
                ResourceGroup = data.resourceGroup,
                AzureRegion = data.azureRegion,
                EffectiveDateTime = new DateTimeOffset(DateTime.Now),
                TelemetryData = telemetry
            };

            var json = JsonConvert.SerializeObject(deviceData);
            var message = new Message(Encoding.ASCII.GetBytes(json));

            try
            {
                // Send the event to IoT Hub over HTTPS, authenticating as the device
                var client = DeviceClient.Create(IotHubUri,
                    new DeviceAuthenticationWithRegistrySymmetricKey(deviceId, deviceKey),
                    TransportType.Http1);
                await client.SendEventAsync(message);

                return req.CreateResponse(HttpStatusCode.OK);
            }
            catch (DataServiceClientException e)
            {
                var resp = new HttpResponseMessage
                {
                    StatusCode = (HttpStatusCode) e.StatusCode,
                    Content = new StringContent(e.Message)
                };

                return resp;
            }
        }
    }

    public class DeviceData
    {
        public string DeviceId { get; set; }
        public string DeviceName { get; set; }
        public string DeviceVersion { get; set; }
        public string DeviceType { get; set; }
        public string DeviceOperation { get; set; }
        public string DeviceStatus { get; set; }
        public DeviceLocation DeviceLocation { get; set; }
        public string AzureRegion { get; set; }
        public string ResourceGroup { get; set; }
        public string SubscriptionId { get; set; }
        public DateTimeOffset EffectiveDateTime { get; set; }
        public Dictionary<Guid, object> TelemetryData { get; set; }
    }

    public class TelemetryData
    {
        public string MetricId { get; set; }
        public string MetricValueType { get; set; }
        public string MetricValue { get; set; }
        public DateTime MetricDateTime { get; set; }
    }

    public enum DeviceLocation
    {
        Cloud,
        Container,
        OnPremise
    }
}
Software Device Properties
The following values are required in the URL path (`Route = "device/{id}/{key:guid}"`):

| Name | Description |
| --- | --- |
| id | Device Id (string) |
| key | Device Key (GUID) |
The following are the properties to be sent in the POST body:

| Name | Description |
| --- | --- |
| deviceName | Device name |
| deviceVersion | Device version number |
| deviceType | Type of device |
| deviceOperation | Operation name or type |
| deviceStatus | Default: Active |
| deviceLocation | Cloud, Container, or OnPremise |
| subscriptionId | Azure subscription Id |
| resourceGroup | Azure resource group |
| azureRegion | Azure region |
| telemetryData | Array |
| telemetryData.metricId | Array item id |
| telemetryData.metricValueType | Array item value type |
| telemetryData.metricValue | Array item value |
| telemetryData.metricDateTime | Array item timestamp |
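Putting the route and the body properties together, a caller of this function might build a request like the following Python sketch. The function URL, device id, key, and all field values here are hypothetical placeholders.

```python
import json

# Hypothetical device identifiers matching the route "device/{id}/{key:guid}".
device_id = "ship-engine-01"
device_key = "00000000-0000-0000-0000-000000000000"  # placeholder GUID
url = f"https://myfunctionapp.azurewebsites.net/api/device/{device_id}/{device_key}"

# The body shape follows the property table above; values are examples only.
body = {
    "deviceName": "Engine telemetry agent",
    "deviceVersion": "1.0.0",
    "deviceType": "Software",
    "deviceOperation": "Telemetry",
    "deviceStatus": "Active",
    "deviceLocation": "Cloud",
    "subscriptionId": "<subscription-guid>",
    "resourceGroup": "iot-demo-rg",
    "azureRegion": "westeurope",
    "telemetryData": [
        {
            "metricId": "engine-temperature",
            "metricValueType": "double",
            "metricValue": "87.5",
            "metricDateTime": "2018-03-28T10:15:00Z",
        }
    ],
}

payload = json.dumps(body)
```

An actual call would POST `payload` to `url` with a `Content-Type: application/json` header, plus the function key that the `AuthorizationLevel.Function` trigger requires.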
Summary
- We can easily add the capability of sending messages and events to our Function and Logic Apps.
- Optionally, we can send the data to an Event Grid.
- We have a single data store for all our IoT events.
- We can identify performance issues within our services.
- Having a single data store makes it easier to perform Analytics.
- We can use an Azure Function App to send device-to-cloud messages. In this case, our Function App will also be taking the role of a device.
by Eldert Grootenboer | Oct 30, 2017 | BizTalk Community Blogs via Syndication
This is a new post in the IoT Hub series. Previously, we have seen how to administer our devices and send messages from the device and from the cloud. Now that we have all this data flowing through our systems, it is time to help our users actually work with this data.
Going back to the scenario we set in the first post of the series, we are receiving the telemetry readings from our ships, and getting alerts in case of high temperature. In the samples we have been using console apps for our communications between the systems, but in a real-life scenario you will probably want a better and easier interface. In the shipping business, Dynamics CRM is already widely used, and so it would benefit the business if they can use this product for their IoT solutions as well. Luckily they can, by using Microsoft Dynamics 365 for Field Service in combination with the Connected Field Service solution.

Setting Up Connected Field Service
To start working with the Connected Field Service solution, we will first set up a 30-day trial of Dynamics 365 for Field Service. Just remember you will need an organizational Microsoft account to sign up for Dynamics 365. If you do not have one, you can create a <your-tenant>.onmicrosoft.com account in your Azure Active Directory for this purpose. If you already have your own Dynamics 365 environment, you can skip to installing the Connected Field Service solution.
Create Dynamics 365 Environment
We will start by setting up Dynamics 365. In this post, we will be using a trial account, but if you already have an account you could of course also use that one. Go to the Dynamics 365 trial registration site, and click on Sign in to use an organizational account to login.
Sign in to Dynamics 365
Use an organizational account to sign in
Once signed in, confirm that you want to sign up for the free trial.
Confirm the free trial
Your trial has been accepted
Now that we have created our free trial, we will have to assign licenses to our users. Open the Subscriptions blade under Billing and choose to assign licenses to your users.
Assign licenses to users
You will get an overview of all your users. Select the users for which you want to assign the licenses, and click Edit product licenses.
Choose users to assign licenses
Add the licenses to the users we just selected.
Add licenses to users
Choose the trial license we just created. This will also add the connected Office 365 licenses.
Assign Dynamics 365 trial licenses
Now that we have assigned the Dynamics 365 licenses, we can finish our setup. Go to Admin Centers in the menu, and select Dynamics 365.
Go to Dynamics 365 admin center
As we are interested in the field service, select this scenario, and complete the setup. The field service scenario will customize our Dynamics 365 instance, to include components like scheduling of technicians, inventory management, work orders and more, which in a shipping company would be used to keep track of repairs, maintenance, etc.
Select Field service
Once we have completed our Dynamics 365 setup, it will be shown in the browser. The address of the page will be in the format <yourtenant>.crm4.dynamics.com. You can also change this endpoint in your Dynamics 365 admin center.
Your Dynamics 365 environment
Security
To allow us to install the Connected Field Service solution, we will need to add ourselves to the CRM admins. To do this, within your Dynamics 365 portal (in my case https://eldertiotcrmdemoeldert.crm4.dynamics.com/) go to the Settings Tab and open security.
Open Dynamics 365 Security
Now open the users, select your user account, and click on Promote To Admin.
Promote your user to local admin
Install Connected Field Service Solution
Now that we have Dynamics 365 set up, it’s time to add the Connected Field Service solution, which we will use to manage and interact with our devices from Dynamics 365. Start by going to Dynamics 365 in the menu bar.
Open Dynamics 365
This will lead us to our Dynamics home, where we can install new apps. Click on Find more apps to open the app store.
Open the app store
Search for Connected Field Service, and click on Get it now to add it to our environment.
Add Connected Field Service solution
Agree with the permissions, and make sure you are signed in with the correct user. The user must have a license, and permissions to install this solution.
Accept permissions
Follow the wizard for the solution and make sure you install it into the correct environment.
Select your Dynamics 365 environment
Accept deployment
On the next pages accept the service agreement and the privacy statement. Make sure you deploy to the correct Dynamics 365 Organization.
Select correct organization
Now we will have to specify the Azure resources where we want to deploy our artefacts like IoT Hub, Stream Analytics etc. If you do not see a subscription, make sure your user has the correct permissions in your Azure environment to create and retrieve artefacts and subscriptions.
Select Azure subscription and resources
The wizard will now start deploying all the Azure artefacts, and will update CRM with new screens and components. You can follow this by refreshing the screen, or coming back to the website. Once this is finished, you will need to click the Authorize button, which will set up the connection between your Azure and Dynamics 365.
After deployment click on Authorize
This will open the Azure portal on the API connection, click on the message This connection is not authenticated to authorize the connection.
Click to authenticate
Authorize the connection
The Azure Solution
Now let’s go to the Azure portal, and see what has been installed. Open the resource group which we created in the wizard.
Resource group for our connected field service solution
As you can see, we have a lot of new resources. I will explain the most important ones here, and their purpose in the Connected Field Service solution. After the solution has been deployed, all resources will have been set up for the data from the sample application deployed with it. If you want to use your own devices, you will need to update these. This is also the place to start building your own solution, as your requirements might differ from what you get out of the box. As all these resources can be modified from the portal (except for the API Apps), customizing this solution to your own needs is very easy.
IoT Hub
The IoT Hub which has been created is used for the device management and communication. It uses device to cloud messaging to receive telemetry from our devices, and cloud to device messaging to send commands to our devices. When working with your own devices, you should update them to connect with this IoT Hub.
Service Bus
Four Service Bus queues have been created, which are used for holding messages between systems.
Stream Analytics
There are several Stream Analytics jobs, which are used to process the data coming in from IoT Hub. When working with your own devices, you should update these jobs to process your own data.
- Alerts: this job reads data from IoT Hub and references it against device rules stored in a blob. If the job detects that it needs to send an alert to Dynamics 365 (in this case, a high temperature), it writes it into a Service Bus queue.
- PowerBI: this job reads all incoming telemetry data and sends the maximum temperature per minute to Power BI.
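The Alerts job itself is a Stream Analytics query, but its rule-matching logic can be illustrated with a small Python sketch. The rule shape, device types, and threshold values below are invented for illustration and stand in for the reference blob the job actually reads.

```python
# Hypothetical device rules, standing in for the reference-data blob.
device_rules = {
    "thermostat": {"metric": "temperature", "max": 70.0},
}


def alerts_for(readings):
    """Yield an alert dict for each reading that breaks its device-type rule,
    mirroring what the Alerts job writes to the Service Bus queue."""
    for reading in readings:
        rule = device_rules.get(reading["deviceType"])
        if rule and reading["metric"] == rule["metric"] and reading["value"] > rule["max"]:
            yield {
                "deviceId": reading["deviceId"],
                "reading": reading["value"],
                "threshold": rule["max"],
                "ruleOutput": "AlertTemperature",
            }


readings = [
    {"deviceId": "thermo-1", "deviceType": "thermostat", "metric": "temperature", "value": 65.0},
    {"deviceId": "thermo-2", "deviceType": "thermostat", "metric": "temperature", "value": 82.0},
]
alerts = list(alerts_for(readings))  # only thermo-2 exceeds the 70.0 threshold
```

In the deployed solution, this comparison is expressed as a Stream Analytics query joining the IoT Hub input against the reference blob; the sketch only shows the decision it makes per reading.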
API Apps
Custom API Apps have been created, which will be used to translate between messages from IoT Hub and Dynamics 365.
Logic Apps
There are two Logic Apps, which serve as a communications channel between Dynamics 365 and IoT Hub. The Logic Apps use queues, API Apps and the Dynamics 365 connector to send and receive messages between these systems.
Setting Up PowerBI
PowerBI will be used to generate charts from the telemetry readings. To use this, we first need to import the reports. Start by downloading the reports, and make sure you have a PowerBI account; it is recommended to use the same user for this as you use for Dynamics 365. Open the downloaded reports file using PowerBI Desktop. The Power BI report will open with errors, because it was created with a sample SQL database and user. Update the query with your SQL database and user, and then publish the report to Power BI.
Open the downloaded report
Once opened, click on Edit Queries to change the connection to your database.
Select Edit Queries Open Advanced Editor
Replace the source SQL server and database with the resources provisioned in your Azure resource group. The database server and database name can be found through the Azure portal.
Update Azure SQL Server and database names
Enter the credentials of your database user when requested.
Enter login credentials
If you get an error saying your client IP is not allowed to connect, use the Azure portal to add your client IP to the firewall on your Azure SQL Server.
Not allowed to connect
Add client IP to firewall settings
Once done, click on Close & Apply to update the report file.
Close and apply changes
Now we will publish the report to PowerBI, so we can use it from Dynamics 365. Click on the Publish button to start, and make sure to save your changes.
Publish report to PowerBI
Sign in to your PowerBI account and wait for your report to be published.
Sign in to PowerBI
Once published, open the link to provide your credentials.
Publishing succeeded
Follow the link to edit your credentials, and update the credentials with your database user login.
Sign in with database user
Now pin the tiles to a dashboard, creating one if it does not yet exist.
Pin tiles to dashboard
Managing Devices
In this post we will be using the simulator which has been deployed along with the solution. If you want to use your own (simulated) devices, be sure to update the connections and data for the deployed services. Go to your Dynamics 365 environment, open the Field Service menu, and select Customer Assets.
Open Customer Assets
To add a new device, we will create a new asset. This asset will then be linked to a device in IoT Hub.
Create new asset
Fill in the details of the asset. Important to note here, is we need to set a Device ID. This will be the ID with which the device is registered in IoT Hub. When done, click on Save.
Set asset details
Once the asset has been saved, you will note a new command in the command bar called Register Devices. This will register the new device in IoT Hub, and link it with our asset in Dynamics 365. Click this now.
Register the device in IoT Hub
The device will now be registered in IoT Hub. Once this is done, the registration status will be updated to Registered. We can now start interacting with our device.
Device has been registered in IoT Hub
Receive Telemetry
Open the thermostat simulator, which was part of the deployment of the Connected Field Service solution. You can do this by going back to the deployment website and clicking Open Simulator.
Open the simulator
This will open a new website where we can simulate a thermostat. Start by selecting the device we just created from Dynamics 365.
Select device
Once the device has been selected, we will start seeing messages being sent. These will be sent to IoT Hub, and be placed into PowerBI, and alerts will be created if the temperature gets too high. Increase the temperature to trigger some alerts.
Generate high temperature
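In the deployed solution, the alert rule itself runs in the Azure services provisioned by Connected Field Service rather than in any code we write, but conceptually it is just a threshold check over incoming readings. A minimal sketch, where the threshold value and reading shape are hypothetical:

```csharp
// Illustrative only: the real rule runs in the Azure services deployed by
// Connected Field Service. The threshold and reading shape are assumptions.
public class DeviceReading
{
    public string DeviceId { get; set; }
    public double Temperature { get; set; }
}

public static class AlertRule
{
    private const double MaxTemperature = 70.0; // hypothetical threshold

    // Returns true when a reading should raise an IoT Alert in Dynamics 365
    public static bool ShouldAlert(DeviceReading reading) =>
        reading.Temperature > MaxTemperature;
}
```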
Now go back to Dynamics 365, and open the Field Service dashboard.
Open Field Service dashboard
On the dashboard we will now see a new IoT Alert. You can open this alert to see its details and, for example, create a work order for it. In our scenario with the shipping company, this would allow us to recognize anomalies on the ships' engines in near real time, and immediately take action, such as arranging for repairs.
Alerts are shown in Dynamics 365
Connect PowerBI
Now let’s set up Dynamics 365 to include the PowerBI graph in our assets, so we have an overview of our telemetry at all times as well. Go back to the asset we created earlier, and click the PowerBI button in the Connected Device Readings area.
Add PowerBI tile to asset
Choose one of the tiles we previously added to the PowerBI dashboard and click save.
Add PowerBI tile
We will now see the recent device readings in our Dynamics 365 asset. Every asset will now show the readings for its registered device, allowing us to keep track of all our devices' readings.
Device readings are now integrated in Dynamics 365
Send Commands
So for the final part, we will have a look at how we can send messages from Dynamics 365 to our device. Go back to the asset we created, and click on Create Command.
Click Create Command
Give the command a name, and provide the command body. This should be in JSON format, so it can be parsed by the device. As we will be using the simulator, we will just send a demo command, but for your own device this should be a command your device can understand. You can send this command to a particular device or to all your devices. Once you have filled in the fields, click on Send & Close to send the command. The command we will be sending is as follows.
{"CommandName":"Notification","Parameters":{"Message":"Technician has been dispatched"}}
Create and send your command
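The command payload above is plain JSON, so on a custom device you would deserialize it and dispatch on the CommandName field. A minimal sketch using Newtonsoft.Json (the library the other samples in this series already use); the payload shape mirrors the demo command and is an assumption for your own devices:

```csharp
using Newtonsoft.Json.Linq;

// Sketch of device-side handling for the demo command shown above.
// The "Notification" command name and payload shape are assumptions.
public static class CommandHandler
{
    public static string Handle(string json)
    {
        var command = JObject.Parse(json);
        switch ((string)command["CommandName"])
        {
            case "Notification":
                // Show the message to the operator or on the device display
                return (string)command["Parameters"]["Message"];
            default:
                // Unknown commands should be logged or rejected
                return null;
        }
    }
}
```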
Now when we switch over to our simulator, we will see the command coming in.
Commands are coming in
Conclusion
By using Dynamics 365 in combination with the Connected Field Service solution, we give our users an environment they are well acquainted with to administer their IoT devices and communicate with them. It allows them to handle alerts, dispatching technicians as soon as needed. By integrating the readings, they are always informed of the status of the devices, and by sending commands back to the device they can work with the devices remotely.
IoT Hub Blog Series
In case you missed the other articles from this IoT Hub series, take a look here.
Blog 1: Device Administration Using Azure IoT Hub
Blog 2: Implementing Device To Cloud Messaging Using IoT Hub
Blog 3: Using IoT Hub for Cloud to Device Messaging
Author: Eldert Grootenboer
Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT and BizTalk Server and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010 and since then has expanded into Azure and surrounding technologies as well. Eldert loves working in integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/. View all posts by Eldert Grootenboer
by Steef-Jan Wiggers | Jul 29, 2017 | BizTalk Community Blogs via Syndication
July is the holiday month, or at least that's when the summer holiday season starts in the Netherlands. And this year I went on holiday with the family to Portugal (Porto) and France (Montréal, Midi-Pyrénées).
Month July
For me this is a special month, as the MVP renewal cycle starts, which is now yearly. And I have always been a July MVP, hence nothing really changed for me. Anyway, I got renewed, a great start to the month!

I shared the picture above on LinkedIn with the text: “Awarded for the eight time! Thanks Microsoft. Coolest Technology!!!”. And to my surprise this post got more than 8000 views in a week. Awesome!
At the beginning of July I consolidated my talk at Integrate 2017 London into a blog post: Building sentiment analysis solution with Logic Apps. This year's Integrate was a success, as many of you might have read in the various blog posts recapping the event. And there will be a US Integrate later this year in October, where I will be one of the speakers too.
What else did I do this month? Well, I worked together with Kent on one of his Middleware Friday shows: INTEGRATE 2017 Highlight Show. We recorded our interview session in Dublin. And I wrote a blog post about Azure Functions: go serverless!
Another thing I would like to mention is that for a customer I worked hard with a team on a POC with CosmosDB, a graph model, and Azure Search. And we have achieved some important milestones. Some of the learnings I will share in upcoming months.
Holiday
As mentioned already, it's summer holiday season in the Netherlands, and I went with my family to Porto in Portugal to visit Sandro and his family.


No surprises here: we went to have lunch, visit Porto, and have an ice cream of course (Santini).
Books
Since I was on holiday I was able to read a few books. And during the long road trip to Porto I watched a few movies related to AI, digitalization and IoT:
What I found interesting about these movies is how they relate to the books I read:

The world is changing around us with sensors, devices and huge amounts of data. Moreover, this makes us more aware of everything around us and smarter; at least we get more insights.
Blockchain
During Integrate 2017, my buddy Kent talked a lot about Blockchain and Cryptocurrencies. And I got intrigued, yet I did not fully understand both. Therefore, I bought and read these two books to get a better understanding:
Both are recently published books, relevant and up to date.
Relaxing books
Besides some technical books I read two thrillers to relax and chill:
The first is definitely an amazing, well-written thriller, and if you like the movie Se7en then this is for you!
Music
My favorite albums that were released in July were:
- Decapitated – Anticult
- Prong – Zero Days
- Wintersun – The Forest Seasons

And so, another month has gone by. Next month I will continue to work on the POC and prepare for sessions in September and October.
Cheers,
Steef-Jan
Author: Steef-Jan Wiggers
Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years' experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field, combining them with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years. View all posts by Steef-Jan Wiggers
by Eldert Grootenboer | May 16, 2017 | BizTalk Community Blogs via Syndication
In the previous blog posts of this IoT Hub series, we have seen how we can use IoT Hub to administrate our devices, and how to do device to cloud messaging. In this post we will see how we can do cloud to device messaging, something which is much harder when not using Azure IoT Hub. IoT devices will normally be low power, low performance devices, like small footprint devices and purpose-specific devices. This means they are not meant to (and most often won’t be able to) run antivirus applications, firewalls, and other types of protection software. We want to minimize the attack surface they expose, meaning we can’t expose any open ports or other means of remoting into them. IoT Hub uses Service Bus technologies to make sure there is no inbound traffic needed toward the device, but instead uses per-device topics, allowing us to send commands and messages to our devices without the need to make them vulnerable to attacks.
Send Message To Device
When we want to send one-way notifications or commands to our devices, we can use cloud to device messages. To do this, we will expand on the EngineManagement application we created in our earlier posts, by adding the following controls, which, in our scenario, will allow us to start the fans of the selected engine.
To be able to communicate to our devices, we will first implement a ServiceClient in our class.
private readonly ServiceClient serviceClient = ServiceClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey");
Next we implement the event handler for the Start Fans button. This type of communication targets a specific device by using the DeviceID from the device twin.
private async void ButtonStartFans_Click(object sender, EventArgs e)
{
var message = new Microsoft.Azure.Devices.Message();
message.Properties.Add(new KeyValuePair<string, string>("StartFans", "true"));
message.Ack = DeliveryAcknowledgement.Full; // Used for getting delivery feedback
await serviceClient.SendAsync(comboBoxSerialNumber.Text, message);
}
Process Message On Device
Once we have sent our message, we will need to process it on our device. For this, we are going to update the client application of our simulated engine (which we also created in the previous blog posts) by adding the following method.
private static async void ReceiveMessageFromCloud(object sender, DoWorkEventArgs e)
{
// Continuously wait for messages
while (true)
{
var message = await client.ReceiveAsync();
// Check if message was received
if (message == null)
{
continue;
}
try
{
if (message.Properties.ContainsKey("StartFans") && message.Properties["StartFans"] == "true")
{
// This would start the fans
Console.WriteLine("Fans started!");
}
await client.CompleteAsync(message);
}
catch (Exception)
{
// Send to deadletter
await client.RejectAsync(message);
}
}
}
We will run this method in the background, so update the Main method, and insert the following code after the call for updating the firmware.
// Wait for messages in background
var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveMessageFromCloud;
backgroundWorker.RunWorkerAsync();
Message Feedback
Although cloud to device messages are a one-way communication style, we can request feedback on the delivery of the message, allowing us to invoke retries or start compensation when the message fails to be delivered. To do this, implement the following method in our EngineManagement backend application.
private async void ReceiveFeedback(object sender, DoWorkEventArgs e)
{
var feedbackReceiver = serviceClient.GetFeedbackReceiver();
while (true)
{
var feedbackBatch = await feedbackReceiver.ReceiveAsync();
// Check if feedback messages were received
if (feedbackBatch == null)
{
continue;
}
// Loop through feedback messages
foreach(var feedback in feedbackBatch.Records)
{
if(feedback.StatusCode != FeedbackStatusCode.Success)
{
// Handle compensation here
}
}
await feedbackReceiver.CompleteAsync(feedbackBatch);
}
}
And add the following code to the constructor.
var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveFeedback;
backgroundWorker.RunWorkerAsync();
Call Remote Method
Another feature when sending messages from the cloud to our devices is to call a remote method on the device, which we call invoking a direct method. This type of communication is used when we want to have an immediate confirmation of the outcome of the command (unlike setting the desired state and communicating back reported properties, which has been explained in the previous two blog posts). Let’s update the EngineManagement application by adding the following controls, which would allow us to send an alarm message to the engine, sounding the alarm and displaying a message.
Now add the following event handler for clicking the Send Alarm button.
private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };
methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));
CloudToDeviceMethodResult response = null;
try
{
response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
}
catch (IotHubException)
{
// Do nothing
}
if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
{
MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
else
{
MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
}
}
And in our simulated device, implement the SoundAlarm remote method which is being called.
private static Task<MethodResponse> SoundAlarm(MethodRequest methodRequest, object userContext)
{
// On a real engine this would sound the alarm as well as show the message
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine($"Alarm sounded with message: {JObject.Parse(methodRequest.DataAsJson).GetValue("message").Value<string>()}! Type yes to acknowledge.");
Console.ForegroundColor = ConsoleColor.White;
var response = JsonConvert.SerializeObject(new { acknowledged = Console.ReadLine() == "yes" });
return Task.FromResult(new MethodResponse(Encoding.UTF8.GetBytes(response), 200));
}
And finally, we need to map the SoundAlarm method to the incoming remote method call. To do this, add the following line in the Main method.
client.SetMethodHandlerAsync("SoundAlarm", SoundAlarm, null);
Call Remote Method On Multiple Devices
When invoking direct methods on devices, we can also use jobs to send the command to multiple devices. We can use our custom tags here to broadcast our message to a specific set of devices.
In this case, we will add a filter on the engine type and manufacturer, so we can, for example, send a message to all main engines manufactured by Caterpillar. In our first blog post, we added these properties as tags on the device twin, so we now use these in our filter. Start by adding the following controls to our EngineManagement application.
Now add a JobClient to the application, which will be used to broadcast and monitor our messages.
private readonly JobClient jobClient = JobClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey");
To broadcast our message, update the event handler for the Send Alarm button to the following.
private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };
methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));
if (checkBoxBroadcast.Checked)
{
try
{
var jobResponse = await jobClient.ScheduleDeviceMethodAsync(Guid.NewGuid().ToString(), $"tags.engineType = '{comboBoxEngineTypeFilter.Text}' and tags.manufacturer = '{textBoxManufacturerFilter.Text}'", methodInvocation, DateTime.Now, 10);
await MonitorJob(jobResponse.JobId);
}
catch (IotHubException)
{
// Do nothing
}
}
else
{
CloudToDeviceMethodResult response = null;
try
{
response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
}
catch (IotHubException)
{
// Do nothing
}
if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
{
MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
else
{
MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
}
}
}
And finally, add the MonitorJob method with the following implementation.
public async Task MonitorJob(string jobId)
{
JobResponse result;
do
{
result = await jobClient.GetJobAsync(jobId);
Thread.Sleep(2000);
}
while (result.Status != JobStatus.Completed && result.Status != JobStatus.Failed);
// Check if all devices successful
if (result.DeviceJobStatistics.FailedCount > 0)
{
MessageBox.Show("Not all engines reported success!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
}
else
{
MessageBox.Show("All engines reported success.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
}
Conclusion
By using IoT Hub we have a safe and secure way of communicating from the cloud and our backend to devices out in the field. We have seen how we can use cloud to device messages when we want to send one-way messages to our device, or use direct methods when we want to be informed of the outcome of our invocation. By using jobs, we can also call out to multiple devices at once, limiting the devices being called by using (custom) properties of the device twin. The code for this post can be found here.
IoT Hub Blog Series
In case you missed the other articles from this IoT Hub series, take a look here.
Blog 1: Device Administration Using Azure IoT Hub
Blog 2: Implementing Device To Cloud Messaging Using IoT Hub
Blog 3: Using IoT Hub for Cloud to Device Messaging
Author: Eldert Grootenboer
Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT and BizTalk Server and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010 and since then has expanded into Azure and surrounding technologies as well. Eldert loves working in integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/. View all posts by Eldert Grootenboer
by Sandro Pereira | May 10, 2017 | BizTalk Community Blogs via Syndication
After the success of last year, Tuga IT 2017 is one week away from returning to Lisbon! From May 18th until the 20th, a variety of world-class speakers (local and international) will present a huge variety of “fresh” and “hot” topics about Microsoft Data Platform, SharePoint, Office 365, Enterprise Integration, Agile Methodologies, Open Source Technologies, Azure and much more. And this is only possible thanks to the collaboration of all 9 communities taking part in Tuga IT 2017 as event organizers or track owners.
Once again, I was invited to lead the Integration track, and yes, we will once again have a dedicated Enterprise Integration track! The integration track on Saturday the 20th will be packed with amazing sessions by Azure MVPs Nino Crudele, Steef-Jan Wiggers, Eldert Grootenboer, Tomasso Groenendijk, myself, and Ricardo Torre.
Enterprise Integration Track Agenda
8:30 Registration
9:00 How to face the integration challenges using the open source by Nino Crudele [Azure MVP]
“The open source space offers a lot of different options and opportunities to face the endless challenges that we normally encounter in the integration of technologies.
During this session, we will explore all the best open source options to cover the most advanced and complex requirements.
We will explore new options to extend the current technologies, and how to use open source in conjunction with technology stacks like pure .NET Framework, BizTalk Server and Microsoft Azure to solve complex integration scenarios.
During the session, we will also examine the best options available in the market and in the open source space.”
10:10 Coffee break
10:30 The speaker nightmare: Eval Forms & OCR & Logic Apps & Power BI by Sandro Pereira [Azure MVP]
“An evaluation form is something that a speaker loves and hates, especially if the results are processed in real time and publicly available. If the result was excellent, then it is extremely rewarding; other times, it may “hurt” the speaker who has made himself available to share his knowledge and has been evaluated negatively. Sometimes the attendees are unfair in their evaluations, for example calling a 100-level session too basic (these types of sessions are supposed to be basic or introductory), or sometimes the speaker had a bad day (it happens to everyone).
I speak from personal experience: in these last 6 years that I have been speaking at community events, in Portugal and abroad, I have been evaluated in all ways: badly, reasonably, good and excellent. Sometimes I saw, in the same sessions, attendees with different profiles evaluate me both badly and excellently. The key point for the speaker is:
- All feedback is good, either negative or positive; he can learn from it to improve himself, if that's the case, or learn that a specific topic is not good for a certain audience
- He only needs to give his best! We cannot please everyone, and the goal is to feel happy with yourself and your performance.
I love evaluation forms and I love for them to be publicly available, even better if they are publicly available during the event. Because, at least, it will give attendees a good topic of conversation during the event, for people that do not know each other, and it will keep the conversation flowing (naturally). People are normally afraid or shy to start a conversation with unfamiliar persons, and this is a good icebreaker.
In this session, I will show and explain a real live demo of how we can easily build a robust solution for processing evaluation forms, using OCR software, and easily integrate them with Power BI to present the results in an interactive and beautiful way. But most important: how you can educate your enterprise developers and IT pro users to easily extend capabilities for power users, who understand their business challenges the best, allowing them to use their familiar tools like OCR software to process evaluation forms and quickly build and deliver Power BI solutions with interactive data dashboards. And at the same time integrate these tools, platforms or systems in a very quick and robust way using the integration features on Azure, like Logic Apps, API Apps and Azure Functions. How to start from a simple solution and evolve it, enabling new functionalities.”
11:50 Cloud Integration: so many options! by Steef-Jan Wiggers [Azure MVP]
Traditional integration has changed with the rise and evolution of cloud computing. And connectivity between systems on premises will evolve to connections with services and solutions in the cloud. Data is everywhere, applications are everywhere and people are everywhere with their devices always connected. In this world integration is key. Azure offers us various ways to implement integration solutions that provide the business connectivity to their data. This talk will be a journey through the options you have when building these types of solutions!
13:00 Lunch
14:00 Using DocumentDB to make your API App high performant and secure it with API Management by Tomasso Groenendijk [Azure MVP]
APIs are becoming more important for organizations, and people even talk about the API economy, but how can you create your own API, expose it globally and make a profit from it? In this session, Tomasso is going to show how you can use API Apps and DocumentDB to create a high-performance API, and how to use API Management in combination with Web Apps to expose the API.
15:10 Sponsor Session
15:30 Coffee Break
15:50 BizTalk 2016 in a hybrid world by Ricardo Torre
The integration landscape has definitely evolved to be hybrid: significant on-premises investment has accumulated over the years, while at the same time cloud computing brought new challenges and new ways of implementing integration. Let's navigate through the innovations in both worlds and see how BizTalk and the cloud currently live together.
17:10 Azure IoT Hub Deep Dive by Eldert Grootenboer [Azure MVP]
Azure IoT Hub gives us the possibility to manage, secure, and do bi-directional communication with billions of IoT devices. In this session, we will dive into all these possibilities, to show how you can easily set up a robust and hyper-scalable solution for your IoT needs.
18:20 Closing
More information on the official website: http://tugait.pt/2017.
Invitation for a social game.
A social SPEAKER game… there might be other games (and awards) but that is still under NDA!
The speaker game has 1 objective: getting you the MAXIMUM social experience, getting you to LEARN about your fellow speakers, about the event and about THE ATTENDEES.
The winner(s) will be determined by the collection of the maximum number of points, and we shall announce the winner on Saturday evening… (the prize, again, is still under NDA). Here is how you can start collecting points on the game card you will receive at the airport/event:
- Get to know 5 people at the event and write down their name, company, IT area & specialty: 5 points
- Vote for your favorite Tuga IT 2017 speaker (you will need to attend that session): 1 point
- Write down the number of countries that Tuga IT 2017 speakers represent: 1 point
- Write down the number of countries that Tuga IT 2017 attendees represent: 1 point
- Name all Tuga IT 2016 sponsors: 1 point
- Name the person who was the most influential in your career: 1 point
- Name 3 IT Celebrities (MVPs/Speakers/PMs) who started outside of IT: 1 point
- Name the creation dates of the involved User Groups: 1 point per correct answer (there are 9 groups in total)
- Name the youngest speaker of Tuga IT 2017: 1 point
- Discover how many MVPs/Microsoft MVPs/Oracle Aces we have at Tuga IT 2017: 1 point
And even more for the social part:
It might be worth playing… you know… you never lose by trying.
Registration For TUGA IT 2017
TUGA IT 2017 will take place in Microsoft Portugal’s offices, in Lisbon, on May 18-20, 2017. It will feature 3 days of breakout sessions and full-day workshops by world-class speakers where IT Professionals can spend 3 amazing days checking the future of IT and also take the time to network with top-level speakers and other IT Professionals.
Registration for TUGA IT 2017 costs a few euros, or is even free if you do not require lunch (the fee is there to reduce waste and prevent having an abundance of food).
You can register here and I will see you there in Lisbon!
Author: Sandro Pereira
Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community. View all posts by Sandro Pereira
by Mick Badran | May 23, 2016 | BizTalk Community Blogs via Syndication
After a few customer engagements recently this topic has come up several times, so lean in closer and let's have a chat about it.
How is application infrastructure evolving to support digital business?
The great thing in this modern era is that businesses are placing great pressure on traditional IT, integrators and solution architects to innovate and look for that next disruptive edge.
With a pea-sized idea and access to rapidly evolving cloud technologies, businesses can disrupt even the most cemented industries. In fact many of my customer meetings are around disruption: “What/How can we disrupt today?”. The business climate has never been better, hence the birth of the phrase ‘digital business’.
Many companies that ‘push tin’ are scrambling to offer a range of other services, as they also see that the future is not in tin, and not so much in technical expertise (while I do love being part of a great team, cloud templates can make short work of our previous ‘cluster’ expert), but in being agile! Taking an idea, whether it's IoT or anything else, and realising the solution. In my opinion, this is the skill that will be much sought after in the market.
Integration is key here.

What about the application landscape? How has it changed?
Great question!
Figure 2 – Depicting the role of middleware
From what I've experienced, over the past 20-odd years application integration ‘layers’ (or middleware) were large, monolithic and usually cost a fair bit of $$. Associated with a software platform purchase was a good 9-18 month evaluation of the platform to see if it was ‘fit for purpose’.
(There were a whole series of VANs in this space as well: Value Added Networks, which made life simple by moving data/messages through a particular industry vertical. This was partly because the space was made complex by proprietary interfaces, and also because the VAN providers wanted customer lock-in. So naturally everything to do with these environments was ‘hard’, and we let the ‘experts’ deal with it.)
In these times many of the systems and applications all had custom ways to communicate. Software vendors reluctantly opened up ways of getting data in/out of their system. Communications standards were lacking, as well as message formats and protocols.
The fact that the middleware platforms communicated with a large number of systems, from DB2, SAP, OracleEB and JD Edwards through to Pronto (ERPs), made them attractive choices for businesses that saw the value in getting ‘end-to-end integration’ and a full 360-degree customer view.
Fast forward to present day…..
Software vendors are exploding at a rate of knots all over the web. Applications are more about functionality than specifically where they run: on-premises or cloud or… phone… or wherever. Applications today are modular and connected by well-known interfaces, although resiliency and interface workflows are lagging behind a little in this realm.
The software industry realises that mobile devices are king and builds applications accordingly. For example, the now commonly accepted message/data exchange of JSON/HTTP (aka REST) based APIs. I had one of my team members complain when an interface they were talking to was SOAP. I felt like I was talking about tape backups 🙂
Given that Mobile is taking charge and software is providing better foundations for ease of mobile development and operation, you could call it the mobile evolution….
REST-based APIs are accepted as the norm, with OpenID/OAuth as token-based authentication standards, allowing the beauty of delegated authentication (something that plagued many previous integration implementations in the quest for the elusive single sign-on capability).
Figure 3 – Illustrates the business need to maximise and provide a comprehensive set of APIs in which to monetise.
Businesses today are realising they don’t need mobile, they don’t need a website….they need an API!!! An API:
- to expose their business to public consumers
- to expose their business to down stream consumers
- to expose their business to upstream consumers
- to commercialise their offerings.
They now realise they can get away from building ‘yet another mobile app’ and focus squarely on turning data into information.
Software today needs to produce analytics cause gone are the days when we got excited at just being ‘connected’.
by Mick Badran | May 23, 2016 | BizTalk Community Blogs via Syndication
This topic has come up several times in recent customer engagements, so lean in closer and let’s have a chat about it.
How is application infrastructure evolving to support digital business?
The great thing in this modern era is that businesses are placing great pressure on traditional IT, integrators and solution architects to innovate and look for that next disruptive edge.
With a pea-sized idea and access to rapidly evolving cloud technologies, businesses can disrupt even the most cemented industries. In fact, many of my customer meetings are around disruption: “What/how can we disrupt today?”. The business climate has never been better, hence the birth of the phrase ‘digital business’.
Many companies that ‘push tin’ are scrambling to offer a range of other services, as they also see that the future is not in tin, and not so much in technical expertise (while I do love being part of a great team, cloud templates can make short work of our previous ‘cluster’ expert), but in being agile: taking an idea, whether it’s IoT or anything else, and realising the solution. In my opinion, this is the skill that will be most sought after in the market.
Integration is key here.

What about the application landscape? How has it changed?
Great question!
Figure 2 – Depicting the role of middleware
From what I’ve experienced over the past 20-odd years, application integration ‘layers’ (or middleware) were large, monolithic and usually cost a fair bit of $$. A software platform purchase was typically preceded by a good 9–18 month evaluation to see if the platform was ‘fit for purpose’.
(There was also a whole series of VANs – Value Added Networks – in this space, which made it simple to move data/messages through a particular industry vertical. This was partly because the space was made complex by proprietary interfaces, and partly because the VAN providers wanted customer lock-in. So naturally everything to do with these environments was ‘hard’, and we let the ‘experts’ deal with it.)
In those times, many systems and applications had custom ways to communicate. Software vendors reluctantly opened up ways of getting data in and out of their systems; communications standards were lacking, as were common message formats and protocols.
The fact that middleware platforms could talk to a large number of systems – from DB2, SAP, Oracle EBS and JD Edwards through to Pronto (ERPs) – made them attractive choices for businesses that saw the value in ‘end-to-end integration’ and a full 360-degree customer view.
Fast forward to present day…..
Software vendors are exploding at a rate of knots all over the web. Applications are more about functionality than about where they run – on-premises, cloud, phone, or wherever. Applications today are modular and connected by well-known interfaces, although resiliency and interface workflows are lagging behind a little in this realm.
The software industry realises that mobile devices are king, and it builds applications accordingly – for example, the now broadly accepted message/data exchange style of JSON-over-HTTP (aka REST) APIs. I had one of my team members complain when an interface they were talking to was SOAP; I felt like I was talking about tape backups 🙂
Given that mobile is taking charge and software is providing better foundations for ease of mobile development and operation, you could call it the mobile evolution…
REST-based APIs are accepted as the norm, with OpenID/OAuth as the token-based authentication standards, allowing the beauty of delegated authentication (the lack of which plagued many previous integration implementations in their quest for the elusive single sign-on capability).
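To make the JSON-over-HTTP plus token-based authentication pattern concrete, here is a minimal Python sketch using only the standard library. The endpoint URL, payload and access token are hypothetical, purely for illustration; in a real delegated-authentication flow the client would first obtain the token from an OAuth authorisation server rather than hard-coding it.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- illustrative only.
API_URL = "https://api.example.com/v1/orders"
ACCESS_TOKEN = "example-oauth-access-token"

def build_order_request(order: dict) -> urllib.request.Request:
    """Prepare a JSON-over-HTTP (REST) request carrying an OAuth bearer token."""
    body = json.dumps(order).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            # Delegated authentication: the token was issued by an
            # OAuth authorisation server, not by the API itself.
            "Authorization": f"Bearer {ACCESS_TOKEN}",
        },
    )

req = build_order_request({"sku": "ABC-123", "qty": 2})
print(req.get_method())                  # POST
print(req.get_header("Authorization"))   # Bearer example-oauth-access-token
```

The point of the pattern is that the API never sees the user’s credentials – it only validates the bearer token – which is exactly the delegated single sign-on story that older point-to-point integrations struggled to achieve.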
Figure 3 – Illustrates the business need to maximise, and monetise, a comprehensive set of APIs.
Businesses today are realising they don’t need mobile, they don’t need a website… they need an API! An API:
- to expose their business to public consumers
- to expose their business to downstream consumers
- to expose their business to upstream consumers
- to commercialise their offerings.
They now realise they can get away from building ‘yet another mobile app’ and focus squarely on turning data into information.
Software today needs to produce analytics, because gone are the days when we got excited at just being ‘connected’.