BizMan, The BizTalk Server SuperHero Sticker

Let me tell you the story behind BizMan, The BizTalk Server SuperHero Sticker. On 12th May, during my session at the Integrate 2016 event in London about “A new set of BizTalk Server Tips and Tricks”, I announced that I had two versions of a BizTalk Server 2016 sticker to offer – probably some of the first BizTalk stickers ever – with the mythic phrase “The T-Rex is loose“ to celebrate the release of BizTalk Server 2016: one of them, of course, with a badass T-Rex and the other with a “dear”/“sweet” T-Rex.

(Sticker images: “The T-Rex is loose” – badass version and sweet version)

You need to remember that, to commemorate the first release ever – BizTalk Server 2000 (on 12/12/2000) – the BizTalk Server marketing folks designed a “killer” mouse pad for the product team with the phrase “The T-Rex is loose”.

(Original photo from Gijs in ‘t Veld)

You can read more about it on Gijs in ‘t Veld’s blog: Happy 12th birthday BizTalk Server, The T-Rex!

Of course, BizTalk people love them… Who doesn’t like T-Rex? Who doesn’t like BizTalk?

  • Well, to answer the first question: I think all of us like the T-Rex… they don’t exist anymore, they appear in so many movies, they are so cool, they are so huge, and “Rex” means “king” in Latin, by the way… so the tyrant king is simply badass!
  • To answer the second question: many people, for many reasons, simply don’t understand the product (but wish to have all of its features for free), or they simply do not realize what enterprise integration is (but that is a different topic that I will not go into detail on here).

And some of the feedback, if we can call it that, that I received from this group of people (the ones that don’t like BizTalk Server – probably the same ones that are always saying BizTalk is dead) was more or less like this:

  • It is an old product, obsolete like the dinosaur

BizTalk has never been as alive as it is today; we are actually seeing the PRO INTEGRATION team at Microsoft investing heavily in the product, not only supporting new platform updates but actually bringing new capabilities to the product at a faster pace. So, my response to this group of people and to this type of comment is the creation of a new sticker: The BizTalk Server SuperHero: THE BIZMAN!

If you want to add this sticker to your laptop or anywhere else, you just need to download the zip file below and send it to a graphic shop. It is in the perfect size/resolution for printing.

01-I-Still-love-T-Rex

I decided to call it BizMan, but there were plenty of other amazing suggestions:

  • HyperBiz
  • Integrator-SuperTalk
  • Terminator
  • MessageBox-Mike
  • BizMan
  • BizTolkien-Hybrido

Hope you enjoy!

Special thanks to my two coworkers at DevScope: Frederico Junqueira, the artist and creator of the BizMan, The BizTalk Server SuperHero design, and António Lopes for giving the final touches and helping with everything related to the graphics.

You can download BizMan, The BizTalk Server SuperHero sticker from:

BizTalk Server SuperHero Sticker: BizMan (10,1 MB)
Microsoft | TechNet Gallery

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Custom CRM Portals Super-Charged by an Azure Data Layer

I wanted to talk a little about the architecture I designed recently for a Dynamics CRM + Portal + Integration project. In the initial stages of the project a number of options were considered for a Portal (or group of portals) which would support staff, students and other users and which would integrate with Dynamics CRM and other applications in the application estate. One of the challenges I could see coming up in the architecture was the level of coupling between the Portal and Dynamics CRM. I’ve seen this a few times, where an architecture has been designed in which the portal is directly querying CRM and has the CRM SDK embedded in it, which is obviously a highly coupled integration between the two. What I think is a far bigger challenge, however, is the fact that CRM Online is a SaaS application and you have very little control over the tuning and performance of CRM.

Let’s imagine you have 1000 CRM user licenses for staff and back-office users. CRM is going to be your core system of record for customers, but you want to build systems of engagement to drive a positive customer experience, and creating a Portal which can communicate with CRM is a very likely scenario. When you buy your 1000 licenses from Microsoft you are going to be given the infrastructure to support the load from 1000 users. The problem, however, is that your CRM portal being tightly coupled to CRM may introduce another set of users on top of the 1000 back-office users. What’s going to happen when you have 50,000 students, or thousands/millions of customers, starting to use your portal? You now have a problem that CRM may become a bottleneck to performance, but because it’s SaaS you have almost no options to scale your system up or out.

With this kind of architecture you have the choice to roll your own portal using .NET and either Web API or CRM SDK integration directly to CRM. There are also options to use products like ADXStudio, which can help you build a portal too. The main reason these options are very attractive is that they are probably the quickest to build and minimize the number of moving parts. From a productivity perspective they are very good.

An illustration of this architecture could look something like the below:

What we were proposing to do instead was to leverage some of the powerful features of Azure to build an architecture for a Portal which was integrated with CRM Online and other applications and which would scale to a much higher user base without causing performance problems in CRM. Note that problems in CRM could create a negative experience for Portal users, but could also significantly affect the performance of staff in the back office if CRM was running slow.

To achieve this we decided that using asynchronous approaches with CRM and hosting an intermediate data layer in Azure would allow us, at a relatively low cost, to have a much faster and more scalable data layer to base the core architecture on. We would call this our cloud data layer; it would sit behind an API for consumers but be fed with data from CRM and other applications which were both on-premise and in the cloud. From here the API was to expose this data to the various portals we may build.

The core idea was that the more we could minimize the use of RPC calls to any of our SaaS or on-premise applications, the better we would be able to scale the portal we would build, and at the same time the more resilient it would be to any of the applications going down.

Hopefully at this point you have an understanding of the aim and can visualise the high level architecture. I will next talk through some of the patterns in the implementation.

Simple Command from Portal

In this pattern we have the scenario where the portal needs to send a simple command for something to happen. The below diagram shows how this works.

Let’s imagine a scenario of a user in the portal adding a chat comment to a case.

The process for the simple command is:

  1. The portal will send a message to the API, which will do some basic processing and then offload the message to a Service Bus topic
  2. The topic allows us to route the message to many places if we want to
  3. The main subscriber is a Logic App which will use the CRM connectors to interact with the appropriate entities and create the chat comment as an annotation in CRM

This particular approach is pretty simple and the interaction with CRM is not overly complicated, so it is a good candidate for a Logic App to process the message.
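As an indication of what the API side of this pattern could look like, here is a minimal sketch of offloading the command to a Service Bus topic; the topic name, message shape and the Microsoft.Azure.ServiceBus client are my own assumptions rather than details of the actual solution:

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Newtonsoft.Json;

// Called by the API controller after basic validation: the command is not
// processed here, it is simply offloaded to a Service Bus topic so the
// Logic App (and any other subscribers) can pick it up asynchronously.
public static class CommandPublisher
{
    public static async Task PublishAddChatCommentAsync(
        string serviceBusConnectionString, string caseId, string comment)
    {
        // Hypothetical topic name for portal commands.
        var client = new TopicClient(serviceBusConnectionString, "portal-commands");

        var body = JsonConvert.SerializeObject(new { caseId, comment, type = "AddChatComment" });
        var message = new Message(Encoding.UTF8.GetBytes(body))
        {
            // Subscriptions can filter on this label to route the command.
            Label = "AddChatComment"
        };

        await client.SendAsync(message);
        await client.CloseAsync();
    }
}
```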

Complex Command from Portal

In some cases the portal would publish a command which requires a more complex processing path. Let’s imagine a scenario where the customer or student raises a case from the portal. In this scenario the processing could be:

  1. Portal calls the API to submit a case
  2. API drops a message onto a service bus topic
  3. BizTalk picks up the message and enriches it with additional data from some on-premise systems
  4. BizTalk then updates some on-premise applications with some data
  5. BizTalk then creates the case in CRM

The below picture illustrates this scenario.

In this case we chose to use BizTalk rather than Logic Apps to process the message. I think, as a general rule, the more complex the processing requirements, the more I would tend to lean towards BizTalk rather than Logic Apps. BizTalk’s support for more complex orchestration, compensation approaches and advanced mapping just lends itself a little better in this case.

I think the great thing in the Microsoft stack is that you can choose from the following technologies to implement the above two patterns behind the scenes:

  • Web Jobs
  • Functions
  • Logic Apps
  • BizTalk

Each have their pro’s and con’s which make them suit different scenarios better but also it allows you to work in a skillset your most comfortable with.

Cloud Data Layer

Earlier in the article I mentioned that we have the cloud data layer as one of our architectural components. In some ways this follows the CQRS pattern to a degree, but we are not specifically implementing CQRS for the entire system. Data in the cloud data layer is owned by some other application and we are simply choosing to copy some of it to the cloud so it is in a place which will allow us to build better applications. Exposing this data via an API means that we can leverage a data platform based on Cosmos DB (Document DB), Azure Table Storage and Azure Blob Storage.

If you look at Cosmos DB and Azure Storage, they are all very easy to use and get up and running with, but the other big benefit is that they offer high performance if used right. By comparison we have little control over the performance of CRM Online, but with Cosmos DB and Azure Storage we have lots of options over the way we index and store data to make them suit a high-performing application without all of the baggage CRM would bring with it.

The main differences in how we use these data stores to make a combined data layer are:

  • Cosmos DB is used for a small amount of metadata related to entities to aid complex searching
  • Azure Table Storage is used to store related info for fast retrieval through good partitioning
  • Azure Blob Storage is used for storing larger JSON objects

Some examples of how we may use this would be:

  • In an Azure table a student’s courses, modules, etc. may be partitioned by the student id so it is fast to retrieve the information related to one student (see the sketch after this list)
  • In Cosmos DB we may store info to make advanced searching efficient and easy, for example to find all of the students who are on course 123
  • In Blob Storage we may store objects like the details of a KB article, which might be a big dataset. We may use Cosmos DB to search for KB articles by keywords and tags but then pull the detail from Blob Storage
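As a rough sketch of the Azure Table part of this, here is what a partition query for one student could look like; the table name, entity shape and use of the older Microsoft.WindowsAzure.Storage SDK are my own assumptions, not details from the project:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical entity: one row per course enrolment, partitioned by student id
// so that all rows for a single student can be read with one fast partition query.
public class StudentCourseEntity : TableEntity
{
    public StudentCourseEntity() { }

    public StudentCourseEntity(string studentId, string courseId)
    {
        PartitionKey = studentId;   // e.g. "S123456"
        RowKey = courseId;          // e.g. "COURSE-123"
    }

    public string CourseName { get; set; }
}

public static class CloudDataLayer
{
    public static async Task<IEnumerable<StudentCourseEntity>> GetCoursesForStudentAsync(
        string connectionString, string studentId)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var table = account.CreateCloudTableClient().GetTableReference("StudentCourses");
        await table.CreateIfNotExistsAsync();

        // Partition scan: cheap because all of a student's rows live in one partition.
        var query = new TableQuery<StudentCourseEntity>().Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, studentId));

        var results = new List<StudentCourseEntity>();
        TableContinuationToken token = null;
        do
        {
            var segment = await table.ExecuteQuerySegmentedAsync(query, token);
            results.AddRange(segment.Results);
            token = segment.ContinuationToken;
        } while (token != null);

        return results;
    }
}
```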

CRM Event to Cloud Data Layer

Now we understand that queries of data will not come directly from CRM but instead go via an API which exposes an intermediate data layer hosted on Azure. The question is how this data layer is populated from CRM. We will use a couple of patterns to achieve this, the first of which is event based.

Imagine that in CRM, each time an entity is updated (etc.), we use the CRM plugin for Service Bus to publish that event externally. We can then subscribe to the queue and, with the data from CRM, look up additional entities if required and then transform and push this data somewhere. In our architecture we may choose to use a Logic App to collect the message. Let’s imagine a case was updated. The Logic App may then use info from the case to look up related entity data such as a contact and other similar entities. It will build up a canonical message related to the event and then store it in the cloud data layer.

Let’s imagine a specific example. We have a knowledge base article in CRM. It is updated by a user and the event fires. The Logic App will get the event and look up the KB article. The Logic App will then update Cosmos DB with the metadata of the article for searching by apps. The Logic App will then transform the various related entities to a canonical JSON format and save them to Blob Storage. When the application searches for KB articles via the API it will, under the hood, be retrieving data from Cosmos DB. When it has chosen a KB article to display, it will retrieve the KB article details from Blob Storage.
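If that last “write to the cloud data layer” step were implemented in code (for example in an Azure Function called by the Logic App) rather than purely with connectors, it might look roughly like the sketch below; the database, collection and container names are hypothetical:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents.Client;
using Microsoft.WindowsAzure.Storage;

public static class KbArticleWriter
{
    public static async Task SaveAsync(
        string cosmosEndpoint, string cosmosKey, string storageConnectionString,
        string articleId, string title, string[] keywords, string canonicalJson)
    {
        // 1. Upsert the small, searchable metadata document into Cosmos DB (Document DB).
        using (var documentClient = new DocumentClient(new Uri(cosmosEndpoint), cosmosKey))
        {
            var collectionUri = UriFactory.CreateDocumentCollectionUri("clouddata", "kbarticles");
            await documentClient.UpsertDocumentAsync(
                collectionUri,
                new { id = articleId, title, keywords });
        }

        // 2. Store the full canonical JSON payload as a blob for cheap retrieval by id.
        var container = CloudStorageAccount.Parse(storageConnectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("kb-articles");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlockBlobReference($"{articleId}.json");
        await blob.UploadTextAsync(canonicalJson);
    }
}
```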

The below picture shows how this pattern will work.

CRM Entity Sync to Cloud Data Layer

One of the other ways we can populate the cloud data layer from CRM is via a job that copies data. There are a few different ways this can be done. The main way involves executing a FetchXML query against CRM to retrieve all of the records from an entity, or all of the records that have changed recently. They will then be pushed over to the cloud data layer and stored in one of the data stores, depending on which is used for that data type. It is likely there will be some form of transformation on the way too.

An example of where we may do this is if we had a list of reference data in CRM, such as the nationalities of contacts. We may want to display this list in the portal but without querying CRM directly. In this case we could copy the list of entities from CRM to the cloud data layer on a weekly basis, copying the whole table. There are other cases where we may copy data more frequently, and we may use different data stores in the cloud data layer depending upon the data type and how we expect to use it.
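To illustrate the query side of such a sync job, a scheduled process could run a FetchXML query through the CRM SDK along these lines; the nationality attribute name and the 7-day change window are made up for the example:

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class CrmEntitySync
{
    // Retrieves contacts whose records changed in the last 7 days,
    // returning only the id and the (illustrative) nationality attribute.
    public static EntityCollection GetRecentlyChangedNationalities(IOrganizationService service)
    {
        string fetchXml = @"
            <fetch>
              <entity name='contact'>
                <attribute name='contactid' />
                <attribute name='new_nationality' />
                <filter>
                  <condition attribute='modifiedon' operator='last-x-days' value='7' />
                </filter>
              </entity>
            </fetch>";

        return service.RetrieveMultiple(new FetchExpression(fetchXml));
    }
}
```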

The below example shows how we may use BizTalk to query some data from CRM and then we may send messages to table storage and Cosmos DB.

Another way we may solve this problem is by using Data Factory in Azure. In Data Factory we can build a more traditional ETL-style interface where we copy data from CRM using the OData feeds and download it into the target data stores. The transformation and advanced features in Data Factory are a bit more limited, but in the right case this can be done, as in the below picture.

These data synchronisation interfaces tend to work best with data that doesn’t change very often and data for which you don’t need a real-time event to update it. While I have mentioned Data Factory and BizTalk as the options we used, you could also use SSIS, custom code with a web job, or other options to implement it.

Summary

Hopefully the above approach gives you some ideas of how you can build a high-performing portal which integrates with CRM Online and potentially other applications. By using a slightly more complex architecture, which introduces asynchronous processing in places and CQRS in others, you can create a decoupling between the portal(s) you build and CRM and other back-end systems. In this case it has allowed us to introduce a data layer in Azure which will scale and perform better than CRM, and which also gives us significant control over things rather than having a bottleneck on a black box outside of our control.

In addition to the performance benefits, it is also potentially possible for CRM to go completely offline without bringing down the portal, with only a minimal effect on functionality. While the cloud data layer could still have problems, firstly it is much simpler, and it is also built on services which can easily be made geo-redundant, reducing your risks. A practical example of this: if CRM was offline for a few hours while a deployment was performed, I would not expect the portal to be affected except for a delay in processing messages.

I hope this is useful for others and gives people a few ideas to think about when integrating with CRM.

Agile integration with Microsoft Azure and BizTalk Server

Agile is a word I really like to use now in the integration space; at my last event, Integrate 2017, I started to present this concept of mine.
As we can see, business dynamics are changing very fast; companies increasingly require fast integration, and sending and integrating data easily and quickly is now a key requirement.

There are many options now to achieve that; however, some of them are better in specific situations than others.

During my last event, I presented how I approach agile integration using Microsoft Azure, with some examples using BizTalk Server as well.

Following these concepts, I started a new project called Rethink121. I like the idea of rethinking technologies in different ways, and this is what my customers appreciate most.

Most of the time we start using a technology following the messaging provided by the vendor, without exploring other new possibilities and ways; I think it is important to evaluate a technology the way a little kid evaluates his first toy.

During the session, I showed some scenarios such as how to get fast and easy hybrid integration, BizTalk performance, and concepts like agile and dynamic integration and cognitive integration.

Many people were impressed by the demos, and others asked me many questions about it; a new concept and a new view always create a lot of questions because they stimulate creativity and curiosity.

During the session, I showed a sample BizTalk solution sending a flat file between a REST and a WCF endpoint, using a pipeline and a map to send and receive flat data between the endpoints.
The process executed by BizTalk Server was able to achieve that in 80 milliseconds for a single message in request-response, and I showed how it is possible to achieve real-time performance in BizTalk Server without changing the solution, reusing all the artefacts as they are.

To do that I used my framework named SnapGate, which can be installed inside BizTalk Server and improves its performance.

The time spent executing the single request-response using SnapGate was around 4 milliseconds, quite an impressive result.

I showed the generic BizTalk adapter, which is able to extend the BizTalk integration capabilities without any limit; the adapter can be extended using PowerShell or even simple .NET code in 5 minutes.

I explained the concept of agile integration, how I approach it and how I use the technologies; I like to map the BizTalk Server architecture to Microsoft Azure to better explain that.

During the demo, I showed a scenario demonstrating how to send and integrate data across the world in a very fast way. The key points of this demo were:

  • The system was able to integrate data and create new integration points in real time
  • The system was using Power BI in real time, with normal graphs and Power BI features updating in real time.
  • The system was learning by itself how to integrate these points, and we didn’t need to physically deploy our mediation stack.

Starting from a single integration point in London…

I started a new integration point in Birmingham, and the integration point in London started synchronising the adaptation layer with Birmingham and exchanging messages with it.

I started several more to show how easy it can be to integrate new points and exchange data between them without caring about deploying any new feature; the system learns by itself. We can call this cognitive integration.

There are so many innovative aspects to consider here: the possibility to have fast hybrid integration at very low cost, to integrate data quickly and easily, and to integrate Power BI in real-time data analytics scenarios.

One last thing: many people ask me about BizTalk Server and its future. Well, the list below shows the technologies I most like to use and how; I showed this slide during the session.

There is one thing only I can say, BizTalk Server is S.O.L.I.D.

I will explain agile and cognitive integration in more detail in my next Integration Monday session in September; see you there.

Author: Nino Crudele

Nino has deep knowledge and experience delivering world-class integration solutions using all the Microsoft Azure stacks and Microsoft BizTalk Server, and he has delivered world-class integration solutions using and integrating many different technologies such as AS2, EDI, RosettaNet, HL7, RFID and SWIFT.

A lap around Azure Functions, go serverless!

Serverless is hot and happening. It is not just a buzzword, but an interesting new part of computer science, which is amazing and also a driver of the second machine age we are currently experiencing. I recently read two books back to back: Computer Science Distilled and The Second Machine Age.

The first book deals with the concepts of computer science, and a few aspects of it caught my attention, like breaking a problem into smaller pieces. In Azure I could use functions to solve part of a complete problem or process parts of a large workload. The second book discusses the second machine age around automation, robotics, artificial intelligence and so on. And little repetitive tasks can be built using Functions (Azure Functions, to be precise) to automate them. So why not consolidate my little research into the current state of Azure Functions into a blog post, with the context of both books in the back of my mind.

Serverless

Serverless computing is a reality, and Microsoft Azure provides several platform services that can be provisioned dynamically. Resources are allocated without you worrying about scale, availability and security. And the beauty of it all is that you only pay for what you use.

Azure Functions is one of Microsoft’s serverless capabilities in Azure. Functions enable you to run pieces of code in Azure. Cool eh! And they can run independently, in an orchestration or flow (durable functions), or as part of a Logic App definition or Microsoft Flow.

You provision a Function App, which acts as a container for one or more functions. Subsequently, you either attach an App Service plan to it, when you want to share resources with other services like a web app, or you choose a consumption plan (pay as you go).

Finally, you have the Function App available and you can start adding functions to it, either using Visual Studio, which has templates for building a function, or using the Azure Portal (browser). Both provide features to build and test your function; however, Visual Studio will give you IntelliSense and debugging features.

Function Types

Functions can be built using your language of choice, like C#, F#, JavaScript, or Node.js. Furthermore, there are several types of functions you can build, such as a WebHook + API function or a trigger-based function. The latter can be used to integrate with the following Azure services and SaaS solutions:

  • Cosmos DB
  • Event Hubs
  • Mobile Apps (tables)
  • Notification Hubs
  • Service Bus (queues and topics)
  • Storage (blob, queues, and tables)
  • GitHub (webhooks)
  • On-premises (using Service Bus)
  • Twilio (SMS messages)

The integration is based upon bindings and triggers, key concepts of Azure Functions. Bindings provide a way to connect to the inputs and outputs of the earlier mentioned services and solutions; see Azure Functions triggers and bindings concepts.

WebHook + API function

A popular quick start template for Azure Functions is the WebHook + API function. This type of function is supported through the HTTP/WebHook binding and enables you to build autonomous functions that can be (re)used in various types of applications, like a Logic App.

After provisioning a Function App you can add a function easily. As shown below you can select a premade function, choose CSharp and click Create this function.

A function named HttpTriggerCSharp1 will be made available to you. The sample is easy to experiment with; I changed the given function to something new, like the screenshot below.
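The screenshot is not reproduced here; as a reference point, the out-of-the-box HttpTriggerCSharp template is a C# script (run.csx) that looks roughly like this, and the change was simply a small variation on it:

```csharp
using System.Linq;
using System.Net;
using System.Threading.Tasks;

// HTTP-triggered C# script function: reads a "name" from the query string
// or the JSON body and returns a greeting.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Try the query string first...
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // ...then fall back to the request body.
    dynamic data = await req.Content.ReadAsAsync<object>();
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, $"Hello {name}, greetings from Azure Functions!");
}
```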

And now it gets interesting. You can click Get Function URL, as the function is publicly accessible – that is, if you know the function key. By clicking Get Function URL you’ll receive a URL that looks like this:

https://myfunctioncollection.azurewebsites.net/api/HttpTriggerCSharp1?code=iaMsbyhujlIjQhR4elcJKcCDnlYoyYUZv4QP9Odbs4nEZQsBtgzN7Q==

The code parameter is the default function key, which you can change through the Manage pane in the Function App blade.

Since your function is accessible, you can call it using, for instance, Postman.

The screenshot above shows an example of a call to the function endpoint. The request includes the function key (code). However, a call like the above might not be as secure as you need. Hence, you can secure the function endpoint by using the API Management service in Azure; see the Using API Management to protect Azure Functions (Middleware Friday) blog post, which explains how to do that in a more secure way.

Integrate and Monitor

You can bind Azure Storage as an extra output channel for a function. Through the Integrate pane I can add an extra output to the function. Configure the new output by choosing Azure Blob Storage, set Storage Account Connection and specify the path.

Next you have to update the function signature with an outputBlob parameter and implement the outputBlob.
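A minimal sketch of that signature change, assuming the new output binding was named outputBlob (note that this kind of output parameter requires a non-async signature in C# script, and the blob content written here is just an illustration):

```csharp
using System;
using System.Linq;
using System.Net;

// HTTP trigger with an additional blob output binding named "outputBlob".
public static HttpResponseMessage Run(HttpRequestMessage req, out string outputBlob, TraceWriter log)
{
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Whatever is assigned here is written to the blob path configured on the binding.
    outputBlob = $"{{ \"greeted\": \"{name}\", \"at\": \"{DateTime.UtcNow:o}\" }}";

    return req.CreateResponse(HttpStatusCode.OK, $"Hello {name}");
}
```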

Finally, you can monitor your functions through the Monitor pane, which provides you with some basic insights (logs). For a richer monitoring experience, including live metrics and custom queries, Microsoft recommends using Azure Application Insights. See also Monitoring Azure Functions.

Visual Studio Experience

Azure Functions can be built with Visual Studio. However, the templates are not available after a default installation of Visual Studio; you need to download them. For Visual Studio 2017 the templates for Azure Functions are available on the marketplace. For Visual Studio 2015 read this blog post, which includes the steps I followed for my Visual Studio 2015 installation.

Once the templates are available in your Visual Studio version (2015 or 2017) you can create a Function App project. Within the created Function App project you can add functions: right-click the project and select Add –> New Azure Function. Now you can choose what type of function you want to build. You will have a similar experience as with the portal.

For instance you can create a ServiceBusTrigger Function (WindSpeedToBeaufort), which will be triggered once a message arrives on a queue (myqueue).

As a result you will see the following code once you hit Create:
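The generated code is only shown as a screenshot in the original post; the C# script template for a Service Bus queue trigger is essentially this:

```csharp
// Default ServiceBusTrigger template: runs whenever a message lands on "myqueue".
public static void Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}
```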

Now let’s work on the function so it will resemble the diagram below:

To modify the function to do the above, the necessary code is shown below:
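The modified code is also only shown as an image in the original post. A minimal sketch, assuming the function reads a wind speed in m/s from the queue message and converts it to a Beaufort number using the empirical relation v ≈ 0.836 · B^1.5:

```csharp
using System;
using System.Globalization;

// Converts an incoming wind speed (m/s) from the queue into a Beaufort number and logs it.
public static void Run(string myQueueItem, TraceWriter log)
{
    double windSpeed = double.Parse(myQueueItem, CultureInfo.InvariantCulture);

    // Empirical relation: v = 0.836 * B^(3/2)  =>  B = (v / 0.836)^(2/3), capped at 12.
    int beaufort = (int)Math.Min(12, Math.Round(Math.Pow(windSpeed / 0.836, 2.0 / 3.0)));

    log.Info($"Wind speed {windSpeed} m/s is Beaufort {beaufort}");
}
```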

And the settings JSON file needs to be renamed to local.settings.json, and the function.json needs to be modified to:

The connection string is moved to local.settings.json, as depicted below:
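The two files are shown as screenshots in the original post. As an indication only (treat the exact setting names as assumptions), the binding and local settings could look like this:

```jsonc
// function.json – binds the function to the Service Bus queue
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "myqueue",
      "connection": "connection",
      "accessRights": "Manage"
    }
  ],
  "disabled": false
}

// local.settings.json – holds the connection strings for local runs
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "AzureWebJobsDashboard": "<storage connection string>",
    "connection": "<Service Bus namespace connection string>"
  }
}
```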

Above all, this change is important; otherwise you will run into errors.

Debugging with Visual Studio

Visual Studio provides the capability to debug your custom function. Compile and start a debug instance. A command line dialog box will appear and your function is running (i.e. hosted and running).

To debug our function in this blog a message is sent to myqueue using the ServiceBus360 service.

Once the message arrives at the queue it will trigger the function. Hence, the debugging can start on the position in the code, where a breakpoint has been set.

And the result of execution will be visible in the command line dialog box:

In conclusion, this is the debugger experience you will have with Visual Studio, combined with having IntelliSense while developing your function.

Deployment

You have built and tested your function to your satisfaction in Visual Studio. Now it’s time to deploy it to Azure, so you right-click the project and choose Publish. A dialog will appear and you can choose App Service. Subsequently, if you are logged in with your Azure credentials you will see, based on the subscription, one or more resource groups.

You can click OK and proceed with next steps to publish your function to the desired resource group –> function app. However, this will in the end not work!

As a result you will need a workaround, as explained in Publishing a .NET class library as a Function App; at least that’s what I found online. I was able to deploy it; however, I stumbled on another error in the portal:

Error:

Function ($WindSpeedToBeaufort) Error: Microsoft.Azure.WebJobs.Host: Error indexing method ‘Functions.WindSpeedToBeaufort’. Microsoft.Azure.WebJobs.ServiceBus: Microsoft Azure WebJobs SDK ServiceBus connection string ‘AzureWebJobsconnection‘ is missing or empty.

Hence, not a truly positive experience. In the end it was missing a setting, i.e. an application setting of the Function App.

Anyway, another workaround is to add a new function to the existing Function App: choose the ServiceBusTrigger template, create it, and finally copy the code from the local project into the template over the existing code. This works, as now you see a setting for the Service Bus connection string in the application settings and the reference in the function.json file.

Considerations

There are some considerations around Azure Functions you need to be aware of. First of all, the cost of execution, which determines whether you will choose a consumption or an App Service plan. See Functions pricing and use the calculator to get a better indication of costs. Also consider some of the best practices around functions. These practices are:

  • Azure Functions should do just one task,
  • finish as quickly as possible,
  • be stateless
  • and be idempotent.

See also Optimize the performance and reliability of Azure Functions.

Finally, be aware of the fact that some features of Azure Functions are still preview like Proxies, Slots and the Visual Studio Tools.

Resources

This blog contains several links to resources you might like to explore. An excellent starting point for researching Azure Functions is https://github.com/Azure/Azure-Functions. And if you are interested in how Functions can play a role in Logic Apps, have a look at this blog post: Building sentiment analysis solution with Logic Apps.

Explore Azure Functions, go serverless!

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field, combining them with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.

Integrating Microsoft Teams as a Notification channel in BizTalk360

Recently, in one of our support tickets, a customer enquired whether Microsoft Teams as a notification channel would be implemented in upcoming releases, as he had heard of it at the INTEGRATE 2017 event, where Saravana introduced ServiceBus360 and Teams was one of the notification channels there.

Therefore, I thought I could provide an alternative workaround to achieve the same in BizTalk360. This often happens in Support: if we don’t have the functionality currently in the product, we strive to provide similar working functionality by discussing it with the development team. Here is my implementation of the feature request using Logic Apps & a WebHook Notification Channel.

Create the Channel in Microsoft Teams

Once the Team has been given a suitable name and has been successfully created, we can create a new channel for that Team (click the … near the newly created Team and choose ‘Add channel’).

Once the Channel has been created we can use the Team name & Channel name in the Azure portal as the destination for the Post message (Teams) action.

This can be achieved via Logic Apps or by creating a custom Notification channel. We will have a quick look at both the implementations.

1. Implementation via Logic Apps

Configuration in Logic Apps:

So I created a Demo Logic App, and here is a screenshot of the design used.

I’ve used a Request-Response and added an Azure function to help Parse the JSON response received from the BizTalk360 Notification channel and then passed that composed message to a Post message action for Microsoft Teams. Azure will ask you to authenticate your login (Teams) and then allow you to select the specific Team and Channel from Microsoft Teams.

In the first Request, you will also need to supply the Request Body JSON Schema or use a sample payload to generate the schema. Please refer to this code for the schema I used.

Azure Function Code implementation

To access the code used for the Azure Function, please see this code on the GitHub website.

Configuring BizTalk360 WebHook Notification Channel for the Logic App

Please refer to this article which describes how to set up Webhook notification Channel.

https://assist.biztalk360.com/support/solutions/articles/1000245561-adding-a-webhook-notification-channel

You can get the URL for the Web API from the Logic App – refer to the screenshot provided earlier; the arrows identify where to get the URL from. Use that when configuring the BizTalk360 webhook notification channel.

Once the WebHook Notification Channel is configured, you can select it as the notification option in the specific Alarm.

Receiving the Notifications

Once the Threshold is violated, similar to the Email notifications, you will now see a notification in Microsoft Teams.

While I have parsed the JSON message and only displayed the Application Name & Artifact Name that has the error, you can choose and customize your error messages as required.

2. Implementation via Custom Notification Channel

You can read these articles which show how to create a custom notification channel.

https://blogs.biztalk360.com/introduction-custom-notification-channel-sdk-biztalk360/

https://assist.biztalk360.com/support/solutions/articles/1000217940-adding-a-new-custom-notification-channel

You need to select the (Incoming) WebHook connector from Microsoft Teams and copy the WebHook URL, which you will then enter in the code in the custom notification channel.

You then need to setup the Custom Notification Channel as mentioned in the blogs.

Then you only need to add this code to the FileChannel.cs file, either instead of or after the successful completion of the file notification. Again, I have only output the Alarm Name and an Error string; please customize as required.
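The referenced code is shown as a screenshot in the original post. As an indication only, posting a simple text payload to a Teams incoming webhook from the channel code could look roughly like this; the webhook URL and message fields are placeholders:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class TeamsNotifier
{
    // Paste the incoming webhook URL copied from the Teams connector configuration.
    private const string WebhookUrl = "https://outlook.office.com/webhook/<your-webhook-id>";

    // Posts a simple text payload to the Teams channel behind the webhook.
    public static async Task PostAlertAsync(string alarmName, string errorText)
    {
        using (var client = new HttpClient())
        {
            var payload = "{ \"text\": \"BizTalk360 Alarm: " + alarmName + " - " + errorText + "\" }";
            var content = new StringContent(payload, Encoding.UTF8, "application/json");
            await client.PostAsync(WebhookUrl, content);
        }
    }
}
```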

So I hope this blog gave you a good idea as to how you can integrate Teams with BizTalk360 Notifications.

Microsoft Integration Weekly Update: July 17

Do you find it difficult to keep up to date on all the frequent updates and announcements in the Microsoft Integration platform?

Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

Feedback

Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


Middleware Friday turns 6 Months

Yes, it has been 6 months since we launched Middleware Friday! Saravana announced the launch of Middleware Friday in the first week of this year. After the fair success of the Integration Monday events under the Integration User Group, Saravana and Kent had the idea to experiment with this concept of short video blogs on interesting Integration topics and interviews with industry leaders. In a larger view, Middleware Friday is very much like a short news update on the latest Integration trends, posted every Friday.

Six Months and 26 sessions later…

We’ve had 23 sessions from Kent Weare and 3 guest sessions from Steef-Jan Wiggers (both of them are eminent MVP’s, and active community members) till date, with topics ranging from Logic Apps, Power Apps, Service Bus, Power BI, Cognitive Services, Serverless Integration, HTTP Connectors, API Management and lots more. We would like to extend our thanks to the speakers and attendees for making the #MiddlewareFriday sessions a success. We’re not done yet! One of the hardest parts in conducting these kinds of video logs is being short and yet deliver fair technical output that would help the viewer. We are very pleased with the outcome so far.

Protecting Azure Logic Apps with Azure API Management

Azure Logic Apps and Service Bus Peek-Lock

Logic Apps and Cognitive Services Face API – Part 1

Microsoft PowerApps and Cognitive Services Face API – Part 2

Serverless Integration

Logic Apps and Power BI Real-Time Data Sets

Azure Monitoring, Azure Logic Apps and Creating ServiceNow Tickets

Monitoring Azure Service Bus Queues and Topics using ServiceBus360

Azure Logic Apps and SAP – Part 1

Azure Logic Apps and SAP – Part 2

Azure Logic Apps and HTTP Connector

Using API Management to protect on-premises BizTalk endpoints

Global Integration Bootcamp – March 25, 2017, New York

Using API Management to protect Azure Functions

Introduction to Azure Functions Proxies

Azure Logic Apps and Azure EventHubs

BizTalk Server 2016 First Look

BizTalk Server 2016 + Logic Apps – Thunder and Lightning

BizTalk Server 2016 Feature Pack 1

Task Management Face off with Logic Apps and Flow

Austin City Limits with Stephen W. Thomas

Azure Logic Apps – Retry Policy

Azure Logic Apps – Azure Active Directory Connector

Azure Event Hubs: Auto-Inflate

INTEGRATE 2017 Preview Show

INTEGRATE 2017 Highlight Show

Over the course of the next few weeks, until further communication, Kent will be moving his schedule to summer hours. Therefore, Middleware Friday episodes will be published every alternate week.

Author: Mohan Nagaraj

Mohan is a Senior Technical Writer responsible for product documentation and all other content-related activities. He combines his passion for business, technology, and writing to spread the word about BizTalk360. Mohan works with cross-functional teams to visualize and create product documentation and marketing content. He feels writing is so much fun, and it is satisfying to capture the company’s soul & passion and make it live through documentation.

Introducing cloud-native integration (and why you should care!)

I’ve got three kids now. Trying to get anywhere on time involves heroics. My son is almost ten years old and he’s rarely the problem. The bottleneck is elsewhere. It doesn’t matter how much faster my son gets himself ready, it won’t improve my family’s overall speed at getting out the door. The Theory of Constraints says that you improve the throughput of your process by finding and managing the bottleneck, or constraint. Optimizing areas outside the constraint (e.g. my son getting ready even faster) don’t make much of a difference. Does this relate to software, and application integration specifically? You betcha.

Software delivery goes through a pipeline. Getting from “idea” to “production” requires a series of steps. And then you repeat it over and over for each software update. How fast you get through that process dictates how responsive you can be to customers and business changes. Your development team may operate LIKE A MACHINE and crank out epic amounts of code. But if your dedicated ops team takes forever to deploy it, then it just doesn’t matter how fast your devs are. Inventory builds up, value is lost. My assertion is that the app integration stage of the pipeline is becoming a bottleneck. And without making changes to how you do integration, your cloud-native efforts are going to waste.

What’s “cloud native” all about? At this month’s Integrate conference, I had the pleasure of talking about it. Cloud-native refers to how software is delivered, not where. Cloud-native systems are built for scale, built for continuous change, and built to tolerate failure. Traditional enterprises can become cloud natives, but only if they make serious adjustments to how they deliver software.

Even if you’ve adjusted how you deliver code, I’d suspect that your data, security, and integration practices haven’t caught up. In my talk, I explained six characteristics of a cloud-native integration environment, and mixed in a few demos (highlighted below) to prove my points.

#1 – Cloud-native integration is more composable

By composable, I mean capable of assembling components into something greater. Contrast this to classic integration solutions where all the logic gets embedded into a single artifact. Think ETL workflows where every step of the process is in one deployable piece. Need to change one component? Redeploy the whole thing. Does one step require a ton of CPU processing? Find a monster box to host the process in.

A cloud-native integration gets built by assembling independent components. Upgrade and scale each piece independently. To demonstrate this, I built a series of Microsoft Logic Apps. Two of them take in data: the first takes in a batch file from Microsoft OneDrive, the other takes in real-time HTTP requests. Both drop the results to a queue for later processing.

2017.07.10-integrate-01

The “main” Logic App takes in order entries, enriches the order via a REST service I have running in Azure App Service, calls an Azure Function to assign a fraud score, and finally dumps the results to a queue for others to grab.

2017.07.10-integrate-02

My REST API sitting in Azure App Service is connected to a GitHub repo. This means that I should be able to upgrade that individual service without touching the data pipeline sitting in Logic Apps. So that’s what I did. I sent in a steady stream of requests, modified my API code, pushed the change to GitHub, and within a few seconds, the Logic App was emitting a slightly different payload.

2017.07.10-integrate-03

#2 – Cloud-native integration is more “always on”

One of the best things about early cloud platforms being less than 100% reliable was that it forced us to build for failure. Instead of assuming the infrastructure was magically infallible, we built systems that ASSUMED failure, and architected accordingly.

For integration solutions, have we really done the same? Can we tolerate hardware failure, perform software upgrades, or absorb downstream dependency hiccups without stumbling? A cloud-native integration solution can handle a steady load of traffic while staying online under all circumstances.

#3 – Cloud-native integration is built for scale

Elasticity is a key attribute of cloud. Don’t build out infrastructure for peak usage; build for easy scale when demand dictates. I haven’t seen too many ESB or ETL solutions that transparently scale, on-demand with no special considerations. No, in most cases, scaling is a carefully designed part of an integration platform’s lifecycle. It shouldn’t be.

If you want cloud-native integration, you’ll look to solutions that support rapid scale (in, or out), and let you scale individual pieces. Is event ingestion unexpectedly overwhelming? Scale that, and that alone. You’ll also want to avoid too much shared capacity, as that creates unexpected coupling and makes scaling the environment more difficult.

#4 – Cloud-native integration is more self-service

The future is clear: there will be more “citizen integrators” who don’t need specialized training to connect stuff. IFTTT is popular, as are a whole new set of iPaaS products that make it simple to connect apps. Sure, they aren’t crazy sophisticated integrations; there will always be a need for specialists there. But integration matters more than ever, and we need to democratize the ability to connect our stuff.

One example I gave here was Pivotal Cloud Cache and Pivotal GemFire. Pivotal GemFire is an industry-leading in-memory data grid. Awesome tech, but not trivial to properly set up and use. So, Pivotal created an opinionated slice of GemFire with a subset of features, but an easier on-ramp. Pivotal Cloud Cache supports specific use cases, and an easy self-service provisioning experience. My challenge to the Integrate conference audience? Why couldn’t we create a simple facade for something powerful, but intimidating, like Microsoft BizTalk Server? What if you wanted a self-service way to let devs create simple integrations? I decided to use the brand new Management REST API from BizTalk Server 2016 Feature Pack 1 to build one.

I used the incomparable Spring Boot to build a Java app that consumed those REST APIs. This app makes it simple to create a “pipe” that uses BizTalk’s durable bus to link endpoints.

2017.07.10-integrate-05

I built a bunch of Java classes to represent BizTalk objects, and then created the required API payloads.

2017.07.10-integrate-04

The result? Devs can create a new pipe that takes in data via HTTP and drops the result to two file locations.

2017.07.10-integrate-06

When I click the button above, I use those REST APIs to create a new in-process HTTP receive location, two send ports, and the appropriate subscriptions.

2017.07.10-integrate-07

Fun stuff. This seems like one way you could unlock new value in your ESB, while giving it a more cloud-native UX.

#5 – Cloud-native integration supports more endpoints

There’s no turning back. Your hippest integration offered to enterprise devs CANNOT be SharePoint. Nope. Your teams want to creatively connect to Slack, PagerDuty, Salesforce, Workday, Jira, and yes, enterprisey things like SQL Server and IBM DB2.

These endpoints may be punishing your integration platform with a constant data stream, or, process data irregularly, in bulk. Doing newish patterns like Event Sourcing? Your apps will talk to an integration platform that offers a distributed commit log. Are you ready? Be ready for new endpoints, with new data streams, consumed via new patterns.

#6 – Cloud-native integration demands complete automation

Are you lovingly creating hand-crafted production servers? Stop that. And devs should have complete replicas of production environments, on their desktop. That means packaging and automating the integration bus too. Cloud-natives love automation!

Testing and deploying integration apps must be automated. Without automated tests, you’ll never achieve continuous delivery of your whole system. Additionally, if you have to log into one of your integration servers, you’re doing it wrong. All management (e.g. monitoring, deployments, upgrades) should be done via remote tools and scripts. Think fleets of servers, not long-lived named instances.

To demonstrate this concept, I discussed automating the lifecycle of your integration dependency. Specifically, through the use of a service broker. Initially part of Cloud Foundry, the service broker API has caught on elsewhere. A broad set of companies are now rallying around a single API for advertising services, provisioning, de-provisioning, and more. Microsoft built a Cloud Foundry service broker, and it handles lots of good things. It handles lifecycle and credential sharing for services like Azure SQL Database, Azure Service Bus, Azure CosmosDB, and more. I installed this broker into my Pivotal Web Services account, and it advertised available services.

2017.07.10-integrate-08

Simply by typing in cf create-service azure-servicebus standard integratesb -c service-bus-config.json I kicked off a fast, automated process to generate an Azure Resource Group and create a Service Bus namespace.

2017.07.10-integrate-09

Then, my app automatically gets access to environment variables that hold the credentials. No more embedding creds in code or config, no need to go to the Azure Portal. This makes integration easy, developer-friendly, and repeatable.

2017.07.10-integrate-10

Summary

It’s such an exciting time to be a software developer. We’re solving new problems in new ways, and making life better for so many. The last thing we want is to be held back by a bottleneck in our process. Don’t let integration slow down your ambitions. The technology is there to help you build integration platforms that are more scalable, resilient, and friendly-to-change. Go for it!




Categories: BizTalk, Cloud, Cloud Foundry, DevOps, General Architecture, Messaging, Microservices, Microsoft Azure, Node.js, Pivotal, Spring, Windows Azure Service Bus

Microsoft Integration Weekly Update: July 10

Do you find it difficult to keep up to date on all the frequent updates and announcements in the Microsoft Integration platform?

Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

Feedback

Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


Building sentiment analysis solution with Logic Apps

Integrate 2017, a well-organized Microsoft Integration focused event, took place from 26 to 28 June at Kings Place in London. It attracted 380-plus attendees from 50 different countries and had 28 speakers from around the globe, including the Microsoft Product Group. I did a session around Logic Apps from the consumer, end-user, and business perspective and used sentiment analysis for my demo.

Context

To provide you some context: the Logic Apps service was the most prominent technology during the three-day event. This Azure service became generally available a year ago and is starting to build momentum as a premier cloud integration capability. Most of all, the service fits rather well in the complete Azure platform with its connectors to a wide variety of other Azure services and, in addition, it connects with SaaS solutions such as Twitter, Zendesk, Salesforce, ServiceNow, PagerDuty, and Slack.

During Integrate 2017 I talked about empowering the business with Logic Apps, and my goal was to show the audience the value of Logic Apps for the business. The service is a true iPaaS service according to the definition Wikipedia provides online. And it is a part of Azure, which is multi-tenant, has a subscription model (or, in the case of Logic Apps, consumption-based pricing with micro-billing), provides pre-built, readily available connectors, and offers deployment, management and monitoring through the platform.

iPaaS

If you look at how, for instance, Gartner describes iPaaS, then again Logic Apps is a true cloud-native integration platform. Consumers of Logic Apps in Azure can implement data, application, API and process integration projects spanning cloud-resident and on-premises endpoints. I will quote the Gartner report here:

“This is achieved by developing, deploying, executing, managing and monitoring “integration flows” (aka “integration interfaces”) — that is, integration applications bridging between multiple endpoints so that they can work together.”

And the iPaaS capabilities typically include according to Gartner:

• Communication protocol connectors (FTP, HTTP, AMQP, MQTT, Kafka, AS1/2/3/4, etc.)
• Application connectors/adapters for SaaS and on-premises packaged applications
• Several data formats (XML, JSON, ASN.1, etc.)
• Data standards (EDIFACT, HL7, SWIFT, etc.)
• Mapping and transformation of data
• Quality of data
• Routing and Orchestration
• Integration flow development and lifecycle management tools
• Integration flow operational monitoring and management
• Full lifecycle API management

Looking at the above capabilities, Logic Apps in combination with an Integration Account and API Management provides those capabilities.

Gartner Quadrant

Logic Apps is positioned in the Gartner Magic Quadrant in the Visionaries box, which means that the vendor of the service has a lower ability to execute than the leaders (in the Quadrant, vendors like Dell Boomi and Informatica), a smaller install base, a certain immaturity, timid marketing, a reactive sales operation and a lack of strategic commitment to the market.

My take on that is that Logic Apps is relatively new in the iPaaS market.

  • A year ago it became generally available, and it is maturing at a fast pace, with new feature releases every two weeks and an expanding set of connectors.
  • There was sales representation from Microsoft at Integrate 2017.
  • And finally, the commitment is strong, with the Pro Integration Product Group present at various conferences throughout 2017. This year they have attended or will attend Ignite, Build, Integrate 2017 Europe, Inspire (former WPC), Integrate 2017 US, Integration Bootcamp, Global Integration Bootcamp, Global Azure Bootcamp, and smaller user group meetings worldwide.

Hence I struggle a bit with the classification of the current state of Logic Apps. I strongly feel the service is close to the border between Visionary and Leader. It has the promise to become a true iPaaS leader.

Benefits

Business can reap the benefits from this service as the attention goes towards solving the problem(s) it is facing. Logic Apps is a part of a large platform, and it can deliver solutions fast as there’s no need for procuring servers or other infrastructure-related capabilities. This counts for businesses that have transformed to the cloud and require cloud-native solutions; that’s where Logic Apps is fit for purpose. And the costs are lower and the time to market of your solutions is fast.

Use Cases

The connectors provided by Logic Apps can help you build solutions for various enterprise scenarios. For instance, you can leverage Cognitive Services to identify a person and subsequently grant him access to resources, start an onboarding process, or provide access to a facility. An example of leveraging Cognitive Services is to perform text analysis on tweets, which I will explain in further detail later in this post.

The text analysis can be useful to detect sentiment in a tweet, particularly on a #hashtag, for instance for a person like Trump, or for a product or service. I mention President Trump here as the current US President uses this social media service quite extensively, and the tweets he produces are evaluated intensively for stock trading.

Dynamics 365

Other conceivable use cases revolve around the Dynamics 365 CRM Online connector. This connector provides connectivity to Dynamics CRM, which offers features like customer service automation, marketing campaigns, and social engagement.

Dynamics 365 has several capabilities or flavors; one is Dynamics 365 for Field Service, which provides a complete field service management solution, including service locations, customer assets, preventative maintenance, work order management, resource management, product inventory, scheduling and dispatch, mobility, collaboration, customer billing, and analytics. During Integrate 2017 I talked about leveraging this solution in combination with IoT devices. The picture below shows the data flow from a device to the Dynamics Field Service features.

Building sentiment analysis solution with Azure Logic Apps

Data from a device can be consumed by the IoT Hub service in Azure and pushed to a Service Bus queue, which can be read by a Logic App. The Logic App forwards the data into Dynamics Field Service through the CRM connector. In conclusion, a Logic App, or a number of them, can be part of an end-to-end solution for various field services.

The previous paragraph discussed one of the many possible use cases involving Logic Apps. There are many other conceivable scenarios, since Logic Apps are part of a bigger platform, which means you can combine them with other Azure services or create flows to move data around. With sentiment analysis, you can detect the sentiment of a text using one of the Cognitive Services APIs. The sentiment analysis API returns a numeric score between 0 and 1 for a given text: scores close to 1 indicate positive sentiment, scores close to 0 indicate negative sentiment, and a score of 0.5 is neutral. With Logic Apps, you can receive tweets at a certain interval (recurrence), based on a filter such as a hashtag, and feed the tweet body into the Detect Sentiment action.
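
The Detect Sentiment action in a Logic App essentially wraps a call to the Text Analytics sentiment endpoint. Purely as a reference, here is a minimal sketch of that REST call in C#; the subscription key, the region, and the sample tweet text are placeholders, and the request/response shape reflects the v2.0 preview API as I understand it at the time of writing.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SentimentSketch
{
    // Placeholders: use your own key; the Text Analytics API only runs in West US at the time of writing.
    const string SubscriptionKey = "<your-text-analytics-key>";
    const string Endpoint = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";

    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);

            // One document per tweet; in the Logic App this text comes from the "Tweet text" token.
            var body = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\",\"text\":\"Great sessions at #integrate2017!\"}]}";
            var response = await client.PostAsync(Endpoint, new StringContent(body, Encoding.UTF8, "application/json"));

            // The response holds a score between 0 (negative) and 1 (positive) per document.
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}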

Sentiment Analysis Solution

To build a solution that leverages the capabilities Cognitive Services delivers, together with a Logic App, an Azure Storage account, an Azure Function, and Power BI, you first need to set these services up.

Cognitive Service

The setup of the first is basically the provisioning of a Cognitive Services instance, i.e. an API. In the Azure Portal, you find Cognitive Services in the marketplace. You then click on the service, specify a name, choose a subscription, and select which API you would like to use.

Building sentiment analysis solution with Azure Logic Apps

To detect sentiment in a text you need to choose the Text Analytics API, which at the time of writing is still in preview. The Text Analytics API is only available in the West US region, and the pricing of the service varies depending on the tier you require. Below you can see the different pricing options.

Building sentiment analysis solution with Azure Logic Apps

As you can see in the picture above, the Text Analytics API provides four features:

• Sentiment Analysis
• Key Phrase Extraction
• Topic Detection
• Language Detection

Once you have chosen the required tier you can create this service.

Power BI

The next service is Power BI, which is part of the Office 365 offering and can be found here: https://powerbi.microsoft.com/. You can sign in and start building datasets, dashboards, and reports. For a solution that visualizes sentiment you can create a streaming dataset. Go to powerbi.com, select “Streaming datasets”, create a dataset of type API, click Next, name the dataset, and add fields to the streaming dataset as shown below.

Building sentiment analysis solution with Azure Logic Apps

The Solution

In the solution I built, I created four text fields and one number field. Historic data analysis was enabled to build up a collection of the data that can later be used for a report.
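
Independent of the Logic App connector, a streaming dataset of type API also exposes a push URL that you can POST rows to directly. As a reference for the data shape, here is a minimal C# sketch of pushing one row; the push URL is a placeholder you copy from the dataset's API info page, and the field names (user, tweet, keyPhrases, sentiment, score) are hypothetical names I chose to match the four text fields and one number field, so adjust them to your own dataset.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PowerBiPushSketch
{
    // Placeholder: copy the real push URL from the streaming dataset's API info in powerbi.com.
    const string PushUrl = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>";

    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // One row matching the assumed schema: four text fields and one number field.
            var row = "[{\"user\":\"@someone\",\"tweet\":\"Great sessions at #integrate2017!\"," +
                      "\"keyPhrases\":\"great sessions\",\"sentiment\":\"Good\",\"score\":0.83}]";

            var response = await client.PostAsync(PushUrl, new StringContent(row, Encoding.UTF8, "application/json"));
            Console.WriteLine(response.StatusCode);
        }
    }
}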

Now that both the Cognitive Service and Power BI have been set up, the next step is to create a storage account in Azure. This account will archive tweets in a blob storage container named tweets. Provisioning a storage account is an easy and straightforward process: in the marketplace find Storage account, select it, and specify the name, deployment model, purpose (choose blob storage), replication, access tier (cool), secure transfer, subscription, resource group, and location.
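
The Logic App itself will write to this container through the Azure Blob Storage connector, but for reference the snippet below is a rough code equivalent of archiving a single tweet, sketched with the WindowsAzure.Storage SDK; the connection string, tweet id, and JSON payload are placeholders.

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;        // NuGet package: WindowsAzure.Storage
using Microsoft.WindowsAzure.Storage.Blob;

class TweetArchiver
{
    public static async Task ArchiveTweetAsync(string connectionString, string tweetId, string tweetJson)
    {
        // Connect to the storage account and the blob container named "tweets".
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("tweets");
        await container.CreateIfNotExistsAsync();

        // One blob per tweet, named after the (placeholder) tweet id.
        var blob = container.GetBlockBlobReference($"{tweetId}.json");
        await blob.UploadTextAsync(tweetJson);
    }
}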

The final service required for the solution is a Function. The Function in our solution will be provided with the score from the Cognitive Services API response as its input. Azure Functions provide a serverless coding capability in the browser, and the pieces of code you write run in Azure, i.e. within a Function App.

For our solution, we add a GenericWebHook-CSharp function and rename it to AnalyseSentimentScore. In the Develop tab, we see some generic default code, which we change to the code shown below.

Building sentiment analysis solution with Azure Logic Apps
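
The screenshot shows the actual function; since only the image is available here, the following is a minimal sketch of what such a score-evaluating webhook could look like on the Functions GenericWebHook-CSharp template. The input/output shape and the threshold values are my assumptions, chosen so that the result maps to the Excellent/Good/Moderate/Bad labels used in the report further on.

#r "Newtonsoft.Json"

using System;
using System.Net;
using Newtonsoft.Json;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    // Expect a JSON body like { "score": 0.83 } sent by the Logic App.
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);

    if (data == null || data.score == null)
    {
        return req.CreateResponse(HttpStatusCode.BadRequest, new { error = "Please pass a score in the request body" });
    }

    double score = (double)data.score;
    log.Info($"Evaluating sentiment score {score}");

    // Assumed thresholds mapping the 0..1 score to a label.
    string sentiment;
    if (score >= 0.75) sentiment = "Excellent";
    else if (score >= 0.55) sentiment = "Good";
    else if (score >= 0.45) sentiment = "Moderate";
    else sentiment = "Bad";

    return req.CreateResponse(HttpStatusCode.OK, new { sentiment = sentiment });
}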

Architecture

The solution architecture I built looks like the diagram below and resembles a process manager pattern.

Building sentiment analysis solution with Azure Logic Apps

This pattern implies that a trigger message is sent to a process manager (the Logic App). The process manager is a central processing unit that determines the next steps based on intermediate results. A tweet is the trigger message that starts a flow in the Logic App. The body is sent to the Cognitive Service (Proc A) and the score is sent to a Function (Proc B), which evaluates the score. The tweet is stored in blob storage and a few fields are sent to Power BI to fill the dataset. A diagram of a process manager is depicted below, followed by a small code sketch of the idea.

Building sentiment analysis solution with Azure Logic Apps
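
To make the pattern a bit more tangible, here is a purely illustrative C# sketch of the process manager idea; the delegates are stand-ins for the real services, and in the actual solution the Logic App plays the role of ProcessTweetAsync.

using System;
using System.Threading.Tasks;

class ProcessManagerSketch
{
    // Stand-ins for the processing units; none of these are real service clients.
    public Func<string, Task<double>> DetectSentiment;    // Proc A: Cognitive Service
    public Func<double, Task<string>> EvaluateScore;      // Proc B: the Azure Function
    public Func<string, Task> ArchiveTweet;               // Blob storage
    public Func<string, string, double, Task> PushRow;    // Power BI dataset

    // The central processing unit: it decides the next step based on intermediate results.
    public async Task ProcessTweetAsync(string tweetText)
    {
        double score = await DetectSentiment(tweetText);
        string label = await EvaluateScore(score);
        await ArchiveTweet(tweetText);
        await PushRow(tweetText, label, score);
    }
}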

Implementation

The implementation of the solution differs slightly from the pattern: after the second intermediate step the tweet data is sent to both Azure Blob Storage and the Power BI dataset.

The Logic App is implemented with a Twitter trigger, authorized to use my Twitter account, with the search text #integrate2017 and an interval (frequency) of 5 minutes, i.e. every 5 minutes tweets with #integrate2017 will be picked up. This trigger is followed by several actions.

Building sentiment analysis solution with Azure Logic Apps

The picture above shows the flow of the Logic App. First the Twitter trigger fires, then a Compose action creates an element containing the username of the tweet. The Detect Sentiment and Detect Key Phrases actions follow. A second Compose action then creates a JSON array of the key phrases, and after this second Compose the score of the detected sentiment is sent to the Function, which returns a string (text) with the evaluation of the score (see also the Function above). Several tokenized elements are sent to blob storage (see the picture below).

Building sentiment analysis solution with Azure Logic Apps

The final step of this solution (the Logic App definition) is sending some of the tokenized elements as a dataset row to the Power BI dataset.

Building sentiment analysis solution with Azure Logic Apps

Now we have walked through the complete Logic App definition and the key actions of the solution.

Integrate 2017 Report

For Integrate 2017 I ran the Logic App from the 17th of June until the 1st of July, while the event itself took place from the 26th until the 28th of June in London. Every 5 minutes the Logic App collected tweets from Twitter with the hashtag #integrate2017. Over this period of 15 days, 3500 tweets were aggregated around the event. It started slowly with around 50 tweets, until the event began and produced a burst of tweets. Below you can see a report created in Power BI with some visualizations of the sentiment measured in the tweets.

Building sentiment analysis solution with Azure Logic Apps

For around 2/3 of all the tweets the sentiment was excellent/good, which can be viewed as positive. 1/3 of the tweets were evaluated as moderate, meaning the Cognitive Services text analytics capability was unable to determine whether they were negative or positive. And finally, a very small percentage was negative (bad). Hence you can conclude from the sentiment scores that it was a great event.

The benefit of building a solution like the one described above is that with a relatively simple Logic App, sentiment can be analyzed by leveraging several capabilities provided by Cognitive Services. When a business wants to measure sentiment through a social media channel, it can use Logic Apps, which in this manner provide quick insights at low cost. No servers are necessary, and a pro-integration professional can build this type of solution within a few hours, depending on the complexity. Hence it provides a quick time to market.

The costs

The interesting part of this solution is the cost. The breakdown of costs for this solution is:

– Logic App (Consumption)
– Function (Consumption)
– Cognitive Service (Tier)
– Storage Account (Volume)
– Power BI (Pro plan)

The Logic App and the Function are consumption based and billed per execution of an action or function. In general, it can be hard to predict the workload these services need to process, so you need to be aware of this. A good reference with regard to Logic Apps costs is the post by Rene Brauwers, Tips & Tricks: Cost savings using Logic Apps.

For the Logic App in this solution, 3500 tweets were processed, and the Logic App consists of 8 actions (including the trigger). That amounts to roughly 28K action calls, which, based on the pricing (first 250K actions = €0.000675 per action), costs approximately 19 euro, plus less than a euro for the executions of the Function.

Next, the cost for the Cognitive Service depends on the tier. The free tier could be an option; however, if the workload is too high you run into rate-limiting issues. The S1 Standard tier can be sufficient and costs 150 euro a month, yet you can turn it off after your sentiment-measuring campaign, which could be a few days. In this solution, running it for roughly half a month, the cost is 75 euro. Storing less than 4 MB of tweets is negligible. This leaves the cost for Power BI: for the solution I built I used the Pro version, which is around 10 euro per month. Thus, in total, a sentiment analysis solution costs around 100 euro.

Conclusion

Depending on what the requirements are and what the perceived value is, Logic Apps combined with other Azure services and Office 365 (Power BI) can be a good fit for purpose, offering low costs, agility, and a short time to market. Logic Apps is becoming a leader in iPaaS; in the short term it will be able to cross the border from Visionaries to Leaders in the Gartner Magic Quadrant. The product group is cranking out enhancements to the service and new connectors every two weeks, and they have kept this pace since the general availability of the service a year ago. The competition is strong; however, I am confident Logic Apps will be amongst the leaders.

Author: Steef-Jan Wiggers

Steef-Jan Wiggers has over 15 years’ experience as a technical lead developer, application architect and consultant, specializing in custom applications, enterprise application integration (BizTalk), Web services and Windows Azure. Steef-Jan is very active in the BizTalk community as a blogger, Wiki author/editor, forum moderator, writer and public speaker in the Netherlands and Europe. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 5 years.