Synegrate is now a Gold Partner for Microsoft’s Azure Cloud Platform

I am proud to be part of Synegrate, which is now certified as a Gold Partner for Microsoft’s Azure Cloud Platform. Synegrate was already a Managed Microsoft Gold Certified Partner in Application Integration and a Microsoft Silver Certified Partner in the Cloud competency.

Today Synegrate achieved a Microsoft Gold Cloud Platform competency, demonstrating best-in-class ability and competencies on the Azure platform.

The Cloud Platform competency is designed for partners to capitalize on the growing demand for infrastructure and software as a service (SaaS) solutions built on Microsoft Azure.

This is the highest attainable partnership level and is earned after achieving the defined competency requirements.

To earn a Microsoft gold competency, partners must successfully complete exams (resulting in Microsoft Certified Professionals) to prove their level of technology expertise, and then designate these certified professionals uniquely to one Microsoft competency, ensuring a certain level of staffing capacity. They also must submit customer references that demonstrate successful projects, meet a performance (revenue and or consumption/usage) commitment, and pass technology and/or sales assessments.

Achieving the Microsoft Gold Cloud Platform competency showcases Synegrate’s expertise in and commitment to today’s cloud technology market and demonstrates deep knowledge of Microsoft’s Cloud Platform.

Synegrate has proven to be a reliable partner for customers globally. Over the years the company has successfully assisted customers in various Microsoft solution-based endeavors that created value propositions ranging from reduced costs and complexity to increased availability and security. Here are some stories and examples from our loyal customers.

About Synegrate

Synegrate’s core focus is data. We have data coursing through our veins; it is in our DNA. We specialize in the storage, integration, dissemination, visualization, and analysis of data. We create modern data-driven applications, BPM (Business Process Management) processes to orchestrate data, and dashboards for data analysis.

Synegrate is a 100% Microsoft focused company that is fully committed to the Microsoft Azure cloud services and solutions.

We utilize the platforms, products, and tools provided by Microsoft to provide our customers with innovation, analytics, and insight. We’re a front-runner in helping our customers realize their future-state architectures on the Microsoft Azure cloud.

Our head office is in California, with development centers in different regions, allowing us to serve the US from coast to coast.

Connect with Synegrate @ http://www.synegrate.com/

Logic Apps Integration for Small Business

When Logic Apps was first announced at the Integrate summit in Seattle a few years ago, one of my first comments was that I felt it could, in due course, be an integration game changer for small businesses. The reason I said this was that if you look across the vendor estate for integration products, there are two main areas. The first is the traditional integration broker and ESB-type products such as BizTalk, Oracle Fusion, WebSphere, and others. The second is the newer generation of iPaaS offerings such as MuleSoft, Dell Boomi, etc. While the technicalities of how they solve problems have changed and their deployment models are different, they all have one key thing in common: they view integration as a high-value premium capability that customers will pay a lot of money to do well.

This has always ruled them out of the equation for small businesses and meant that, over the years, many SME companies would implement integration solutions with custom code from their small development teams. This was often their only choice, and it made things difficult: as they grew to a certain size, their integration estate would become a mess of custom scripts, components, and other things.

What excites me about Logic Apps is that Microsoft has viewed the cost model in a different way. While it is possible to spend a lot of money on some premium features, it is also possible to create complex integration solutions that have zero up-front cost and a running cost of less than a cup of coffee. Microsoft can put forward this mindset that integration is a commodity, not a premium service, because they have a wide cloud platform, and offering low-cost integration will increase customers’ compute usage across that platform. For vendors other than the big cloud players such as AWS and Google, it is much harder to think of integration in this way because they don’t have the other cloud features to offer. Likewise, AWS and Google, who do have the platform play, don’t have any integration heritage, so this puts Microsoft in a unique position.

Outside of the integration companies, small businesses have looked at products like Zapier and IFTTT for a few years, but these products can only go so far in terms of the complexity of the processes you want to implement.

Microsoft is in a unique position where they have an integration offering with something for the biggest enterprise, right down the scale to something for a one-man band.

In the Microsoft world, if you’re a small company the likelihood is you’re using Office 365. There are some great features available on that platform, and for my own small business I’ve been an Office 365 user for years. One example of how I use it is for my accounts and business finances. While I use it a lot, I do have one legacy solution in place from my pre-Office 365 days: a Microsoft Access database, plus a small console app I wrote to load transactions from my bank’s CSV files into it so I could process them and keep my accounts up to date. I’ve hated this Access solution for years, but it did the job.

I decided that now was a good opportunity to migrate this to Office 365, along with the rest of my accounts and finance info, which will let me get rid of the console app.

The plan for this new interface was to use Logic Apps to pick up the CSV file I download from my bank, load the transactions into an Office 365 SharePoint list, and then copy the file into a SharePoint document library for backup.

At a high level, the architecture looks like the picture below.

While this integration may seem relatively straightforward, there were a few hoops to jump through, so I thought it might be interesting to share the journey.

Issues with Barclays File

First off, the thing to think about is the Barclays file. I will always have to download this manually (it would be nice if I could get them to deliver it to me monthly). The file is a pretty typical CSV file, but there are a couple of things to consider.

First, there is no unique ID for each transaction! I found this very strange. The problem it causes is that each time I download the file, the same transaction may appear in multiple files; a file typically has around three months’ data in it. This means I need to check for transactions which have already been processed.

Second, there is a number field in the file, but it is not populated, so I’ll ignore it for now.

Third, and most awkwardly, it is possible to have two or more rows in the file which are exactly the same but refer to different transactions. This happens if you pay the same place two or more times with the same amount on the same day. I’d have to figure out how to handle this.

Logic App – Load Bank Transactions

I decided to implement the solution with two Logic Apps. The first one collects the file, processes it, and loops over each record, but each record is processed individually by a separate Logic App. I like this separation as it makes the Logic Apps easier to test.

Before I get into the details, the picture below shows what the Logic App looks like.

The most interesting bit of this Logic App is the parsing of the CSV file. With Logic Apps, if you have plenty of money to spend you can get an Enterprise Integration Account, which includes flat-file parsing capability. Unfortunately, as a small business I can’t justify this cost. Instead I took advantage of Azure Functions. In the Logic App I pass the file content to a function; in the function I process each line of the file and create an object model, which is returned as JSON and makes the rest of the processing much easier.

In the function it was easy for me to use some .NET code to apply a bit of logic to the data and to do things like trying to identify the type of transaction.
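To illustrate what the function is doing, here is a rough sketch in Python (the actual function was .NET, and the column names here are illustrative, not the real Barclays export layout):

```python
import csv
import io
import json

def parse_bank_csv(file_content: str) -> str:
    """Parse raw CSV text into a JSON object wrapping a list of
    transactions. Column names are assumptions for illustration."""
    reader = csv.DictReader(io.StringIO(file_content))
    transactions = []
    for row in reader:
        amount = float(row["Amount"])
        transactions.append({
            "date": row["Date"],
            "description": row["Description"],
            "amount": amount,
            # naive transaction-type guess based on the sign
            "type": "credit" if amount >= 0 else "debit",
        })
    return json.dumps({"transactions": transactions})
```

Returning the whole collection as a single JSON object means the Logic App can loop over it with designer-only features, with no further parsing needed.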

The big positive is that, using the Consumption plan for Functions, the cost is again very cheap.

One interesting thing to note is that I had some encoding issues calling the function from the Logic App with the CSV data. I never really worked out the root cause, as I could call it fine from Postman, but I think it’s something about how the Logic App and its Function connector encapsulate the call to the function. Fortunately it was really easy to work around: I could just call the function with the HTTP connector instead!

The only other interesting point is I made the loop sequential just to keep processing simple so I didn’t have to worry about concurrency issues.

Azure Function Parse Bank Data

In the Azure Function I chose to tackle the problem of the duplicate-looking transactions. The main job of the function is to convert the CSV data to JSON, but I did a little extra processing to simplify things. One feature I implemented was a row key field, used to populate the Title field in the SharePoint list. This gives me a unique key to look up any existing records to update.

When calculating the row key, I basically used the text of the entire row, which in most cases was unique. As I processed records, I checked whether a transaction already had that key. If it did, I added a counter to the end of it; so if there were three rows with the same row key, I’d append -2 and -3 to the second and third instances to make them unique.

This isn’t the nicest solution in the world but it does work and gets us past the limitation from the Barclays data.

Response Object

Below is a picture of the response object returned from the function, so you can see it’s just an object wrapping a list of transactions.

Logic App – Load Single Transaction

Once I have my data in a nice JSON format, the parent Logic App would loop the records and call the child Logic App for each record. Below is a picture of the child Logic App.

In this next section I am going to talk about how I implemented the solution. Please note that while this works, I had a chat with Jeff Holland afterwards and he advised me on some optimisations, which I am going to implement to make this work a little nicer. I will blog about those in a separate post; this one is based on me working through how to get it working with designer-only features.

What’s interesting about this Logic App is the hoops required for the upsert-style functionality. First off, I needed to use the SharePoint Get Items action with the query expression “Title eq ‘#Row Key#’”. This takes the row key parameter passed in and gets any matches. I also set the maximum number of records returned to 1, as there would only ever be one match.

I also initialized a variable into which I counted the number of records in the array that was returned. In my condition I could then check the record count to see whether there was a match from the SharePoint query, which directs me to either insert or update.

From here the Insert Item action was very straightforward, but the Update Item was a little more fiddly. Because Get Items returns an array, I needed to put the Update inside a loop, even though I know there is only one row. In the upsert I could use either fields from the input object or from the queried item, depending on whether I wanted to change values or not.
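The flow of the child Logic App can be sketched as plain logic like this (a hedged Python sketch; get_items, insert_item, and update_item are hypothetical stand-ins for the SharePoint connector actions, not real APIs):

```python
def upsert_transaction(get_items, insert_item, update_item, txn):
    """Insert or update a SharePoint list item keyed on Title.

    The three callables stand in for the SharePoint connector's
    Get Items, Insert Item, and Update Item actions.
    """
    # Get Items with filter "Title eq '<row key>'", top 1
    matches = get_items(filter=f"Title eq '{txn['rowKey']}'", top=1)
    if len(matches) == 0:
        insert_item(txn)           # no match found: Insert Item
        return "inserted"
    for item in matches:           # loop, even though only one row
        update_item(item["id"], txn)
    return "updated"
```

This is exactly the branch-on-count pattern described above, with the loop around Update Item mirroring the designer’s requirement to iterate the Get Items array.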

At this point I had a working solution, which took me about 2-3 hours to implement and test, and I have now migrated my bank transaction processing into Office 365. The up-front Azure cost for the solution was zero, and the running cost is about 50 pence each time I process a file.

Summary

As you can see, this integration solution is viable for a small business. The main expense is the manpower to develop the solution. I now have a nicely sandboxed integration solution in a resource group in my company’s Azure subscription. It’s easy to run and monitor/manage. The great thing is that these kinds of solutions can grow with my business.

If you think about it, this could be a real game changer for some small businesses. In B2B, how smoothly and well integrated two businesses can be is often the differentiator between success and failure. Typically only big organisations have been able to automate these B2B processes, but now, with Azure integration, a very small business can implement an integration solution in the cloud which could massively disrupt the status quo of how B2B integration works in its sector. When you consider that those smaller businesses also don’t have the slow-moving processes and people/politics of big business, this must create many opportunities for forward-thinking SME organisations.

INTEGRATE 2017 – BizTalk360 Partner & Product Specialist of the Year Awards

During the course of INTEGRATE 2017, BizTalk360 Founder/CTO Saravana Kumar presented the Partner of the Year (2016) and Product Specialist awards to companies and technical leaders who have showcased and demonstrated expertise with the BizTalk360 product over the last year. We started this tradition at the BizTalk Summit 2015 and INTEGRATE 2016 events, and the trend continues.

BizTalk360 Partner of the Year 2016 Awards

The Partner of the Year 2016 award was bagged by Codit from the Netherlands, Solidsoft Reply from the UK, and Evry from Sweden.

INTEGRATE 2017 Awards - Partner

Product Specialist Of The Year 2016 Awards

BizTalk360 recognized the efforts of people who have a proven history of implementing the product over the past year. The program recognizes these exceptional contributions by giving product specialists early access to products and a forum for providing feedback. This year, we are extremely happy to present this award to 15 people:

  • Bart Scheurweghs
  • David Grospelier
  • Daniel Toomey
  • Daniel Wilen
  • Maarit Laine
  • Eldert Grootenboer
  • Eva De Jong
  • Joakim Wadskog
  • Jordy Maes
  • Kent Weare
  • Kien Pham
  • Maxime Delwaide
  • Milen Koychev
  • Nicolas Blatter
  • Steef-Jan Wiggers

We would like to thank all our Partners and Product Specialists for their efforts towards improving the reach of BizTalk360 to customers.

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

When to use Logic Apps vs BizTalk Server

Nowadays, this is a very common and valid question in the BizTalk community, both for existing BizTalk customers and for new ones.

Here is what Tord answered in the open Q&A with the product group at the 100th episode of Integration Monday. Check at around 30:30 in the video.

If your solution needs to communicate with SaaS applications, Azure workloads, and cloud business partners (B2B), all in the cloud, then you should use Azure Logic Apps. But if you are doing a lot of integration with on-premises processing, communicating with on-premises LOB applications, then BizTalk is a good option. You can use both if you are doing hybrid integration.

So basically, it varies from scenario to scenario, based on your needs and the architecture of your solution.

Many enterprises now use a multitude of cloud-based SaaS services, and being able to integrate these services and resources can become complex. This is where the native capability of Logic Apps can help by providing connectors for most enterprise and social services and to orchestrate the business process flows graphically.

If your resources are all based in the cloud, then Logic Apps is a definite candidate to use as an integration engine.

Natively, Logic Apps provides the following key features:

Rapid development: Using the visual designer with drag and drop connectors, you design your workflows without any coding using a top-down design flow. To get started, Microsoft has many templates available in the marketplace that can be used as is, or modified to suit your requirements. There are templates available for Enterprise SaaS services, common integration patterns, Message routing, DevOps, and social media services.

Auditing: Logic Apps have built-in auditing of all management operations, including the date and time a workflow process was triggered and the duration of the run. Use the trigger history of a Logic App to determine the activity status:

  • Skipped: Nothing new was found to initiate the process
  • Succeeded: The workflow process was initiated in response to data being available
  • Failed: An error occurred due to misconfiguration of the connector

A run history is also available for every trigger event. From this information, you can determine if the workflow process succeeded, failed, cancelled, or is still running.

Role-based access control (RBAC): Using RBAC in the Azure portal, specific components of the workflow can be locked down to specific users. Custom RBAC roles are also possible if none of the built-in roles fulfills your requirements.

Microsoft managed connectors: There are several connectors available from the Azure Marketplace for both enterprise and social services, and the list is continuously growing. The development community also contributes to this growing list of available connectors as well.

Serverless scaling: Automatic and built in on any tier.

Resiliency: Logic Apps are built on top of Azure’s infrastructure, which provides a high degree of resiliency and disaster recovery.

Security: Logic Apps support OAuth2, Azure Active Directory, certificate auth, Basic auth, and IP restrictions.

There are also some concerns when working with Logic Apps, shared by the Microsoft IT team at INTEGRATE 2017.

You can also refer to the book Robust Cloud Integration with Azure to understand and get started with integration in the cloud.

 

When you have resources scattered across the cloud and on premises, you may want to consider BizTalk as a choice for this type of hybrid integration, along with Logic Apps.

BizTalk Server 2016 includes an adapter for Logic Apps. This Logic App adapter can be used to integrate Logic Apps with BizTalk sitting on premises. Using the BizTalk 2016 Logic App adapter, on-premises resources can talk directly to a multitude of SaaS platforms available in the cloud.

The days of building monolithic applications are slowly diminishing as more enterprises see the value of consuming SaaS as an alternative to investing large amounts of capex in Commercial Off-The-Shelf (COTS) applications. This is where Logic Apps can play a large part, by integrating multiple SaaS solutions together to form a complete solution.

BizTalk Server has been around since 2000, and there have been several new product releases since then. It is a very mature platform with excellent enterprise integration capabilities.

Below is a short comparison matrix between BizTalk and Logic Apps:

Conclusion

The Microsoft integration platform has options for every kind of customer integration need.

INTEGRATE 2017 – Recap of Day 3

After a scintillating Day 1 and Day 2 at INTEGRATE 2017, the stage was perfectly set for the last day (Day 3) of the event. Before you proceed further, we recommend you read the following links –

Quick Links

Session 1 – Rethinking Integration by Nino Crudele

Day 3 at INTEGRATE 2017 started off with the “Brad Pitt of the Integration Community” – Nino Crudele. It was a perfect start to the last day of this premier integration focused conference.

Nino started off his session by thanking his mentor, a fellow MVP, for instilling knowledge about Power BI. This session was based on real experience. Nino shared how he treats the job as his passion, with three different types of jobs: Bizzy (BizTalk), DEFCON1, and the Chicken Way. In this context, what Nino refers to as the Chicken Way is the route by which you actually solve the problem: you can take a direct or an indirect approach.

Nino even had some Chicken Way Red Cards to give away to the community and some reactions to that were –

Then Nino presented the most comical slide of the entire #Integrate2017 event – a question / answer from his 12-year old daughter about BizTalk.

The above slide shows how people actually perceive the technology. It is therefore imperative to choose the proper technology to solve a specific problem and make the customer happy. Nino also walked through what he considers the top technology stacks and remarked that “BizTalk is SOLID” – a very solid technology platform.

Then Nino shared a customer experience where the customer was running 15 BizTalk Servers! :O Nino suggested changes to certain approaches in their business processes to achieve real-time performance improvements. The customer was also looking for very fast hybrid integration (point to point) with BizTalk, including real-time monitoring, tracing and so on. Nino proposed a framework built entirely on the cloud. This approach was more reliable and scalable, and gave the customer complete control over the messaging system. The solution made use of Logic Apps, Event Hubs, Service Bus, Blob storage and many more such services, which made the customer happy.

The session moved into a cool demo from Nino (real-time data visualization in Power BI using custom visuals), which you can watch when the videos go live on the INTEGRATE 2017 website.

Session 2 – Moving to Cloud-Native Integration by Richard Seroter

The second session of the day was from Richard Seroter on Moving to Cloud-Native Integration. Richard started off his talk with the analogy of the “theory of constraints”, where a process’s throughput is limited by its constraint (bottleneck). In any software environment, you have to identify the constraint that is slowing you down and optimize it. In an organization, there is a chance that “integration itself is the constraint” slowing down the business.

Therefore, Richard introduced the concept of cloud-native integration to connect different systems.

Integration Today

According to Gartner, application-to-application integration is currently the most critical integration scenario, while a few years down the line cloud service integration will rise to the top. Actual spending on integration platforms is on the rise, with the fastest growth in iPaaS and API Management.

Gartner also predicts that, by 2020, 75% of companies will establish a hybrid integration platform using infrastructure they assemble from different vendors, and that by 2021 at least 50% of large companies will have incorporated citizen-integrator capabilities into their integration infrastructure.

What is Cloud Native?

Cloud native is essentially about how you build software.

The following image clearly shows the difference between a traditional enterprise and a cloud native enterprise.

Delivering Cloud Native Integration

  • Build a more composable solution that is loosely coupled and makes use of choreographed services
  • Push more logic to the endpoints
  • Offer targeted updates

Richard then jumped into his demos. In the first demo, he used a Logic App as a data pipeline: the Logic App receives a message from a queue, calls a service running in Azure App Service, calls an Azure Function that does some fraud processing, and feeds the result message back to a queue for further processing.

To feed the queue, Richard deployed another Logic App that picks up a file from OneDrive, parses it as a JSON array, and dumps the records onto the queue consumed by the first Logic App.
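As a rough sketch of what such a pipeline looks like in the Logic Apps code view (this is an illustrative minimal definition, not Richard’s actual demo: the Function URL and action names are hypothetical, and an HTTP request trigger is used instead of a Service Bus trigger for brevity):

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
      "manual": {
        "type": "Request",
        "kind": "Http"
      }
    },
    "actions": {
      "Fraud_check": {
        "type": "Http",
        "inputs": {
          "method": "POST",
          "uri": "https://myfunctionapp.azurewebsites.net/api/fraudcheck",
          "body": "@triggerBody()"
        },
        "runAfter": {}
      },
      "Response": {
        "type": "Response",
        "inputs": {
          "statusCode": 200,
          "body": "@body('Fraud_check')"
        },
        "runAfter": {
          "Fraud_check": [ "Succeeded" ]
        }
      }
    },
    "outputs": {}
  }
}
```

The key idea is that each step is a declarative action chained via `runAfter`, so individual steps (such as the fraud-check Function) can be swapped or redeployed independently.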

That’s not all! Richard had a few more demos in store – making BizTalk Server easy, where he used the BizTalk Server 2016 Feature Pack 1 Management APIs to create BizTalk artifacts self-service style, and automating Azure via the Service Broker.

We recommend watching this session when the video is made available in a week’s time on the INTEGRATE 2017 website.

Session 3 – Overcoming Challenges When Taking Your Logic App into Production by Stephen W Thomas

Stephen started off with a key announcement about the readiness of a new Pluralsight course – “BizTalk Server Administration with BizTalk360”. The course will be made available shortly.

Phase 1 of the session was targeted at decision making, phase 2 covered what they did right and wrong, and the last phase offered some important tips.

Decisions

Stephen compared building a custom .NET parser solution to Logic Apps development. Logic Apps was calculated to finish earlier and to be far cheaper. They even questioned whether Integration Accounts are worth the price ($1000 per month).

What’s Wrong and Right?

  • Make design decisions based on the rules of the serverless platform, factoring in the cost per Logic App action
  • Stephen described how he initially used 2 subscriptions in 2 regions, but this made deployment across regions hard. Therefore, the best practice is to have one subscription in one region
  • Solution structure – a solution maps to a resource group; use one project per Logic App; maintain 3 parameter files, one per environment. For deployments you can create a custom VM
  • Serverless is AMAZING, but sometimes things break through no fault of your own, and sometimes Microsoft support needs to be called in to fix issues

Tips

  • Read the available documentation
  • Don’t be afraid of JSON – code view is still needed, especially for new features; most features soon become available in the designer and Visual Studio. Always save or check in before switching to JSON
  • Make sure to fully configure your actions, otherwise you cannot save the Logic App
  • Name your actions carefully – names are hard to change afterwards
  • Try to use only one Microsoft account
  • If you get odd deployment results, close and reopen your browser
  • Connections – live at the resource group level, and the last deployment wins. Best practice: define all connection parameters in one Logic App, with one connection per destination, per resource group
  • Default retries – all actions retry 4 additional times over 20-second intervals; control this using retry policies
  • Resource group artifacts – these contain the subscription ID; use parameters instead
  • For-each loops – limited to 100,000 iterations; they default to concurrent execution, which can be changed to sequential
  • Recurrence triggers are singletons
  • User permissions (IAM) – multiple roles exist, such as Logic App Contributor and Logic App Operator
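Several of these tips map directly onto settings in the Logic App code view. A minimal illustrative fragment (the action names and target URI are hypothetical) showing a per-action retry policy and a for-each loop switched to sequential execution:

```json
{
  "HTTP_call": {
    "type": "Http",
    "inputs": {
      "method": "GET",
      "uri": "https://example.com/api/orders",
      "retryPolicy": {
        "type": "fixed",
        "count": 4,
        "interval": "PT20S"
      }
    },
    "runAfter": {}
  },
  "For_each_order": {
    "type": "Foreach",
    "foreach": "@body('HTTP_call')",
    "actions": {},
    "operationOptions": "Sequential",
    "runAfter": {
      "HTTP_call": [ "Succeeded" ]
    }
  }
}
```

The `retryPolicy` object mirrors the default behaviour described above (fixed interval, 4 retries, 20 seconds), and `"operationOptions": "Sequential"` is the code-view switch that turns off the default concurrent for-each execution.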

With that, it was time for the attendees to take a break!

After the break, Duncan Barker from the BizTalk360 team took the stage to thank the wonderful team at BizTalk360 for all their effort in making INTEGRATE 2017 a great success!

Session 4 – BizTalk Server Deep Dive into Feature Pack 1

Tord was given a warm welcome with the song “Rise” by Katy Perry. Tord complimented the welcome by saying what good friends he and Katy Perry are, and told the story of how she wrote the song for BizTalk. 🙂

Fun aside, Tord started off the session by telling how BizTalk Server 2016 almost got a pink theme for its icons! :O Just hours before the team was to do the final build for the BizTalk Server 2016 Feature Pack 1 release, one of the engineers noticed a pink stroke on the outside of all the icons. The team managed to fix it and ship the release.

But did you know there is still one tiny pink pixel somewhere in one of the icons? If you find it, send Tord an email and he will send you a nice gift!

BizTalk Connector in Logic Apps is now Generally Available with Full Support!!!

The Microsoft IT team has built a first-class tool to help you migrate easily to BizTalk Server 2016. You can download the application from the link below. If migration is what is holding you back, make use of this application.

With BizTalk Server you can do so many things, including taking advantage of the cloud. Tord walked through the different features that were released as part of Feature Pack 1 in detail, with some live demos.

Session 5 – BizTalk Server Fast & Loud

After a power-packed introduction from Daniel Szweda, comparing Sandro Pereira with Cristiano Ronaldo (who also hails from Portugal), guess what happened: SANDRO PEREIRA forgot to turn on his machine to show his presentation! :O The IT admin at Kings Place had to show up 5-6 times to get the “problem” solved, and Sandro blamed it on the famous “jetlag” that was associated with most speakers during any technical issue 😛 🙂 And there was a roar when the presentation finally worked! Phew… there goes the BizTalk Wiki Ninja, BizTalk Mapper Man, The Stencil Guy into his session.

Sandro started off his session with this slide

Sandro’s session was more towards BizTalk Server optimization and performance. The points discussed in this session were –

SQL Server

  • Many clients still don’t have the BizTalk SQL Agent jobs running
  • In car terminology:
    • BizTalk Server is the chassis
    • SQL Server is the engine
    • The hard drives are the tires
    • Memory is the battery
    • The CPU is the fuel injector
    • The network and virtualization layer is the exhaust pipe
  • Make sure the BizTalk Server and SQL Server Agent jobs are configured and running
  • Treat the BizTalk databases as a black box
  • Size really matters in BizTalk! Large databases impact performance (e.g., the MessageBox and Tracking databases)
  • Consider dedicating SQL resources to BizTalk Server
  • Consider splitting TempDB into multiple data files for better performance

Networking

  • Speed defines everything at this layer
  • At a minimum, have 1 logical disk for data files, 1 for transaction log files, and 1 for TempDB data files
  • Remove unnecessary network hops
  • Scaling out is not the solution to every problem – sometimes you may have to scale in to solve a problem!

Session 6 – BizTalk Health Check – What and How?

The last session before lunch was on BizTalk Health Check – What and How? by Saffieldin Ali. A BizTalk health check is similar to the MOT test performed on vehicles in the UK – a compulsory annual roadworthiness and emissions test for motor vehicles.

In BizTalk, the health check is performed to –

  • Identify symptoms and potential problems before they affect the production environment
  • Review critical processes to achieve minimum downtime in a disaster-recovery situation
  • Identify warnings and red flags that may be affecting users
  • Understand common mistakes made by administrators and developers
  • Understand supportability and best practices

BizTalk Health Check Process

Interviewing

  • Operations Interview (1-1 meetings with admins/dev teams to collect operational view of things)
  • Knowledge Transfer

Collecting

  • Run collection tools (BizTalk Health Monitor, etc.)
  • Collect informal information (for example, an admin admitting “I did something wrong last week” during an informal discussion)

Analysis and Reporting

  • Run the analysis tools and examine their results
  • Write up and present the final conclusions

BizTalk Health Check Areas

  1. Platform configuration for BizTalk Server
  2. BizTalk Server Configuration
  3. BizTalk Performance
  4. Resilience (High Availability)
  5. SQL Server Configuration for BizTalk Server
  6. Disaster Recovery
  7. Security
  8. BizTalk Application Management and Monitoring

BizTalk Health Check Key Tools

  1. Microsoft Baseline Security Analyzer (MBSA)
  2. BizTalk Best Practices Analyzer
  3. BizTalk Health Monitor (BHM)
  4. Perf Analysis of Logs (PAL)

Saffieldin showed how each of the above tools works and how they perform checks on the BizTalk environment.

It was time for the community to break out for Lunch and some networking before the close of the event in the next couple of hours.

Session 7 – The Hitchhiker’s Guide to Hybrid Connectivity by Dan Toomey

The last leg of #Integrate2017 was quite significant. All three speakers – Dan Toomey, Wagner Silveira and Martin Abbott – had flown into London after some very long flights: Dan and Martin from Australia (about 20 hours) and Wagner from New Zealand (about 30 hours!).

Post lunch, it was time for Dan Toomey from Australia to take the stage to talk about The Hitchhiker’s Guide to Hybrid Connectivity.

Dan started his talk with the types of Azure Virtual Network connectivity –

  • Point to Site (P2S) – similar to working from home and connecting to the corporate network (via Citrix/VPN) over the internet
  • Site to Site (S2S) – taking an entire network and joining it with another network over the internet
  • ExpressRoute – like taking a giant cable (managed by someone else) and connecting your corporate network to it

VNET Integration for Web/Mobile Apps

  • Requires a Standard or Premium App Service Plan
  • The VNET must be in the same subscription as the App Service Plan
  • Must have Point to Site enabled
  • Must have a Dynamic Routing Gateway

VNET with API Management

If your API Management instance sits in your virtual network with access to your corporate network gateway, you get:

  • Added layer of security
  • All benefits of API Management (caching, policies, protocol translation [SOAP to REST], Analytics, etc)

Non-Network based Operations

Azure Relay (an alternate approach) – this is a new offering within Azure Service Bus:

  • WCF Relay
  • Hybrid Connections (operate at the transport level)

On-Premises Data Gateway

  • Generally available since 4th May 2017
  • Acts as a bridge between Azure PaaS and on-prem resources
  • Works with connectors for Azure Logic Apps, Power Apps, Flow and Power BI

Dan wrapped up by walking through the following business scenarios –

  1. Azure Web/Mobile App to On-Prem
  2. IaaS Server (VM) to On-Prem
  3. SaaS Service to On-Prem
  4. Business to Business
  5. Service Fabric Cluster to On-Prem

To learn more about these scenarios, please watch the video, which will be made available soon.

Session 8 – Unlocking Azure Hybrid Integration with BizTalk Server by Wagner Silveira

In this session, Wagner started off his talk speaking about Why BizTalk + Azure, and what BizTalk brings to Hybrid Integration –

  • On-premises adapters
  • Azure adapters
  • Separation of concerns
  • Availability
  • For existing users
    • Leverage investment into the platform
    • Continuity to developers

Wagner talked in detail about the ways in which you can connect to Azure, along with some scenarios –

  • Service Bus
  • Azure WCF Relay
  • App Services/API Management
  • Logic Apps

Wagner showed an exciting demo involving two Line of Business (LoB) systems, with some tweets finally coming out of Logic Apps.

Session 9 – From Zero to App in 45 minutes (using PowerApps + Flow) by Martin Abbott

There we were – the last session at #Integrate2017. Not a great feeling for the speaker who has to close out an amazing 3 days of learning and experience, but Martin did a great job showing the power of PowerApps and Flow, building an application in 45 minutes using the combination.

Martin started off talking about Business Application Platform innovation, represented in a very nice diagram.

Martin had just 3 slides, and it was an action-packed session with a demo creating an application in under 45 minutes. We recommend watching the video, which will be available shortly on the event website.

Key Announcement – Global Integration Bootcamp 2018

Martin was one of the organizers of the Global Integration Bootcamp held in March 2017. It’s now official that the #GIB event will return on 24th March, 2018. You can follow the website http://www.globalintegrationbootcamp.com/ for further updates.

Sentiment Analysis on #Integrate2017

In the Day 1 Recap blog, we had shown some statistics on the sentiment analysis of tweets for hashtag #Integrate2017. Here is one last look at the report at 00:00 (GMT+0530) on June 29, 2017.

And with that, it was curtains down on what has been a fantastic 3 days at INTEGRATE 2017. But we are not quite done yet! As announced on Day 1 by Saravana Kumar, INTEGRATE 2017 will be back in Redmond (near Seattle), USA on October 25-27, 2017. So if you missed attending the event in London, come and join us in Redmond.

We hope you had a great time at INTEGRATE 2017. Until next time, adios!!!

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

Integrate 2017 – Day 3 Recap

Rethinking Integration – Nino Crudele

Nino Crudele was perfectly introduced as the “Brad Pitt” of integration. We will not comment on his looks, but rather focus on his ability to always bring something fresh and new to the stage!

Nino’s message was that BizTalk Server has the ideal architecture for extensibility across all of its components. Nino described how he put a “Universal Framework” into each component of BizTalk. He did this to be able to improve the latency and throughput of certain BizTalk solutions, when needed and appropriate.

He also shared his view on how not every application is meant to fully exist in BizTalk Server alone. In certain situations BizTalk Server may only act as a proxy to something else. It’s always important to choose the right technology for the job. As an integration expert it is important to keep up with technology and to know its capabilities, allowing for a best of breed solution in which each component fits a specific purpose e.g. Event Hubs, Redis, Service Bus, etc…

Nino did a good job delivering a very entertaining session and every attendee will forever remember “The Chicken Way“.

Moving to Cloud-Native Integration – Richard Seroter

Richard Seroter presented the 2nd session of the day. He shared his views on moving to cloud-native thinking when building integration solutions. He started by comparing the traditional integration approach with the cloud-computing model we all know today. Throughout the session, Richard shared some interesting insights on how we should all consider a change in mindset and shift our solutions towards a cloud-native way of thinking.

“Built for scale, built for continuous change, built to tolerate failure”

Cloud-native solutions should be built “More Composable”. Think loose coupling: building separate blocks that can be chained together in a dynamic fashion. This allows for targeted updates without having to schedule downtime… so “More Always-On”. In a short demo, Richard showed how to build a loosely coupled Logic App that consumed an Azure Function, which would be considered a dependency in the traditional sense. He then deployed a change to the Azure Function – on the fly – to show that this can be accomplished without scheduled downtime. Investing time in the design and architecture of your solution pays off when it results in zero-downtime deployments.

Next, he talked about adding “More Scalability” and “More Self-Service”. The cloud computing model excels in ease of use and makes it possible for citizen developers or ad-hoc integrators to take part in creating these solutions. This eliminates the need for a big team of integration specialists, but rather encourages a shift towards embedding these specialists in cross-functional teams.

In a fantastic demo, he showed us a nice Java app that provides a self-service experience on top of BizTalk Server. Leveraging the power of the new Management API (shipped with Feature Pack 1 for BizTalk Server 2016 Enterprise), he deployed a functioning messaging scenario in just a few clicks, without the need for ANY technical BizTalk knowledge. Richard then continued by stating that we should all embrace the modern resources and connectors provided by the cloud platform. Extend on-premises integration with “More Endpoints” by using, for example, Logic Apps to connect BizTalk to the cloud.

The last part focused on “More Automation”, where he talked not only about automated builds and deployments, but also recommended creating environments via automation to achieve the highest possible level of consistency. In another short demo, Richard showed how he automatically provisioned a Service Bus instance and all related Azure resources from the Cloud Foundry Service Broker CLI.

Be sure to check out the recording of this session! It has some valuable insights for everyone involved in cloud integration!

Overcoming Challenges When Taking Your Logic App into Production – Stephen W Thomas

The third session of the day was presented by Stephen W Thomas, who gave us some insights into the challenges he faced during his first Logic Apps implementation at a customer.

He split his session into three phases, starting with the decisions that had to be made. After a short overview of the EDI scenario he was facing and the options considered for the implementation, it was clear that Logic Apps was the winner for several reasons. The timeline was pretty strict, and custom .NET development would have taken 10 times longer than using Logic Apps. The initial investment for BizTalk, combined with the limited availability of BizTalk development skills, made Logic Apps the logical choice in this case. However, if you already use EDI in BizTalk, it probably makes sense to keep doing so, since your investment is already there.

In the second phase, he reflected on the lessons learned during the project. The architecture had to be designed with the rules of a serverless platform in mind. These include a two-weekly release cadence that could affect current functionality, which makes it important to check the release notes. Another thing to keep in mind is the (sometimes) unpredictable pricing: whereas every action in a Logic App costs money, in BizTalk you can keep adding expression shapes without worrying about additional cost.

In the last phase, he left us with some tips and tricks that he gained through experience with Logic Apps. “Don’t be afraid to use JSON”. Almost every new feature is introduced in code view first, so take advantage of it by learning to work with it. It’s also good to know that a For-Each loop in Logic Apps runs concurrently by default, but luckily this behaviour can be changed to Sequential (in the code view).

BizTalk Server Deep Dive into Feature Pack 1 – Tord Glad Nordahl

Tord had a few announcements to make which were appreciated by the audience:

  • The BizTalk connector for Logic Apps, which was in preview before today, is now generally available (GA).
  • Microsoft IT publicly released the BizTalk Server Migration Tool, which they use internally for their own BizTalk migrations. This tool should help in migrating your environment towards BizTalk Server 2016.

Tord discussed the BizTalk Server 2016 Feature Pack 1 next.

With the new ALM features, it’s possible to deploy BizTalk solutions to multiple environments from any repository supported by Visual Studio Team Services. Just like the BizTalk Deployment Framework (BTDF), it is also possible to have one central binding file with variables being replaced automatically to fit your specific target environment.
 
The Management API included in Feature Pack 1 enables you to do almost anything that is possible in the BizTalk Management Console. You can create your own tools based on the API. For example: end users can be provided with their own view on the BizTalk environment. The API even supports both XML and JSON.
 
Feature Pack 1 also includes a new PowerBI template, which comes with the added Analytics. The template should give you a good indication on the health of your environment(s). The PowerBI template can be changed or extended with everything you can see on the BizTalk Management Console, according to your specific needs.

Tord also mentioned that the BizTalk team is already working on several new things, but he could not announce anything at the moment. We are all very eager to hear what will come in the next Feature Pack!

BizTalk Server Fast & Loud – Sandro Pereira

Fast and loud: a session about BizTalk performance optimizations. The key takeaway is that you need to tune your BizTalk environments, beyond a default installation, if you want to achieve really high throughput and low latency. Sandro pointed out that performance tuning must be done on three levels: SQL Server, BizTalk Server and hardware.

SQL Server is the heart of your BizTalk installation, and performance heavily depends on its health. The most critical aspect is ensuring that the SQL Agent jobs are up and running; they keep your MessageBox healthy and prevent your DTA database from getting flooded. Treat the BizTalk databases as a black box: don’t create your own maintenance plans, as they might jeopardize performance and you’ll end up with unsupported databases. Besides that, he mentioned that you should avoid large databases and that it is always preferable to go with dedicated SQL resources for BizTalk.

Performance tuning on the BizTalk Server level is mostly done by tuning and configuring host instances. You should have a balanced strategy for assigning BizTalk artifacts to the appropriate hosts. A dedicated tracking host is a must-have in every BizTalk environment. Be aware that there are also configuration settings at host (instance) level, of which the polling interval setting provides the quickest performance win to reduce latency.

It’s advised to take a look at all the surrounding hardware and software dependencies. Your network should provide high throughput, the virtualization layer must be optimized and disks should be separated and fast.

These recommendations are documented in the Codit best practices and it’s also part of our BizTalk training offering.

BizTalk Health Check – What and How? – Saffieldin Ali

After all the technical and conceptual sessions, it is good to be reminded that existing BizTalk environments and solutions need to be monitored properly, to keep the BizTalk platform healthy and to proactively maximize both reliability and performance. Identifying threats and issues early lowers, or even avoids, downtime in case of a disaster.
 
Microsoft’s Saffieldin Ali shared his own experience, including various quotes that he collected throughout the years.
 
When visiting and interviewing customers, Ali has a list of red flags that, without even examining the environments, indicate that BizTalk may not be as healthy as you would want it to be. Customers having their own backup procedures, a lack of documentation of the BizTalk environment, or not having the latest updates installed can all be signs of bad configuration. Any of these can cause issues in the future, affect operations and disrupt business.
 
To detect these threats, Ali explained how you can use tools like BizTalk Health Monitor (BHM), Performance Analysis of Logs (PAL) and Microsoft Baseline Security Analyzer (MBSA). He also showed us that BHM has two modes: a monitoring mode, to be used as a basic monitoring tool, and a reporting mode that reports on the health of a BizTalk environment.

Incorporating the use of these tools in your maintenance plan is definitely a best practice every BizTalk user should know about!

The Hitchhiker’s Guide to Hybrid Connectivity – Dan Toomey

In the first session after the afternoon break, Dan Toomey presented the different types of hybrid connectivity that allow us to easily set-up secure connections between systems. 

The network-based options are Azure Virtual Network (VNET) integration for web and mobile apps, and VNET with API Management, which has all the advantages of APIM with an added layer of security. The non-network-based options are WCF Relay, Azure Relay Hybrid Connections and the On-Premises Data Gateway.

The concept of WCF Relay is based on a secured listener endpoint in the cloud, which is opened via an outbound connection from within the corporate network. Clients send messages via the listener endpoint, without the receiving party having to make any changes to the corporate firewall.

WCF Relay, which has the advantage of being the cheapest option, works at the application layer, whereas Hybrid Connections (HC) work at the transport layer. Hybrid Connections rely on port forwarding and work cross-platform. They are set up in Azure (Service Bus) and connect to the HC Manager, which is installed on-premises.

The On-Premises Data Gateway acts as a bridge between Azure PaaS and on premises resources, and works with connectors for Logic Apps, Power Apps, Flow & Power BI.

In the end, Dan went through some scenarios to illustrate which relay is the better fit for specific situations. Being a big fan of Hybrid Connections, he often found them to be the preferred solution.

Dan finally mentioned that he has a Pluralsight training that goes into this topic. Although a bit dated since it also discusses BizTalk Services, the other material is still relevant.

Unlocking Azure Hybrid Integration with BizTalk Server – Wagner Silveira

Why should we use BizTalk Server and Azure together? That is the question Wagner Silveira kicked off his talk with.

He then explained that, in a complex scenario, you may want to use BizTalk Server if there are multiple on-premises systems to call, whereas if there are multiple cloud endpoints to interface with, you might want to base the solution on Azure components. The goal is to avoid creating a slingshot solution with multiple round trips between on-premises and cloud.
Since most organizations still have on-premises systems, they can use BizTalk Server to keep getting value out of their investments, and to continue leveraging the experience their developers and support teams have acquired.

He went on to give an overview of the options available to connect to Azure, discussing Service Bus, Azure WCF Relay, App Services, API Management and Logic Apps.
When discussing Service Bus, for example, he talked about how Service Bus allows full content-based routing and asynchronous messaging. The latter lets you overcome unreliable connectivity, allows throttling into BizTalk Server, and enables multicasting scenarios from BizTalk to multiple subscribers.

Next he spoke about WCF Relay. He covered some of the characteristics of this option: it supports both inbound and outbound communication based on dynamic relay, is optimized for XML, and supports ACS and SAS security. WCF Relay also has REST support, which can be used to expose REST services. You can then use WCF Relay to publish for either inbound or outbound communication. Outbound communication is generally allowed by default, while inbound communication requires network changes. Finally, you can define outbound headers to support custom authentication.

A couple of typical scenarios for inbound WCF Relay that Wagner gave as examples were real-time communication, exposing legacy or bespoke systems, and minimizing the firewall surface area (no “swiss cheese” firewall).
Examples of outbound scenarios are leveraging public APIs and shifting compute to the cloud (for batch jobs, for example), which allows you to minimize the BizTalk infrastructure footprint.

Next up was the Logic Apps adapter for BizTalk Server. Scenarios for using it include extending workflows into Azure (think of connecting BizTalk Server to SalesForce, for example) or exposing on-premises data to Logic Apps.
For flows from Logic Apps into BizTalk, on the other hand, it allows securing internal systems, pre-validating messages, and leveraging on-premises connectors to expose legacy/bespoke systems.

The main takeaway for this session is that you should get to know the tools available, understand the sweet spots and know what to avoid. Not only from a technology and functional point of view, but from a pricing perspective as well.

There are many ways to integrate… Mix, match, and experiment to find the balance!

From Zero to App in 45 minutes (using PowerApps + Flow) – Martin Abbott

It is hard to give an overview of the last session, by Martin Abbott, about PowerApps, since Martin challenged the “demo gods” by making it a 40-minute demo with only 3 slides. A challenging but interesting session, in which Martin created a PowerApps app using some entities in the Common Data Service. He then connected PowerApps to Microsoft Flow and created a custom connector to be consumed as well, demonstrating the power of the tools. As one of the “founding fathers” of the Global Integration Bootcamp, he also announced the date for the next #GIB2018 event: March 24th, 2018.

Thank you for reading our blog post; feel free to comment with your feedback. Keep coming back, since there will be more blog posts to summarize the event and to give you some recommendations on what to watch when the videos are out.

This blogpost was prepared by:

Pieter Vandenheede (BE)
Toon Vanhoutte (BE)
Jonathan Gurevich (NL) 
Carlo Garcia-Mier (UK)
Jef Cools (BE)
Tom Burnip (UK)
Michel Pauwels (BE)
Ricardo Marques (PT)
Paulo Mendonça (PT)
Pim Simons (NL)
Iemen Uyttenhove (BE)
Mariëtte Mak (NL)
Jasper Defesche (NL)
Robert Maes (BE)
Vincent Ter Maat (NL)
Henry Houdmont (BE)
René Bik (NL)
Bart Defoort (BE)
Peter Brouwer (NL)
Iain Quick (UK)

Announcing: BizTalk Server Migration tool

We are very happy to announce a wonderful tool provided by MSIT. The tool will help in multiple scenarios around migrating your environment, or even taking a backup of your applications.

It comes with some built-in intelligence, such as:

  • Connectivity test of source and destination SQL Instance or Server
  • Identify BizTalk application sequence
  • Retain file share permissions
  • Ignore zero KB files
  • Ignore files which already exist in destination
  • Ignore BizTalk applications which already exist in destination
  • Ignore assemblies which already exist in destination
  • Back up all artifacts in a folder
Features (available and unavailable):
  • Windows Service
  • File Shares (without files) + Permissions
  • Project Folders + Config file
  • App Pools
  • Web Sites
  • Website Bindings
  • Web Applications + Virtual Directories
  • Website IIS Client Certificate mapping
  • Local Computer Certificates
  • Service Account Certificates
  • Hosts
  • Host Instances
  • Host Settings
  • Adapter Handlers
  • BizTalk Applications
  • Role Links
  • Policies + Vocabularies
  • Orchestrations
  • Port Bindings
  • Assemblies
  • Parties + Agreements
  • BAM Activities
  • BAM Views + Permissions
  • SQL Logins
  • SQL Database + User access
  • SQL Jobs
  • Windows schedule task
  • SSO Affiliate Applications

Download the tool here

For a small guide, take a look here

Integrate 2017 – Day 2 Recap

Microsoft IT: journey with Azure Logic Apps – Padma/Divya/Mayank Sharma

In this first session, Mayank Sharma and Divya Swarnkar talked us through Microsoft’s experience implementing their own integrations internally. We got a glimpse of their approach and the architecture of their solution.

Microsoft uses BizTalk Server and several Azure services like API Management, Azure Functions and Logic Apps, to support business processes internally.
They run several of their business processes on Microsoft technologies (the “eat your own dog food”-principle). Most of those business processes now run in Logic App workflows and Divya took the audience through some examples of the workflows and how they are composed.

Microsoft has built a generic architecture using Logic Apps and workflows. It is a great example of a decoupled workflow, which makes it very dynamic and extensible. It intensively uses the Integration Account artifact metadata feature.

They also explained how they achieve testing in production. They can, for example, route a percentage of traffic via a new route and, once they are comfortable with it, switch over the remaining traffic. Divya mentioned, however, that they will re-evaluate how they do this in the future, now that the Logic Apps drafts feature has been announced.

For monitoring, Microsoft Operations Management Suite (OMS) is used to provide a central, unified and consistent way to monitor the solution.

Divya gave some insights on their DR (disaster recovery) approach to achieve business continuity. They are using Logic Apps to keep their Integration Accounts in sync between active and passive regions. BizTalk server is still in use, but acts mostly as the proxy to multiple internal Line-of-Business applications. 

All in all, a session with some great first-hand experience, based on Microsoft using their own technology.
Microsoft IT will publish a white paper in July on this topic. A few Channel9 videos are also coming up, where they will share details about their implementation and experiences.

Azure Logic Apps – Advanced integration patterns – Jeff Hollan/Derek Li

Jeff Hollan and Derek Li are back again with yet another Logic Apps session. This time they are talking about the architecture behind Logic Apps. As usual, Jeff is keeping everyone awake with his viral enthusiasm!

A very nice session that explained that the Logic Apps architecture consists of three parts:

The Logic Apps Designer is a TypeScript/React app. This self-contained app can run anywhere, e.g. Visual Studio or the Azure portal. The Designer uses OpenAPI (Swagger) to render inputs and outputs and to generate the workflow definition, which is essentially the JSON source code of the Logic App.

Secondly, there is the Logic Apps Runtime, which reads the workflow definition and breaks it down into a composition of tasks, each with its own dependencies. These tasks are distributed by the workflow orchestrator to workers, which are spread out over any number of (virtual) machines. Depending on the tasks and their dependencies, they can run in parallel; e.g. a ForEach action which loops 100 times might be executed on 100 different machines.

This setup makes sure every task gets executed AT LEAST ONCE. Using retry policies and controllers, the Logic Apps Runtime does not depend on any single (virtual) machine. This architecture allows for a resilient runtime, but also means there are some limitations.
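The at-least-once guarantee described above can be sketched in a few lines of Python (a simplified illustration, not the actual Logic Apps runtime): a worker keeps retrying a task under a retry policy until it succeeds, which also means a task body may run more than once and should therefore be idempotent.

```python
import time

def run_at_least_once(task, max_retries=4, delay=0.01):
    """Keep retrying a task until it succeeds (simplified retry policy).

    Because a worker may crash after doing the work but before recording
    success, the task can run more than once - so it should be idempotent.
    """
    attempts = 0
    while True:
        attempts += 1
        try:
            return task(), attempts
        except Exception:
            if attempts > max_retries:
                raise
            time.sleep(delay)  # fixed interval here; real policies also support backoff

# A hypothetical flaky task that fails twice before succeeding.
calls = {"n": 0}
def flaky_task():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "done"

result, attempts = run_at_least_once(flaky_task)
print(result, attempts)  # done 3
```

The key design point is that the retry loop lives in the platform, not in your workflow, which is why individual actions survive the loss of any single machine.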

And last, but not least, we have the Logic Apps Connectors, connecting all the magic together.
These are hosted and run separately from the Logic App or its worker. Each connector is supported by the team responsible for it; e.g. the Service Bus team is responsible for the Service Bus connectors. Each has its own peculiarities and limits, all described in the Microsoft documentation.

Derek Li then presented an interesting demo showing how exceptions can be handled in a workflow using scopes and the “RunAfter” property, which can be used to execute different actions if an exception occurs. He also explained how retry policies can be configured to determine how many times an action should retry. Finally, Jeff gave an overview of the workflow expressions and wrapped up the session explaining how expressions are evaluated inside-out.
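The “RunAfter” and retry behaviour Derek demonstrated live in the workflow definition JSON. As a rough sketch, modeled here as a Python dict (the action names, paths and policy values are illustrative, not taken from the session), a compensating action that only fires when the previous action fails looks like:

```python
# Simplified fragment of a Logic App workflow definition's "actions" section.
# Names and values are hypothetical.
workflow_actions = {
    "Insert_row": {
        "type": "ApiConnection",
        "inputs": {
            "method": "post",
            "path": "/datasets/default/tables/orders",
            # How many times to retry this action, and how often:
            "retryPolicy": {"type": "fixed", "count": 3, "interval": "PT20S"},
        },
    },
    "Send_failure_mail": {
        "type": "ApiConnection",
        "inputs": {"method": "post", "path": "/v2/Mail"},
        # Only runs when Insert_row ends in one of these states:
        "runAfter": {"Insert_row": ["Failed", "TimedOut"]},
    },
}

triggers_on = workflow_actions["Send_failure_mail"]["runAfter"]["Insert_row"]
print(triggers_on)  # ['Failed', 'TimedOut']
```

The default runAfter state is “Succeeded”; listing “Failed” or “TimedOut” instead is what turns an ordinary action into an exception handler.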

Enterprise Integration with Logic Apps – Jon Fancey

Jon Fancey, Principal Program Manager at Microsoft, took us on a swift ride through some advanced challenges when doing Enterprise Integration with Logic Apps.

He started the session with an overview and a demo where he showed how easy it is to create a receiver and sender Logic App to leverage the new batch functionality. He announced that, soon, the batching features will be expanded with Batch Flush, Time-based batch-release trigger options and EDI batching.

Next, he talked about Integration Accounts and all of its components and features. He elaborated on the advanced tracking and mapping capabilities.
Jon showed us a map that used XSLT parameters and inline C# code processing. He passed a transcoding table into the map as a parameter and used C# to do a lookup/replace of certain values, without having to call back to a database for each record/node. Jon announced that the mapping engine will be enriched with BOM handling and the ability to specify alternate output formats like HTML or text, instead of XML only.

The most amazing part of the session was when he discussed the tracking and monitoring capabilities. It’s as simple as enabling Azure Diagnostics on your Integration Account to have all your tracking data pumped into OMS. It’s also possible to enable property tracking on your Logic Apps. The Operations Management Suite (OMS) centralizes all your tracking and monitoring data.

Jon also showed us an early preview of some amazing new features that are being worked on. OMS will provide a nice cross-Logic App monitoring experience. Some of the key features being:

  • Overview page with Logic App run summary
  • Drilldown into nested Logic App runs
  • Multi-select for bulk download/resubmit of your Logic App flows.
  • New query engine that will use the powerful Application Insights query language!

We’re extremely happy and excited about the efforts made by the product team. The new features shown and discussed here prove that Microsoft truly listens to the demands of its customers and partners.

Bringing Logic Apps into DevOps with Visual Studio – Jeff Hollan/Kevin Lam

The last Microsoft session of Integrate 2017 was the second time Kevin Lam and Jeff Hollan got to shine together. The goal of their session was to enlighten us about how to use some of the tooling in Visual Studio for Logic Apps.

Kevin took to the stage first, starting with a small breakdown of the Visual Studio tools that are available:

  • The Logic Apps Designer is completely integrated in a Visual Studio “Resource Group Project”.
  • You can use Cloud Explorer to view deployed Logic Apps
  • Tools to manage your XML and B2B artifacts are also available

The Visual Studio tools generate a Resource Group deployment template, which contains all resources required for deployment. These templates are used, behind the scenes, by the Azure Resource Manager (ARM). Apart from your Logic Apps, this also includes auto-generated parameters, API connections (to, for example, Dropbox or Facebook) and Integration Accounts. This file can be checked in to source control, giving you the advantage of CI and CD if desired. The goal is to create the same experience in Visual Studio as in the Portal.

Jeff then started off by showing the Azure Resource Explorer. This is an ARM catalog of all the resources available in your Azure subscription.

Starting with ARM deployment templates might be a bit daunting at first, but by browsing through the Azure Quickstart Templates you can get the hang of it quickly. It’s easy to create a single template and deploy that parameterized template to different environments. By using a few tricks, like Service Principals to automatically get OAuth tokens and the resourceId() function to retrieve the resource ID of a freshly created resource, you are able to automate your deployment completely.
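To make the parameterization idea concrete, here is a heavily trimmed sketch of the shape of such a deployment template, expressed as a Python dict (the resource name, parameter name and schema version are illustrative only):

```python
# Minimal shape of an ARM deployment template (illustrative values).
template = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        # Per-environment values (dev/test/prod) come from a parameter file.
        "logicAppName": {"type": "string", "defaultValue": "my-logicapp-dev"}
    },
    "resources": [
        {
            "type": "Microsoft.Logic/workflows",
            # "[parameters('logicAppName')]" is resolved by ARM at deploy time;
            # the resourceId() function mentioned above works the same way
            # for referencing other freshly created resources.
            "name": "[parameters('logicAppName')]",
            "properties": {"definition": {"triggers": {}, "actions": {}}},
        }
    ],
}

resource = template["resources"][0]
print(resource["type"])  # Microsoft.Logic/workflows
```

Because everything environment-specific lives in the parameters section, the same template body can be promoted unchanged from dev to test to production.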

What’s there & what’s coming in BizTalk360 & ServiceBus360 – Saravana Kumar

To the tune of “Rocky”, Saravana Kumar entered the stage to talk about the latest updates regarding BizTalk360 and ServiceBus360.

He started by explaining the standard features of BizTalk360 around operations, monitoring and analytics.
Since May 2011, 48 releases of BizTalk360 have been published, adding 4 or 5 new features per release.

The latest release includes:

  • BizTalk Server License Calculator
  • Folder Location Monitoring for FILE, FTP/FTPS, SFTP
  • Queue Monitoring for IBM MQ
  • Email Templates
  • Throttling Monitoring

Important to note: BizTalk360 supports more and more cloud integration products like Service Bus and Logic Apps. What they want to achieve is having a single user interface to configure monitoring and alerting.

Similar to BizTalk360, with ServiceBus360, Kovai wants to simplify the operations, monitoring and analytics for Azure Service Bus.

Give your Bots connectivity, with Azure Logic Apps – Kent Weare

Kent Weare kicked off by explaining that the evolution towards cloud computing not only results in lower costs and elastic scaling, but also provides a lot of opportunities to allow your business to scale. Take advantage of the rich Azure ecosystem by automating insights, applying machine learning or introducing bots. He used the example of an energy generation shop, where bots help to increase competitiveness and the productivity of field technicians.

Our workforce is changing! Bring insights to users, not the other way around.

The Bot Framework is part of the Cognitive Services offering and can leverage its various vision, speech, language, knowledge and search features. Besides that, the Language Understanding Intelligent Service (LUIS) ensures your bot can smoothly interact with humans. LUIS is used to determine the intent of a user and to discover the entity on which the intent acts. This is done by creating a model that is used by the chat bot. After several iterations of training the model, you can really give your applications a human “face”.
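The intent/entity split Kent described can be illustrated with a small routing sketch (the intent name, entity type, score and utterance below are made up for illustration, not LUIS output from the session):

```python
# A LUIS-style result: the model's best-guess intent plus extracted entities.
luis_result = {
    "query": "what is the status of turbine 7",
    "topScoringIntent": {"intent": "GetAssetStatus", "score": 0.92},
    "entities": [{"type": "assetId", "entity": "turbine 7"}],
}

def handle(result):
    """Route the utterance to a handler based on the detected intent."""
    intent = result["topScoringIntent"]["intent"]
    entities = {e["type"]: e["entity"] for e in result["entities"]}
    if intent == "GetAssetStatus":
        return f"Looking up status of {entities['assetId']}"
    return "Sorry, I didn't understand that."

print(handle(luis_result))  # Looking up status of turbine 7
```

The bot itself stays thin: LUIS turns free text into a structured intent and entities, and the backend (Logic Apps behind API Management, in Kent's demos) does the actual work.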

Kent showed us two impressive demos with examples of leveraging the Bot Framework, in which both Microsoft Teams and Skype were used to interact with the end users. All backend requests went through Azure API Management, which invoked Logic Apps reaching out to multiple backend systems: SAP, ServiceNow, MOC, SQL and QuadrigaCX. Definitely check out this session, when the videos are published!

Empowering the business using Logic Apps – Steef-Jan Wiggers

Previous sessions about Logic Apps mainly focused on the technical part and possibilities of Logic Apps.
Steef-Jan Wiggers took a step back and looked at the potential of Logic Apps from a customer perspective.

Logic Apps is becoming a worthy player in the iPaaS sphere. Microsoft started it as an entirely new product in 2015, and it has matured to its current state. Still being improved upon on a weekly basis, it seems it is not yet considered a rock-solid integration platform.
Customers, and even Gartner in its Magic Quadrant, often make the mistake of comparing Logic Apps with the functionality we are used to from products like BizTalk Server. They are, however, totally different products. Logic Apps is still evolving and should be considered within a broader perspective, as it is intended to be used together with other Azure services.
As Logic Apps continues to mature, it is quickly becoming “enterprise integration”-ready.

Steef-Jan ended his session by telling us that Logic Apps is a flexible and easy way to deliver value at the speed of the business and will definitely become a centralized product in the IPaaS market.

Logic App continuous integration and deployment with Visual Studio Team Services – Johan Hedberg

In the last session before the afternoon break, Johan Hedberg outlined the scenario for a controlled build and release process for Logic Apps. He described a real-life use case, with 3 typical personas you encounter in many organizations. He stressed the importance of having a streamlined approach and a shared team culture/vision. With the available ARM templates and Visual Studio Team Services (VSTS), you have all the necessary tools to set up continuous integration (CI) and continuous deployment (CD).

The session was very hands-on and to the point. A build pipeline was shown, that prepared the necessary artifacts for deployment. Afterwards, the release process kicked off, deploying a Logic App, an Azure Function and adding maps and schemas to a shared Integration Account. Environment specific parameter files ensured deployments that are tailored for each specific environment. VSTS can cover the complete ALM story for your Logic Apps, including multiple release triggers, environment variables and approval steps. This was a very useful talk and demo, because ALM and governance of your Azure application is key if you want to deliver professional solutions.

Integration of Things. Why integration is key in IoT solutions? – Sam Vanhoutte

The penultimate session of the day was held by our very own CTO, Sam Vanhoutte. Sam focused his presentation on sharing some of the things Codit learned and experienced while working on IoT projects.

He started by stressing the importance of connectivity within IoT projects: “connectivity is key” and “integration matters”. Sam summarized the different connectivity types: direct connectivity, cloud gateways and field gateways, and talked about each of their use cases and pitfalls.

Another important point of Sam’s talk was the difference between IoT Proofs of Concept (PoCs) and actual project implementations. During a PoC, it’s all about showing functionality, but in a real project the focus shifts to robustness, security and connectivity.
Sam also addressed the different responsibilities and activities regarding gateways. He talked about the Nebulus IoT gateway and his ideas and experiences with it.

But IoT is not only about the cloud. Sam shared some insights on Azure IoT Edge as a Microsoft solution. Azure IoT Edge will be able to run within the device’s own perimeter, but is not yet available, not even in private preview. It can run on a variety of operating systems, like Windows or Linux, even on devices as small as (or smaller than) a Raspberry Pi. The session was concluded with the quote: “Integration people make great IoT solutions”.

Be sure to check out our two IoT white-papers:

Also be sure to check out our IoT webinar, accessible via the Codit YouTube channel.

IoT – Common patterns and practices – Mikael Hakansson

Mikael Hakansson started the presentation by introducing IoT Hub and the Azure IoT Suite, and what these represent in the integration world. Azure IoT Hub enables bi-directional connectivity between devices and the cloud, for millions of devices, allowing communication in a variety of patterns and with reliable command & control.

A typical IoT solution consists of a cold path, which is based on persistent data, and a hot path, where the data is analyzed on the fly. About a year ago, the device twin concept was introduced in IoT Hub. A twin consists of tags, a desired state and a reported state, effectively maintaining device state information (metadata, configurations and conditions).
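A device twin can be pictured as a small JSON document. The sketch below (property names and values are illustrative, not from the session) shows the tags/desired/reported split and how a back end might reconcile its intent with what the device last reported:

```python
# Simplified device twin: metadata (tags), back-end intent (desired)
# and the device's last known state (reported).
twin = {
    "tags": {"building": "43", "deviceType": "thermostat"},
    "properties": {
        "desired": {"targetTemperature": 21.0},   # set by the back end
        "reported": {"temperature": 23.5},        # sent by the device
    },
}

def needs_adjustment(twin, tolerance=0.5):
    """Compare desired vs reported state to decide whether to act."""
    desired = twin["properties"]["desired"]["targetTemperature"]
    reported = twin["properties"]["reported"]["temperature"]
    return abs(desired - reported) > tolerance

print(needs_adjustment(twin))  # True
```

This is exactly the pattern Mikael's thermostat demo relied on: the back end writes a desired state, the device reports back, and the delta drives the control loop.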

Mikael prepared some demos in which a thermometer and a thermostat were simulated. The demos began with a simulated thermometer with a changing temperature, whose readings were sent to Power BI via IoT Hub and Stream Analytics. After that, an Azure Function was able to send notifications back to that device. To simulate the thermostat, a device twin with a desired state was used to control the temperature in the room.

Thank you for reading our blog post, feel free to comment or give us feedback in person.

This blogpost was prepared by:

Pieter Vandenheede (BE)
Toon Vanhoutte (BE)
Jonathan Gurevich (NL) 
Carlo Garcia-Mier (UK)
Jef Cools (BE)
Tom Burnip (UK)
Michel Pauwels (BE)
Pim Simons (NL)
Iemen Uyttenhove (BE)
Mariëtte Mak (NL)
Jasper Defesche (NL)
Robert Maes (BE)
Vincent Ter Maat (NL)
Henry Houdmont (BE)
René Bik (NL)
Bart Defoort (BE)
Peter Brouwer (NL)
Iain Quick (UK)
Ricardo Marques (PT)

Logic Apps sessions on Integrate 2017 Day 2 – Part 1

What a day it was at Integrate 2017 today. For Logic Apps enthusiasts, it was a treat. Have you missed the sessions? Don’t worry, I am going to write about everything that was covered today on Logic Apps.

Azure Logic Apps – Microsoft IT journey with Azure Logic Apps – By Divya Swarnkar and Mayank Sharma

Microsoft has a large IT wing that serves its business, called ‘MSIT’. This team is well known for ‘eating its own dog food’. Mayank and Divya are from MSIT’s integration team. When they started their session by describing the scale of business their team serves, we were all blown away. Look at the number of business entities they are serving: around 170 million messages flow through their 175 BizTalk servers, serving 1000-plus trading partners across various business entities.

Azure Logic Apps - Microsoft IT journey with Azure Logic Apps

“We are moving all of this Integration to Logic Apps.”

MSIT is modernizing their integration landscape completely. Divya and Mayank made it very clear that they are moving all the BizTalk interfaces to Logic Apps, and that BizTalk is only going to be used as a proxy to serve existing partner requests. So far they have delivered three releases.

  1. In Release 1.0, they moved most of their interfaces relying on X12 and AS2 to Logic Apps.
  2. In Release 1.5, they moved the EDIFACT-related interfaces to Logic Apps.
  3. In Release 2.0, they moved many of the XML-oriented interfaces.

All these releases helped them to achieve the following goals.

  • Enable Order to Cash Flow for digital supply chain management.
  • Running trade integrations and all customer declaration transactions.
  • They became ready to retire “Microsoft BizTalk Services” instances by the end of July.

Solution Architecture

They then continued by explaining their solution architecture, shown in the slide below. Here are some of its important aspects.

solution architecture

  • Azure API Management: All trading partners send their messages (X12/EDIFACT/XML) through Microsoft’s gateway store. The Azure API Management service then routes each message to the appropriate Logic App.
  • Integration Account: The Logic Apps they have built make full use of Integration Account artefacts such as trading partner agreements, certificates, schemas, transformations, etc.
  • On-premises BizTalk: On-premises BizTalk is merely used as a proxy for Line-of-Business applications. This makes sense, as they may not want to change all the connections which already exist for Line-of-Business applications, and they also need to support the continuity of other interfaces. This is a perfect example of how other organizations can start migrating their interfaces to Logic Apps.
  • Logic App Flow: The Logic Apps make use of a typical VETER pipeline, which involves the AS2 connector, X12 connector, transformation, encoding and HTTP connectors, as shown below.
    logic apps workflow
  • OMS for Diagnostics and Monitoring: Operations Management Suite (OMS) is used for the collection of diagnostic logs from the Integration Accounts, Logic Apps and Azure Functions which are part of their solution. Once all the diagnostic data is collected, they can query it and create nice dashboards for analytics on their interfaces. Currently, Integration Accounts have their own built-in solutions for OMS. Please refer to the video http://www.integrationusergroup.com/business-activity-tracking-monitoring-logic-apps/ to learn more about diagnostic logs in Logic Apps and Integration Accounts.

Fall-back and Production Testing Using APIM

They have scenarios where they want to test Logic Apps in production and also want to fall back to previous stable versions of a Logic App. They make use of APIM to achieve this: APIM is configured with rules to switch between the Logic App endpoints.
Fall-back and Production Testing Using APIM

Disaster Recovery

Business continuity is very important, especially for MSIT with the scale of messaging they are handling. In order to achieve business continuity assurance, they make use of the disaster recovery feature which comes along with the Integration Account.

disaster recovery

Disaster recovery is achieved by creating similar copies of the Logic Apps, Integration Accounts and Azure Functions in two different regions. As you can see from the picture, they have this replication in both the Central US and West US regions. Visit the documentation https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-b2b-business-continuity to learn more about the disaster recovery feature.

A huge confidence boost for customers who are contemplating moving to Logic Apps!

Azure Logic Apps – Advanced integration patterns By Jeff Hollan and Derek Li

I am a big fan of Jeff Hollan. When he is on the stage, it’s a treat to listen to him. He brings life into technical talks and involves the audience, leaving a lasting impression. Enough said about him! Jeff Hollan and Derek Li were on the stage to talk about advanced integration patterns in Logic Apps.

Internals of Logic Apps Platform

Jeff arrived on the stage with the clear intention of explaining the internal architecture of the Logic Apps platform. You might be wondering why we should know about the internals of Logic Apps, as it is a PaaS offering and we generally treat it as a black box from the end-user perspective. However, he gave several powerful reasons why we should understand the internals:

  • There are some published limits for Logic Apps. We need to understand them in order to design enterprise-grade solutions.
  • It helps us understand the nature of the workflows.
  • Internals help us to clearly understand the impact of design on throughput, especially when we are working with long-running flows.
  • We will be able to leverage the platform as much as possible for concurrency.
  • It helps us to understand the structure and behavior of our data.

Agenda

The agenda covered not just the internal architecture of Logic Apps, but also parallel actions, exception handling and workflow expressions.

Internals of Logic Apps Platform

Logic Apps Designer

The Logic Apps designer is apparently a TypeScript/React application. All the functionality that we observe in the designer is self-contained in this application, which is the main reason they are able to host it in Visual Studio as well. It makes use of Swagger to render the inputs and outputs, and, as we are already aware, it generates the workflow definition in JSON.

logic apps designer

Logic Apps Runtime

As we know, Logic Apps have triggers and actions. When we create a Logic App, all of these are defined in a JSON file. When we click the save button, the Logic Apps runtime handles it as below.

logic apps runtime

  • The runtime engine reads the workflow definition, breaks it down into various tasks and identifies the dependencies. The tasks will not be executed until their dependencies are worked out.
  • It spins up distributed workers which coordinate to complete the execution of the tasks. It is revealing to learn that all the workers are distributed, which makes Logic Apps more resilient.
  • The runtime engine ensures that all the tasks inside the flow are executed at least once. He mentioned that in the history of Logic Apps he has not seen any instance where a task was left unexecuted.
  • There is no limit on the number of threads executing these tasks, and hence there is no overhead of managing active threads.
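The breakdown described above amounts to building a dependency graph and only releasing tasks whose dependencies have completed. A toy single-machine scheduler conveys the idea (this is a sketch of the concept, not the real distributed runtime; the task names are hypothetical):

```python
def run_workflow(tasks, deps):
    """Execute tasks in dependency order (a toy, single-machine stand-in
    for the distributed workers the runtime actually uses).

    tasks: dict of name -> callable
    deps:  dict of name -> list of names that must finish first
    """
    done, order = set(), []
    while len(done) < len(tasks):
        # A task becomes runnable once all its dependencies are worked out.
        runnable = [t for t in tasks
                    if t not in done and all(d in done for d in deps.get(t, []))]
        if not runnable:
            raise RuntimeError("cyclic dependency")
        for t in runnable:        # these could run in parallel on many workers
            tasks[t]()
            done.add(t)
            order.append(t)
    return order

# Hypothetical flow: a trigger, two independent actions, then a final step.
log = []
tasks = {name: (lambda n=name: log.append(n))
         for name in ["trigger", "transform", "audit", "send"]}
deps = {"transform": ["trigger"], "audit": ["trigger"],
        "send": ["transform", "audit"]}

result = run_workflow(tasks, deps)
print(result)  # ['trigger', 'transform', 'audit', 'send']
```

In the real runtime, "transform" and "audit" would be picked up by different workers at the same time, which is exactly what makes a ForEach fan out across machines.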

Example logic App

He gave an example of a Logic App with a Service Bus trigger that receives a list of products and writes each product to a SQL database.

logic app with service bus trigger

In this example, his main intention was to show how the runtime identifies the tasks which can be executed. Here, a ForEach loop allows the runtime to spin up parallel tasks to execute the SQL action. The workflow orchestrator then completes the message by calling the Service Bus complete connector and ends the workflow.

Parallel action

Given the runtime’s ability to spin up parallel tasks, he showed us how to use a parallel action in a Logic App definition.

parallel action in logic app definition

From the picture above, it is clear that we can add as many parallel actions as we want by just clicking the plus symbol on the branches.

Exception handling

At this point, Derek Li took over the stage to show some geeky stuff. He started off by creating a Logic App in which one of the actions fails; when it fails, an email is sent to Jeff. To achieve this, he put the required actions in a scope and, after the scope, configured the “run after” settings for an action. I do not have an exact snapshot from his slide, but it was something like below.

exception handling

With the “run after” configuration for an action, it is easy to handle error conditions. He also showed how we can set the timeout configuration for an action.

When the timeout expires, we can take some action again by setting the “run after” configuration to “has timed out”.

Workflow expressions

He spoke about important aspects of workflow expressions. Following are the highlights.

  • Any input that changes for every run is an expression. He showed some example expressions.
    workflow expressions
  • He explained the difference between constructs such as “@”, “{}”, “[]” and “()”.

@ is used to refer to a JSON node, {} denotes a string, [] is used as a JSON path and () is used to contain expressions for evaluation. He also showed the order in which the elements of an expression are evaluated.
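A few examples of how these constructs combine in practice (the action and property names are illustrative, not from the talk):

```
@triggerBody()                      -- "@" starts an expression referring to a JSON node
"Hello @{triggerBody()?['name']}"   -- "@{ }" interpolates an expression into a string
@body('Get_order')['customerId']    -- "[ ]" navigates into the JSON payload
@concat('order-', string(42))       -- "( )" wraps function arguments; evaluated inside-out
```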

elements of executed expression

Summary

As explained earlier, it was a real treat for all the Logic Apps enthusiasts and gave a lot of insight into the Logic Apps platform.

  • The first session, from Mayank and Divya, gave the audience a great level of confidence about going ahead with Logic Apps implementations.
  • The session from Jeff and Derek brought an understanding of Logic Apps internals and patterns.
Author: Srinivasa Mahendrakar

Technical Lead at BizTalk360 UK – I am an integration consultant with more than 11 years of experience in the design and development of on-premises and cloud-based EAI and B2B solutions using Microsoft technologies.