The Routing Slip Pattern


The Pattern


A routing slip is a configuration that specifies a sequence of processing steps (services). The routing slip is attached to the message to be processed. Each service (processing step) is designed to receive the message, perform its functionality (based on its configuration) and invoke the next service. In that way, a message gets processed sequentially by multiple services, without the need for a central coordinating component. The schema below is taken from Enterprise Integration Patterns.

Some examples of this pattern are:

Routing Slip

Routing slips can be written in any format; JSON and XML are quite popular. An example of a simple routing slip can be found below. The header contains the name of the routing slip and a counter that carries the current step number. Each service is represented by a routing step. A step has its own name to identify the service to be invoked and its own set of key-value configuration pairs.

Note that this is just one way to represent a routing slip. Feel free to add your personal flavor…
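Purely as an illustration, a routing slip following this structure could look like the JSON below (all names, versions and configuration values are made up):

```json
{
  "header": {
    "name": "OrderToSap",
    "currentStep": 0
  },
  "steps": [
    { "name": "DecodeAs2",      "version": "1.2", "config": { "partner": "Contoso" } },
    { "name": "TransformOrder", "version": "2.0", "config": { "map": "Order_to_Idoc" } },
    { "name": "SendToSap",      "version": "1.0", "config": { "endpoint": "sap-prod" } }
  ]
}
```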

Assign Routing Slip

There are multiple ways to assign a routing slip to a message. Let’s have a look:

  • External: the source system already attaches the routing slip to the message
  • Static: when a message is received, a fixed routing slip is attached to it
  • Dynamic: when a message is received, a routing slip is attached, based on some business logic
  • Scheduled: the integration layer has scheduled routing slips that also contain a command to retrieve a message


A service is considered a “step” within your routing slip. When defining a service, you need to design it to be generic. The logic executed within the service must be driven by its configuration, if any is required. Ensure your service has a single responsibility and that there’s a clear boundary to its scope.

A service must consist of three steps:

  • Receive the message
  • Process the message, based on the routing slip configuration
  • Invoke the next service, based on the routing slip configuration
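As a sketch only, in Python with hypothetical field and handler names, a generic service following these three steps could look like:

```python
def run_service(message: dict, handlers: dict) -> dict:
    """Execute the current routing-slip step and hand off to the next one.

    `message` carries a 'routingSlip' with a step counter and a list of steps;
    `handlers` maps step names to functions. All names are illustrative.
    """
    slip = message["routingSlip"]
    step = slip["steps"][slip["currentStep"]]

    # 1. Receive the message (here: it is passed in as an argument).
    # 2. Process the message, based on the step's configuration.
    message["body"] = handlers[step["name"]](message["body"], step.get("config", {}))

    # 3. Invoke the next service, based on the routing slip.
    slip["currentStep"] += 1
    if slip["currentStep"] < len(slip["steps"]):
        return run_service(message, handlers)  # synchronous, in-memory hand-off
    return message

# Usage: a two-step slip that uppercases and then wraps the payload.
msg = {
    "body": "hello",
    "routingSlip": {
        "currentStep": 0,
        "steps": [{"name": "upper"}, {"name": "wrap", "config": {"tag": "order"}}],
    },
}
handlers = {
    "upper": lambda body, cfg: body.upper(),
    "wrap": lambda body, cfg: f"<{cfg['tag']}>{body}</{cfg['tag']}>",
}
result = run_service(msg, handlers)
print(result["body"])  # <order>HELLO</order>
```

The key point is that the service itself contains no hard-coded successor: the slip alone decides what runs next.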

There are multiple ways to invoke services:

  • Synchronous: the next service is invoked without any persistence in between (e.g. in memory). This has the advantage that it will perform faster.
  • Asynchronous: the next service is invoked with persistence in between (e.g. a queue). This has the advantage that reliability increases, but performance degrades.

Think about the desired way to invoke services. If required, a combination of sync and async can be supported.
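A dispatcher supporting both styles might, purely as a sketch, inspect a per-step flag and either call the next service in memory or enqueue the message; the `deque` below is an in-memory stand-in for a real durable queue, and the `transport` field is a made-up name:

```python
from collections import deque

queue = deque()  # stand-in for a durable queue (e.g. a message broker)

def dispatch(message: dict, invoke) -> None:
    """Hand the message to the next step, sync or async per the slip."""
    slip = message["routingSlip"]
    step = slip["steps"][slip["currentStep"]]
    if step.get("transport", "sync") == "async":
        queue.append(message)   # persisted hand-off: more reliable, but slower
    else:
        invoke(message)         # in-memory hand-off: fast, no persistence

# Usage: one sync step and one async step.
calls = []
msg = {"routingSlip": {"currentStep": 0, "steps": [
    {"name": "validate"},                        # defaults to sync
    {"name": "sendToSap", "transport": "async"},
]}}
dispatch(msg, calls.append)           # sync step: invoked directly
msg["routingSlip"]["currentStep"] = 1
dispatch(msg, calls.append)           # async step: enqueued instead
print(len(calls), len(queue))  # 1 1
```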


Encourages reuse

Integrations are composed of reusable and configurable building blocks. The routing slip pattern forces you to analyze, develop and operate in a streamlined manner. Reuse is heavily encouraged on different levels: the way analysis is performed, how patterns are implemented, the way releases are rolled out and how operational tasks are performed. One unified way of working, built on reusability.

Configuration based

Your integration is completely driven by the assigned routing slip. There are no hard-coded links between components. This allows you to change its behavior without the need for a redeployment. The configuration also serves as a great source of documentation, as it explains exactly which message exchanges are running on your middleware and what they do.

Faster release cycles

Once you have set up a solid routing slip framework, you can increase your release cadence. By leveraging your catalogue of reusable services, you heavily benefit from previous development efforts. The focus is only on the specifics of a new message exchange, which are mostly data bound (e.g. mapping). There’s also a tremendous increase in agility when it comes to small changes: just update the routing slip configuration and it takes immediate effect on your production workload.

Technology independent

A routing slip is agnostic to the underlying technology stack. The way the routing slip is interpreted is, of course, specific to the technology used. This opens the door to a unified integration solution, even one composed of several different technologies. It also enables cross-technology message exchanges. As an example, you can have an order received via an AS2 Logic App, transformed and sent to an on-premises BizTalk Server that inserts it into the mainframe, all governed by a single routing slip config.

Provides visibility

A routing slip can introduce more visibility into the message exchanges, especially from an operational perspective. If a message encounters an issue, operations personnel can immediately consult the routing slip to see where the message comes from, which steps have already been executed and where it is heading. This visibility can be improved by updating the routing slip with extra historical information, such as each service’s start and end time. Why not even include a URL in the routing slip that points to a wiki page or knowledge base about that interface type?
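One way to capture that history, sketched in Python with hypothetical field names, is to let each service stamp its step before handing the message on:

```python
from datetime import datetime, timezone

def stamp_step(slip: dict, started_at: datetime) -> None:
    """Record start/end times on the step that was just executed."""
    step = slip["steps"][slip["currentStep"]]
    step["history"] = {
        "startedAt": started_at.isoformat(),
        "endedAt": datetime.now(timezone.utc).isoformat(),
    }

# Usage: stamp a single-step slip after executing its service.
slip = {"currentStep": 0, "steps": [{"name": "transform"}]}
stamp_step(slip, datetime.now(timezone.utc))
print(sorted(slip["steps"][0]["history"]))  # ['endedAt', 'startedAt']
```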


Not enough reusability

Not every integration project is well-suited to the routing slip pattern. During the analysis phase, it’s important to identify the integration needs and to see whether there are a lot of similarities between the message exchanges. When a high level of reusability is detected, the routing slip pattern might be a good fit. If all integrations are too heterogeneous, you’ll introduce more overhead than benefits.

Too complex logic

A common pitfall is adding too much complexity to the routing slip. Stick as much as possible to a sequential series of steps (services). Some conditional decision logic inside a routing slip might be acceptable, but define clear boundaries for such logic. Do not start writing your own workflow engine, with its own workflow language. Keep the routing slip logic clean and simple, true to the purpose of a routing slip.

Limited control

During maintenance of the surrounding systems, you often need to stop a message flow. Take the scenario where you face the following requirement: “Do not send orders to SAP for the coming 2 hours”. One option is to stop the message exchange at its source, e.g. stop receiving messages from an SFTP server. If that is not acceptable, because these orders are also sent to other systems that should not be impacted, things get more complicated. You can stop the generic service that sends messages to SAP, but then you also stop sending other message types… Think about this upfront!

Hard deployments

A very common pain point of a high level of reuse is the impact of upgrading a generic service that is used all over the place. There are different ways to reduce the risks of such upgrades, of which automated system testing is an important one. Within the routing slip, you can explicitly specify the version of a service you want to invoke. In that way, you can upgrade services gradually to the latest version, without the risk of a big-bang deploy. Define a clear upgrade policy, to avoid having too many different versions of a service running side by side.


A message exchange is spread across multiple loosely coupled service instances, which can pose a monitoring challenge. Many technologies offer great monitoring insights for a single service instance, but lack an overall view across multiple service instances. Introducing a correlation ID into your routing slip can greatly improve the monitoring experience. This ID can be generated the moment you initialize a routing slip.
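Generating such an ID at slip-initialization time is straightforward; a sketch (the field names are illustrative):

```python
import uuid

def init_routing_slip(name: str, steps: list) -> dict:
    """Create a routing slip carrying a unique correlation ID for end-to-end tracing."""
    return {
        "header": {"name": name, "correlationId": str(uuid.uuid4())},
        "currentStep": 0,
        "steps": steps,
    }

slip = init_routing_slip("OrderToSap", [{"name": "transform"}])
print(len(slip["header"]["correlationId"]))  # 36
```

Every service can then log that ID alongside its own telemetry, giving operations one key to trace a message across all instances.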


Routing slips are a very powerful mechanism to deliver unified and robust integrations quickly. The main takeaways of this blog are:

  • Analyze in depth whether you can benefit from the routing slip pattern
  • Limit the complexity that the routing slip resolves
  • Have explicit versioning of services inside the routing slip
  • Include a unique correlation ID into the routing slip
  • Add historical data to the routing slip

Hope this was a useful read!

Celebrating 100th Integration Monday Episode – Live Q&A session with Microsoft Product Group



It all started back in 2006 when Michael Stephenson and Saravana Kumar identified that people in the integration space lacked technical know-how of the concepts. In an effort to bridge this gap, they decided to create a strong community where people could share their experience and learnings with others. This saw the birth of the BizTalk User Group. Later, when the integration scope expanded beyond BizTalk to WCF, AppFabric and BizTalk Services, the community was renamed the UK Connected Systems User Group (UKCSUG). In 2015, as the integration scope grew wider still, it was renamed the Integration User Group. You can read the detailed history behind organizing Integration Mondays in our Integration User Group launch blog.

The 100th Episode – A Milestone en route!

Since the launch of Integration Monday on January 19, 2015, it has taken us close to 29 months to hit the milestone of the 100th Integration Monday episode. We have done our best to consistently deliver one session every Monday (except public and bank holidays). There is a separate team working to ensure the sessions are slotted out a quarter in advance: getting in touch with potential speakers and scheduling them, holding test sessions before the webinar, handling registrations, social media promotions, uploading the videos and presentations after the event, and so on.


A look at some of the statistics from the Integration Monday sessions.


We wanted to make the 100th Integration Monday episode a grand one. After a lot of email conversations and brainstorming, we narrowed it down to a one-hour Q&A session with the Microsoft Product Group. Then we realized that the 100th Integration Monday episode falls exactly one week before INTEGRATE 2017. So it only made sense to make the 100th episode a prelude to the biggest integration-focused conference, starting on June 26th.

Join the community, share your knowledge with developers and architects, and learn about evolving integration technologies. Register for our future events.

Preparations for the Special Episode on Integration Monday

After a few back-and-forth emails with the Microsoft Product Group (thanks to Saravana), we were all set for the 100th Integration Monday episode. We learnt that we would have Pro Integration team presence across the different product offerings from Microsoft, such as BizTalk, Azure Logic Apps, Azure API Management and Azure Service Bus.


Jim Harrer – Pro Integration Group PM, Jon Fancey – Principal Program Manager (Azure Logic Apps & BizTalk), Tord Glad Nordahl – Program Manager (owning BizTalk Server), Dan Rosanova – Principal Program Manager (Product Owner for Azure Messaging), Jeff Hollan – Senior Program Manager at Microsoft (Azure), Kevin Lam – Principal Program Manager for Microsoft Azure Scheduler, Logic Apps, Azure Resource Manager and other services, Vladimir Vinogradsky – Principal PM Manager (Azure API Management).

Since it was only a one hour Q&A session, we decided to collect the questions upfront from the registrants. So, the team quickly set course to design an event landing page with all the session details and a simple form for users to submit their questions for the Pro Integration team.


We received close to 200 registrations for the event and some very interesting questions from the attendees. We categorized the questions based on the product offering and shared them in advance with the Pro Integration team so that they could plan their responses in the best interest of time.

Recap from the 100th Integration Monday Episode

The stage was perfectly set for the 100th Integration Monday episode. As attendees started to join, Saravana Kumar kicked off the broadcast at 07:35 BST, welcoming the Pro Integration team and the attendees to the webinar. After a round of quick self-introductions, it was time to get into the questions from the attendees. I’ll try to highlight some of the key discussions from the webinar in this blog post.


Question: What does Microsoft see as the future of Integration, and what does it mean to Microsoft?

Jim Harrer: The past year (since the major announcements at INTEGRATE 2016) has been extremely busy for Microsoft in terms of bringing the team together, responding better to customer requirements, catering to the demands of our partner ecosystem and defining the strategy around application integration and enterprise integration. Microsoft has achieved this by building the Hybrid Integration platform. Microsoft has been pursuing a “Better Together” strategy when it comes to cloud and on-premises offerings. Therefore, the entire team (under the Program Managers on the webinar) has been focusing on the integration strategy.

The team has really stuck to the Hybrid Integration platform and delivered some awesome stuff around it: Feature Pack 1 for BizTalk Server, Logic Apps and the BizTalk Connector to connect on-premises and cloud solutions, and a first-class experience with Azure Service Bus and API Management. The focus for the future is to extend these offerings into other Azure services in order to have a Better Together strategy across all product offerings. In the last year, the key highlights were the GA of BizTalk Server 2016 and Feature Pack 1 (a totally new concept from Microsoft), which received a lot of positive feedback from the community.

For more “exciting” information on the future of Microsoft and what’s lined up, you may have to wait one more week for INTEGRATE 2017, where the Pro Integration team will reveal their vision, strategy and roadmap for the upcoming year. So stay tuned for our further blog posts from INTEGRATE 2017 🙂

Question: What kinds of solutions are customers building with Microsoft’s offerings? In other words, what kinds of features are customers leveraging Microsoft technologies for?

Tord Glad Nordahl: Customers are moving into the Digital Transformation world. For example, after the release of Feature Pack 1, BizTalk Server is being used in scenarios we would never have thought of in the past. Customers have been able to define their workflows and build their own integration solutions. BizTalk customers have started taking advantage of, for example, PowerApps to manage their BizTalk Server environment, connecting BizTalk to SignalR, etc., making their integration solutions more interesting, smart and predictive.

Jim Harrer: “Integration is HOT. We are enjoying the hotness of this concept”. All of Microsoft’s products are seeing growth and customer numbers are on the rise. Customers can no longer have siloed applications; instead they need to extend them and maximize their value by integrating with other systems. Vlad’s team (the API Management team) has enjoyed success where legacy systems are now starting to put their APIs into the API Management platform.

Vladimir Vinogradsky: Previously, customers were exposing APIs for mobile apps and partner integrations (closed connections). The way customers expose their APIs is now changing. These days, companies use API Management to manage both their external and internal APIs across the organization.

Dan Rosanova: Enterprise integration has taken on its true meaning over the last few months. Earlier it was confined to a team, department or business. Previously, for instance, someone may have used only Service Bus and some compute to perform all their integration. Nowadays, you need not write any code to use all the functionality in Service Bus, as Logic Apps gives you complete control by means of its connectors.

Jon Fancey: Customers come to the Microsoft platform from different places for different reasons. The general feedback is that they value the fact that they can get started in one place and then expand using Microsoft’s Integration Portfolio (rich services that are available on-premises and on Azure).

Question: How is being “Serverless” helping Microsoft?

Jeff Hollan: Serverless is the next step in the evolution of Platform as a Service (PaaS). It does not mean there are no servers! There are servers, but, as an operator/developer, you need not worry about them. No worries about the servers being secure, scalable, etc.!

In Azure, there are a few core offerings that are serverless: Azure Functions and Azure Logic Apps. The unique advantage of the serverless story on Azure is that integration and serverless are treated as hand in glove. With serverless, customers feel they can get something into production really quickly and connect it to the systems/APIs they care about. This helps IT projects move faster and keep up with the speed of business.

Question: How is Microsoft BizTalk Server 2016 Feature Pack 1 being received by the customers? What’s the plan moving forward?

Tord Glad Nordahl: It was a complete team restructure that we had to go through for the release of Feature Pack 1 and the new release process (down from one major release every 2 years). Feature Pack 1 was mainly intended to help customers do better integration. Most suggestions for the Feature Pack 1 features actually came from customers through the UserVoice (customer feedback) portal. With Feature Pack releases, customers can do more with the services provided by Microsoft and improve on what they already have.

The plan is to continue the investment and working on the features that were shipped as a part of Feature Pack 1. For what’s coming more in upcoming Feature Packs, stay tuned for INTEGRATE 2017 announcements in a week’s time 🙂

Question: We see updates to the new Service Bus library for .NET clients to use Azure AD authentication. What will happen to the existing library that uses a connection string from a Shared Access Policy? Will that continue to be in use, with new updates added to it?

Dan Rosanova: Yes, both libraries will continue to support SAS, as it is very useful for high-volume messaging scenarios. For the new library, the team is working on implementing Active Directory MSI (secure managed identities for services).

Question: I have a multi-cloud environment. Are there any Logic App AWS connectors that are in the pipeline?

Jeff Hollan: At present, there is no out-of-the-box AWS connector in the library (of the 160+ available connectors). If you would like to request this connector, go to the Logic Apps Connectors UserVoice page and check whether a request for it already exists. If yes, vote for the request so that the team knows which connectors to prioritize. If not, you can create the request and the team will assess the demand based on the votes.

Request from the Pro Integration team: if you require a new connector or a feature in any of the products, the best place to request it and show your support is the UserVoice page for that particular product.

Question: Should I hollow out my EDI exchange running on BizTalk Server 2010 and move into Azure Logic Apps, or should I upgrade to BizTalk 2016?

Tord Glad Nordahl: This completely depends on where you are doing the workflow/integration. If it’s all in the cloud and you are communicating with your partners in the cloud, then Logic Apps is the best way forward. However, if you are doing a lot of work on-premises, then BizTalk is the right choice. In a hybrid scenario, where you do processing both on-premises and in the cloud, you can use both in conjunction. It all depends on the needs of the customer, and Microsoft’s responsibility is to provide the features and products that customers ask for!

Question: When will we see a true PaaS implementation of API Management, with a corresponding price model?

Vladimir Vinogradsky: There are thoughts about a PaaS implementation of API Management, but no concrete timelines on the availability of this functionality.

Question: My question is around using SQL availability groups in a BizTalk setup. Currently, with BizTalk Server 2016 and SQL Server 2016, it needs at least 8 SQL instances to run a BizTalk HA environment with SQL availability groups. With the announcement that SQL Server 2017 supports distributed transactions for databases in availability groups, does it mean that the minimum number of instances required will drop from 8 to 2?

Tord Glad Nordahl: Definitely, yes! This will be addressed. The BizTalk team is working hard with the SQL team to get this addressed.

Question: Now that BizTalk Services is dead, can we be certain that the two tools that will be kept are BizTalk Server (on-premises) and Logic Apps (cloud)?

Jon Fancey: A common question received by the Logic Apps team was “When should I use BizTalk Services and when should I use Logic Apps?” Since it’s absolutely ridiculous to have the same offering in multiple products, the team worked hard over the last 18 months to make sure all features that were part of BizTalk Services were shifted to Logic Apps. This has ZERO IMPACT on BizTalk Server. Although the name contains the word “BizTalk”, this does not mean the end of the road for BizTalk Server. It’s just a shift in capabilities and in what the team is focusing on: BizTalk Server, Logic Apps and Enterprise Integration.

Question: What’s the Future of Service Bus On-Premises?

Dan Rosanova: This was announced in January. The future is well defined: it goes out of mainline support in January 2018. There are no plans to replace it. The on-premises roadmap involves Azure Stack, for better alignment with other services.

Question: Is Logic Apps a mature technology or not considering that it’s pretty much a new concept?

Jeff Hollan: Reading the customer stories about how customers have been using Logic Apps in their environments and the different scenarios they have implemented, it would be unfair to question the maturity of the product as a whole. It has been just about 12 months since Logic Apps went GA, and the number of customer success stories and the numerous blog posts on how the community has been using Logic Apps make us feel we are heading in the right direction. Moreover, if Logic Apps ever failed to meet its SLA, the ripple effect would be felt not only by Logic Apps but by the 10-12 other connected services/products in Microsoft’s offering.

Logic Apps has been built very consciously, taking the learnings from BizTalk Server and using them to build a very strong cloud platform for our customers.

With that, it was almost one hour since we started the session! Time flew in the blink of an eye, but boy, what an engrossing discussion that was from the team. You can watch the video of the session here –

Final Question: What’s the roadmap for Healthcare companies to move to the cloud?

Jim Harrer: The Pro Integration team is already working on improving the vertical strategy, given the solid functionality that already exists in the products. The team is challenged to put together different solutions for different verticals, healthcare being one of them.

Jon Fancey: Microsoft is keen on developing and building a solid, stable platform that provides a lot of general-purpose integration capabilities across the board, so that people can build mission-critical integration solutions.

If you have any specific questions related to a vertical, you get a chance to meet the same team next week in London at INTEGRATE 2017.

Feedback from the Community

Here’s what the community had to say about the Integration User Group initiative and on reaching the 100th episode –

Integration User Group Evangelises Microsoft Integration developments

Dedicated people, talking about things they love; The sessions stimulate me to try new things

Big kudos to BizTalk360 team for doing an amazing job in evangelizing Microsoft Enterprise Integration.

Feedback like this drives us forward to deliver the best content to our attendees. If you have not registered for our event updates, we recommend registering for the upcoming events on the Integration User Group website.

Final wrap up of the session

Jim Harrer thanked the attendees who joined the webcast, and congratulated the team behind the Integration User Group on reaching their milestone 100th episode, as well as the speakers who have presented sessions for the Integration User Group.

You can watch the previous episodes of Integration Monday on the Past Events section, and register for the upcoming events.

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

Choosing between BizTalk360 and ServiceBus360 for monitoring Azure Service Bus


Recently we received a few support emails asking about the overlap between BizTalk360 and ServiceBus360 when it comes to monitoring Azure Service Bus. Which one should they go for? The question also extended to: if they are using Azure Logic Apps and Web APIs (web endpoints), which is the better product to opt for?

Given that both products have the capability to monitor Azure Service Bus, it’s a valid question; let me try to clarify the positioning of the two products.


When we released BizTalk360 version 8.1, we introduced a bunch of Azure Monitoring capabilities in the product like:

The Web Endpoints Monitoring capability was also heavily enhanced to support features like adding query strings, body payloads and HTTP headers to the request message, and enriched validation (JSONPath, XPath, response time, etc.) on the response message. These changes made the feature super powerful for monitoring SOAP and REST/HTTP-based web endpoints.

The long-term goal for us at BizTalk360 is to provide a consolidated, single-pane-of-glass operations, monitoring and analytics solution for customers who use the Microsoft Integration Stack for their integration needs. In the upcoming 8.5 version, we are extending the Azure capability even further by bringing support for Azure Integration Accounts into BizTalk360.

If you are a Microsoft BizTalk Server customer who has slowly started leveraging Azure Service Bus, Logic Apps, API Apps and Web APIs for your integration requirements, then BizTalk360 is the ideal product for both managing and monitoring the entire infrastructure. Typically, BizTalk Server customers who have started utilizing some of the Azure integration technology stack, like Azure Service Bus, Logic Apps and API Apps, will benefit from using BizTalk360.

When it comes to Azure Service Bus monitoring in BizTalk360, we only cover Azure Service Bus Queues. Currently, we do not cover Azure Service Bus Topics, Azure Service Bus Relay or Azure Service Bus Event Hubs. Therefore, if you are using any of these technologies (which are not monitored by BizTalk360), then you’ll also need ServiceBus360.


ServiceBus360 is designed and developed to provide complete operations and monitoring capabilities for Azure Service Bus Messaging, Relay and Event Hubs. ServiceBus360 provides in-depth monitoring capabilities for:

ServiceBus360 is not just a monitoring solution for Azure Service Bus. The idea behind ServiceBus360 is to make it a world-class product for complete operations, monitoring and analytics of Azure Service Bus. The product already supports a variety of productivity and advanced operational capabilities, like:

The above is not the complete list of features; it just gives you a flavor of what can be accomplished with ServiceBus360. Clearly, BizTalk360 will not have this level of coverage for Azure Service Bus.

Therefore, if you are using Azure Service Bus for mission-critical integration work, then ServiceBus360 is the better option to improve productivity and avoid disaster.

Author: Saravana Kumar

Saravana Kumar is the Founder and CTO of BizTalk360, an enterprise software that acts as an all-in-one solution for better administration, operation, support and monitoring of Microsoft BizTalk Server environments.

Atlassian Bamboo–How to create multiple Remote Agents on single server to do continuous deployment for BizTalk / WCF.



I’m writing this post to demonstrate how we can create multiple remote agents on a single server to do parallel deployments to BizTalk/WCF servers. Bamboo comes with the concept of local agents and remote agents. Remote agents are installed on the individual servers for artefact/solution deployment. The remote agent runs as a Windows service via a service wrapper; whenever there is a new server, the project team needs to install a remote agent and run the service. This is troublesome for large organisations, and remote agents are not free.

Follow the steps below to create multiple remote agents on one or more dedicated machines for Bamboo.

1. Download Remote Agent

Download atlassian-bamboo-agent-installer-5.14.1.jar from the Bamboo agent page.

2. Copy jar file

Copy the .jar file to a folder.

3. Create Remote Agent 1 – <ServerName>.<Env>.<Domain>.lan

Follow the steps below to install Remote Agent 1.
1 – Open a CMD prompt and CD into the folder where the .jar file exists.
2 – Run the command below:
java -Dbamboo.home=d:\bamboo-1 -jar atlassian-bamboo-agent-installer-5.14.1.jar http://<AgentServer>/agentServer/
The process will stop and ask you to approve the remote agent. Log in to the Bamboo portal, navigate to Agents, click Agent Authentication under Remote Agents and approve the agent. The process will resume.
3 – After the above completes, navigate to the folder D:\bamboo-1\Conf.
4 – Open the file wrapper.conf.
5 – Edit the file with the information below:
         wrapper.console.title=Bamboo Remote Agent 1
         wrapper.ntservice.displayname=Bamboo Remote Agent 1
6 – Navigate to d:\bamboo-1\bin. Run the .bat files in the order shown below:
7 – A service named “Bamboo Remote Agent 1” will be installed and started. Use the bamboo user to log in to the service.


4. Remote Agent 1 – <ServerName>.<Env>.<Domain>.lan

This remote agent will appear on the online remote agents tab under Remote Agents.

5. Create Remote Agent 2 – <ServerName>.<Env>.<Domain>.lan (2)

Follow the steps below to install Remote Agent 2.
1 – Open a CMD prompt and CD into the folder where the .jar file exists.
2 – Run the command below:
java -Dbamboo.home=d:\bamboo-2 -jar atlassian-bamboo-agent-installer-5.14.1.jar http://<AgentServer>/agentServer/
The process will stop and ask you to approve the remote agent. Log in to the Bamboo portal, navigate to Agents, click Agent Authentication under Remote Agents and approve the agent. The process will resume.
3 – After the above completes, navigate to the folder D:\bamboo-2\Conf.
4 – Open the file wrapper.conf.
5 – Edit the file with the information below:
         wrapper.console.title=Bamboo Remote Agent 2
         wrapper.ntservice.displayname=Bamboo Remote Agent 2
6 – Navigate to d:\bamboo-2\bin. Run the .bat files in the order shown below:
7 – A service named “Bamboo Remote Agent 2” will be installed and started. Use the bamboo user to log in to the service.


6. Create Remote Agent 3 – <ServerName>.<Env>.<Domain>.lan (3)

Follow the steps below to install Remote Agent 3.
1 – Open a CMD prompt and CD into the folder where the .jar file exists.
2 – Run the command below:
java -Dbamboo.home=d:\bamboo-3 -jar atlassian-bamboo-agent-installer-5.14.1.jar http://<AgentServer>/agentServer/
The process will stop and ask you to approve the remote agent. Log in to the Bamboo portal, navigate to Agents, click Agent Authentication under Remote Agents and approve the agent. The process will resume.
3 – After the above completes, navigate to the folder D:\bamboo-3\Conf.
4 – Open the file wrapper.conf.
5 – Edit the file with the information below:
         wrapper.console.title=Bamboo Remote Agent 3
         wrapper.ntservice.displayname=Bamboo Remote Agent 3
6 – Navigate to d:\bamboo-3\bin. Run the .bat files in the order shown below:
7 – A service named “Bamboo Remote Agent 3” will be installed and started. Use the bamboo user to log in to the service.


Three Remote Agents available.


Once the remote agents are created, you need to create a PowerShell script using New-PSSession and a remote connection, something like:

$LocalDir = "\\${bamboo.biztalk.server}\C$\Users\${bamboo.remote_username}\Documents"
$session = New-PSSession -ComputerName $biztalk_server -ConfigurationName Microsoft.PowerShell32
$LastExitCode = Invoke-Command -Session $session -File "${LocalDir}\US_Controller_BizTalk_Database.ps1" -ArgumentList "undeploy","$list","$biztalk_sql_instance","$log_dir"

Some people might disagree with this approach, but if we can create multiple local agents on the same server then why not remote agents?

Many Thanks.




Saving time via Logic Apps: a real world example

Saving time via Logic Apps: a real world example


At Codit, I manage the blog. We have some very passionate people on board who like to invest their time to get to the bottom of things and – also very important – share it with the world!
That small part of my job means I get to review blog posts before publishing on a technical level. It’s always good to have one extra pair of eyes reading the post before publishing it to the public, so this definitely pays off!

An even smaller part of publishing blog posts is making sure they get enough coverage. Sharing them on Twitter, LinkedIn or even Facebook is part of the job for our devoted marketing department! And analytics around these shares on social media definitely come in handy! For that specific reason we use Bitly to shorten our URLs.
Every time a blog post gets published, someone needed to add it manually to our Bitly account and send out an e-mail. This takes only a small amount of time, but as you can imagine it accumulates quickly with the number of posts we have been generating lately!

Logic Apps to the rescue!

I was looking for an excuse to start playing with Logic Apps and they recently added Bitly as one of their Preview connectors, so I started digging!

First, let’s try and list the requirements of our Logic App to-be:


  • The Logic App should trigger automatically whenever a new blog post is published.
  • It should create a short link, specifically for usage on Twitter.
  • It also should create a short link, specifically for LinkedIn usage.
  • It should send out an e-mail with the short links.
  • I want the short URLs to appear in the Bitly dashboard, so we can track click-through-rate (CTR).
  • I want to spend a minimum of Azure consumption.


  • I want the Logic App to trigger immediately after publishing the blog post.
  • I want the e-mail to be sent out to me, the marketing department and the author of the post for (possibly) immediate usage on social media.
  • If I resubmit a logic app, I don’t want new URLs (idempotency), I want to keep the ones already in the Bitly dashboard.
  • I want the e-mail to appear as if it was coming directly from me.

Logic App Trigger

I could easily fill in one of the first requirements, since the Logic App RSS connector provides me a very easy way to trigger a logic app based on a RSS feed. Our Codit blog RSS feed seemed to do the trick perfectly!

Now it’s all about timing the polling interval: if we poll every minute we get the e-mail faster, but will spend more on Azure consumption since the Logic App gets triggered more… I decided 30 minutes would probably be good enough.
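The trade-off is easy to quantify. A rough sketch (actual Logic Apps billing also depends on the number of action executions, so this only counts trigger firings):

```python
def triggers_per_day(interval_minutes: int) -> int:
    """Number of times a polling trigger fires per day at a given interval."""
    return (24 * 60) // interval_minutes

# Polling every minute fires 30x more often than every 30 minutes,
# and each firing counts towards Azure consumption.
print(triggers_per_day(1))   # 1440
print(triggers_per_day(30))  # 48
```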

Now I needed to try and get the URL for any new posts that were published. Luckily, the “links – Item” output provides me the perfect way of doing that. The Logic Apps designer conveniently detects this might be an array of links (in case two posts get published at once) and places this within a “For each” shape!

Now that I had the URL(s), all I needed to do was save the Logic App and wait until a blog post was published to test the Logic App. In the Logic App “Runs history” I was able to click through and see for myself that I got the links array nicely:

Seems there is only one item in the array for each blog post, which is perfect for our use-case!

Shortening the URL

For this part of the exercise I needed several things:

  • I actually need two URLs: one for Twitter and one for LinkedIn, so I need to call the Bitly connector twice!
  • Each link gets a little extra information in the query string called UTM codes. If you are unfamiliar with those, read up on UTM codes here. (In short: it adds extra visibility and tracking in Google Analytics).
    So I needed to concatenate the original URL with some static UTM string + one part which needed to be dynamic: the UTM campaign.

For that last part (the campaign): we already have our CMS cleaning up the title of a blog post in the last part of the URL being published! This seems ideal for us here.

However, due to lack of knowledge in Logic Apps-syntax I got a bit frustrated and – at first – created an Azure Function to do just that (extract the interesting part from the URL):

I wasn’t pleased with this, but at least I was able to get things running…
It however meant I needed extra, unwanted, Azure resources:

  • Extra Azure storage account (to store the function in)
  • Azure App Service Plan to host the function in
  • An Azure function to do the trivial task of some string manipulation.

After some additional (but determined) trial and error late in the evening, I ended up doing the same in a Logic App Compose shape! Happy days!

Inputs: @split(item(), '/')[add(length(split(item(), '/')), -2)]

It takes the URL, splits it into an array, based on the slash (‘/’) and takes the part which is interesting for my use-case. See for yourself:
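The same split-and-index logic can be reproduced in plain Python as a sanity check (the URL below is a made-up example; the real one comes from the RSS item):

```python
# Mirrors the Logic Apps expression:
#   @split(item(), '/')[add(length(split(item(), '/')), -2)]
url = "https://www.codit.eu/blog/2017/06/my-blog-post-title/"
parts = url.split("/")
# The URL ends with a slash, so the slug is the second-to-last element.
slug = parts[len(parts) - 2]
print(slug)  # my-blog-post-title
```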

Now I still needed to concatenate all pieces of string together. The concat() function seems to be able to do the trick, but an even easier solution is to just use another Compose shape:

Concatenation comes naturally to the Compose shape!
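In Python terms, the concatenation amounts to appending a static UTM query string plus the dynamic campaign slug (the UTM parameter values here are illustrative, not our exact ones):

```python
base_url = "https://www.codit.eu/blog/2017/06/my-blog-post-title/"
campaign = "my-blog-post-title"  # the slug extracted by the Compose shape

# Static UTM parts + one dynamic part: the campaign.
long_url = (base_url
            + "?utm_source=twitter"
            + "&utm_medium=social"
            + "&utm_campaign=" + campaign)
print(long_url)
```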

Then I still needed to create the short links by calling the Bitly connector:

Let’s send out an e-mail

Sending out e-mail, using my Office365 account is actually the easiest thing ever:


My first practical Logic App seems to be a hit, and it probably saves us about half an hour of work every week. A few hours of Logic App “R&D” will definitely pay off in the long run!

Here’s the overview of my complete Logic App:

Some remarks

During development, I came across what appear to me to be some limitations:

  • The author of the blog post is not in the output of the RSS connector, which is a pity! This would have allowed me to use his/her e-mail address directly or, if it was his/her name, to look-up the e-mail address using the Office 365 users connector!
  • I’m missing some kind of expression shape in Logic Apps!
    Coming from BizTalk Server where expression shapes containing a limited form of C# code are very handy in a BizTalk orchestration, this is something that should be included one way or the other (without the Azure function implementation).
    A few lines of code in there is awesome for dirty work like string manipulation for example.
  • It took me a while to get my head around Logic Apps syntax.
    It’s not really explained in the documentation when or when not to use @function() or @{function()}. It’s not that hard at all once you get the hang of it. Unfortunately it took me a lot of save errors and even some run-time errors (not covered at design time) to get to that point. Might be just me however…
  • I cannot rename API connections in my Azure Resource Group. Some generic names like ‘rss’, ‘bitly’ and ‘office-365’ are used. I can set some connection properties so they appear nicely in the Logic App however.
  • We have Office365 Multi-Factor Authentication enabled at our company. I can authorize the Office365 API connection, but this will only last for 30 days. I might need to change to an account without multi-factor authentication if I don’t want to re-authorize every 30 days…

Let me know what you think in the comments! Is this the way to go?
Any alternative versions I could use? Any feedback is more than welcome.

In a next blog post I will take some of our Logic Apps best practices to heart and optimize the Logic App.

Have a nice day!

Automating BizTalk Administration tasks using BizTalk360 : Data Monitoring Actions

Automating BizTalk Administration tasks using BizTalk360 : Data Monitoring Actions


On a day-to-day basis, a BizTalk administrator must perform a few monotonous activities such as terminating instances, enabling receive locations, ensuring the status of SQL jobs, etc. BizTalk360 has a few powerful features which help you to automate such monotonous tasks. These features are hidden gems and are overlooked by many BizTalk360 users, despite the availability of good documentation. That prompted me to start my new blog series called “Automating BizTalk administration tasks using BizTalk360”. In this blog series, I will be explaining these automation capabilities which BizTalk360 brings to its users.

To start off with in this first blog I am focusing on “Data Monitoring Actions”.

What is Data Monitoring in BizTalk360?

As we are aware, BizTalk collects a diverse set of data into the Message Box database, the Tracking database, the BAM Primary Import database and the ESB databases. BizTalk360 brings all these data into a single console and, on top of that, provides a powerful capability to set alerts based on various thresholds. This feature is called data monitoring. Below is a screenshot that shows all the different data sets which can be used in the data monitoring feature.

biztalk administration
The table below briefly explains the various types of data items which can be monitored.

Data monitoring categories

Process Monitoring

With process monitoring you will be able to monitor the number of messages being processed by receive ports and send ports. This is also popularly called “non-event monitoring”.

Ex: if you want to be alerted when fewer than 50 messages are received in an hourly window during business hours, then process monitoring is the best fit.

Refer to the assist article Process Monitoring for more information.
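The hourly-window example above boils down to simple threshold logic. A minimal sketch (the function and parameter names are illustrative, not BizTalk360’s actual API):

```python
def process_monitoring_alert(messages_received: int,
                             expected_minimum: int = 50,
                             in_business_hours: bool = True) -> bool:
    """Non-event monitoring: alert when FEWER messages than expected
    arrived in the monitoring window (e.g. one hour)."""
    return in_business_hours and messages_received < expected_minimum

print(process_monitoring_alert(30))   # True  -> raise the alert
print(process_monitoring_alert(80))   # False -> all good
print(process_monitoring_alert(30, in_business_hours=False))  # False
```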

Message Box Data Monitoring

With this you will be able to set alerts on the number of suspended, running and dehydrated messaging instances.

Refer to the assist article Message Box Data Monitoring for more information.

Tracking Data Monitoring

With this you can set alerts on tracked messaging events and tracked service instances.

Refer to the assist article Tracking Data Monitoring for more information.

BAM Data Monitoring

With this you can set alerts on the data stored in BAM tables.

Refer to the assist article BAM Data Monitoring for more information.

EDI Data Monitoring

With this you can set alerts on the EDI and AS2 reporting data stored in BAM tables.

Refer to the assist article EDI Data Monitoring for more information.

ESB Data Monitoring

With this you can set alerts on the ESB data and exceptions stored in BAM and ESB tables.

Refer to the assist article ESB Data Monitoring for more information.

Logic Apps Metrics Monitoring

With this you can set alerts on metrics emitted by Logic Apps.

Refer to the assist article Logic Apps Metrics Monitoring for more information.

Message Box Data Monitoring Actions

In Message Box Data Monitoring, the user can configure queries to monitor service instances and messages. The monitoring service will send a notification to the users whenever the service instance/message count violates the threshold condition.

A Message Box Data schedule can be configured in Data Monitoring > Message Box Data. It can be scheduled at different frequencies (Daily, Weekly and Monthly), based on the volume and priority, to take action on service instances/messages.

Query Condition

BizTalk360 provides a highly advanced query builder for precisely selecting the expected suspended instances. While querying the Suspended/All In-Progress service instances, you can apply filters like Error Code, Application, Service Class, Service Name, etc.

biztalk administration

Context Properties

BizTalk360 provides a message-context based query for more business-friendly scenarios. Context/promoted properties from the message payload can be selected to identify the transactional message. In the data monitoring schedule, the user can choose which context/promoted properties should appear in the email alert.

biztalk administration

Action on Service Instances

Operational users must closely watch suspended service instances to act on them, which is a tedious process to keep up all the time. The Message Box data monitoring feature will take automatic action on service instances when the actions are configured in your schedule. The monitoring service will terminate/resume the service instances based on either the error or warning condition, without any manual intervention.

biztalk administration

Archiving & Downloading the Message Instances

Message content & context are required for auditing or other reconciliation purposes. If you have not enabled the tracking option, it is not possible to get hold of the data again. Keeping this in mind, we archive the message content and context before the configured action is taken on the instances. In BizTalk360 Settings > System Settings, the archive and download locations for message instances must be configured in order to archive and download them. Automatic actions are performed with the desired backup steps to make sure all the data is preserved before taking any action.

Note: In order to take action on suspended service instances, the monitoring service account has to be created as a superuser in BizTalk360.

biztalk administration

In the Data Monitoring dashboard, the status of every monitoring cycle is shown. When the user clicks on the status tab, it brings up details about the query result, task action and exception summary.

biztalk administration

In the Task Action tab, you can download each instance separately, or by using the “Click here” button you can download all the instances to the server location. Service instance messages are downloaded to the server location as a Zip file with the activity ID of the monitoring run cycle.

biztalk administration


Data Monitoring is an auto-monitoring feature of BizTalk360 which can take corrective actions, with full backup steps, in the event of any threshold violations. With just a one-time setup, BizTalk360 makes sure all your routine tasks are addressed without manual intervention. BizTalk360 also offers many more monitoring features which enable administrators to proactively monitor their BizTalk environment(s). The next article will look at auto correction of BizTalk artifacts and Logic Apps.

Author: Senthil Palanisamy

Senthil Palanisamy is a Technical Lead at BizTalk360 with 12 years of experience in Microsoft technologies. He has worked on various products across domains like Health Care, Energy and Retail.

We’re just days away from INTEGRATE 2017!

We’re just days away from INTEGRATE 2017!

It’s time for you to pack your bags and prepare for your trip to London for INTEGRATE 2017 — the biggest Integration-focused conference of the year. We are almost there (just a week to go before the event)! We decided to write this blog with some last-minute information to make it easy for you to attend. If you still haven’t booked your tickets, we have the last 10 tickets up for grabs on a first come, first served basis. Don’t miss the chance to be at INTEGRATE 2017!

Attendee Count

We are expecting close to 380 attendees this year for INTEGRATE 2017. It’s quite amazing to see the response year after year for this event and the trust the folks in the Microsoft Integration community place in BizTalk360 to consistently and successfully organize it. We will be able to present the exact stats on the first day of the event.

Event Venue

Kings Place Events
90 York Way, London, N1 9AG.

The venue is located in the heart of London, just a five-minute walk from Kings Cross and St. Pancras International stations. If you are travelling from:

  • London Heathrow Airport – Kings Place is approximately 50 mins by train
  • London Gatwick Airport – Kings Place is approximately an hour by train and underground
  • London City Airport – Approximately 45 minutes by underground and DLR

There are high-speed services from Kent; the majority of trains from the North arrive at either Kings Cross or Euston (only a 10-minute walk away), and most underground lines stop at Kings Cross. St Pancras is also the home of Eurostar.

Quick Link: Tube Map to reach Kings Place

Event Registration

The registration desk will be open from 0730 hrs on Day 1. To ease the registration process, there will be 4 booths, categorized alphabetically by first name, for you to register at on the 1st day. You will be provided with your conference ID badge. Please remember to wear your badge at all times.

The easiest way to make your way through the event venue is to follow the signage or simply reach out to one of our volunteers for any assistance.

Day 1 – It’s all Microsoft, Microsoft, and Microsoft sessions….

You simply cannot miss Day 1 of INTEGRATE 2017! We have lined up 9 sessions from the Microsoft Product Group team, starting off with the keynote speech by Jim Harrer on what’s happened in the Hybrid Integration Platform over the past year and how AI is changing the way Microsoft thinks about enterprise application integration. The subject matter then slowly shifts to BizTalk, Enterprise Messaging, and finally into the vast ocean of Azure-related topics like Event Hubs, Logic Apps, Azure Functions, Microsoft Flow, and API Management. This is probably the best day to get your questions answered by the Microsoft Product Group or the community team present at the event. As Saravana Kumar, founder/CTO of BizTalk360, says,

If you cannot find an answer to your question in this room (INTEGRATE event), you probably will not be able to find an answer elsewhere.

Evening Drinks with Networking

We have arranged for evening networking after the end of Day 1 over some drinks. Enjoy your drink after an informative Day 1 at INTEGRATE 2017 and get a chance to meet fellow integration MVPs, the Product Group and people from the Microsoft Integration space.

The first half of Day 2 (till 11:45 AM) is also covered with sessions from the Microsoft Product Group, after which the remaining 1.5 days belong to the Integration MVPs.

Quick Link: INTEGRATE 2017 Agenda

Meet our Sponsors

INTEGRATE 2017 would not be the same without our sponsors and we would like to extend our thanks to our Titanium Sponsor Microsoft, Platinum Sponsor Codit, Gold Sponsors – Bouvet, Reply Solidsoft, Active Adapter, and our Silver sponsors – QuickLearn Training, Middleway, Affinus. You can walk through the sponsor booths on the mezzanine floor during coffee/lunch breaks and engage in a conversation.

BizTalk360 & ServiceBus360 Booths – Meet the team!

That’s not all! The core team from BizTalk360 & ServiceBus360 – the think tank team, development folks, QA people, customer support team, and client relationship group (who keep our customers happy!) – are all available over the 3 days of the event. Come over to the BizTalk360 and ServiceBus360 booths at the event venue to meet the team that works behind the scenes on these products.

Informal Entertainment on Day 1 Evening

We have some informal entertainment planned for Day 1 evening during the drinks/networking session.

Social Media – Post, Follow, Like, Comment, Share about the event

Don’t let it be just a one-sided affair at INTEGRATE! Come and join us on social media and spread the word about the event to the world. Show us how you are enjoying INTEGRATE by sharing photographs from the event venue.

Official Event Hashtag – #Integrate2017

If you are not attending the event, don’t worry! Simply follow us on –


Packing your stuff for travel

We care about our attendees who are travelling into London for INTEGRATE. We have people travelling all the way from New Zealand flying approximately over 30 hours, and folks from the US crossing the pond.

Temperatures are slightly on the warmer side during this time, but can become overcast with spells of rain. So make sure you pack the right set of clothes. The average daytime temperatures are around 23°C/73.4F.

The dress code for INTEGRATE 2017 is standard Business Casuals.

Wishing you a Safe Travel! See you at INTEGRATE 2017

On behalf of the INTEGRATE 2017 Event Management Team, I would like to wish you safe travels, whether you are travelling by plane, train, bus, or any other mode. We look forward to seeing you at INTEGRATE 2017 on June 26th at Kings Place. For more details about INTEGRATE 2017, you can visit the event website.

See you in the next few days! 🙂

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

Microsoft Integration Weekly Update: June 19

Microsoft Integration Weekly Update: June 19

Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

Integration weekly update can be your solution. It’s a weekly update on topics related to Integration: enterprise integration, robust & scalable messaging capabilities, and citizen integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

IoT, Stream Analytics and other Big Data Stuff via The Azure Podcast


Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.

Run BizTalk extension objects in Logic Apps

Run BizTalk extension objects in Logic Apps

Extension objects are used to consume external .NET libraries from within XSLT maps. This is often required to perform database lookups or complex functions during a transformation. Read more about extension objects in this excellent blog.



We are facing two big challenges:

  1. We must execute the existing XSLTs with extension objects in Logic App maps
  2. On-premises Oracle and SQL databases must be accessed from within these maps


It’s clear that we should extend Logic Apps with non-standard functionality. This can be done by leveraging Azure Functions or Azure API Apps. Both allow custom coding, integrate seamlessly with Logic Apps and offer the following hybrid network options (when using App Service Plans):

  • Hybrid Connections: most applicable for lightweight integrations and development / demo purposes
  • VNET Integration: if you want to access a number of on-premises resources through your Site-to-Site VPN
  • App Service Environment: if you want to access a large number of on-premises resources via ExpressRoute

As the pricing model is practically identical (we must use an App Service Plan in both cases), we chose Azure API Apps. The main reason was the existing WebAPI knowledge within the organization.


A Site-to-Site VPN is used to connect to the on-premises SQL and Oracle databases. By using a standard App Service Plan, we can enable VNET integration on the custom Transform API App. Behind the scenes, this creates a Point-to-Site VPN between the API App and the VNET, as described here. The Transform API App can be consumed easily from the Logic App, while being secured with Active Directory authentication.



The following steps were needed to build the solution. More details can be found in the referenced documentation.

  1. Create a VNET in Azure. (link)
  2. Setup a Site-to-Site VPN between the VNET and your on-premises network. (link)
  3. Develop an API App that executes XSLTs with corresponding extension objects. (link)
  4. Foresee Swagger documentation for the API App. (link)
  5. Deploy the API App. Expose the Swagger metadata and configure CORS policy. (link)
  6. Configure VNET Integration to add the API App to the VNET. (link)
  7. Add Active Directory authentication to the API App. (link)
  8. Consume the API App from within Logic Apps.

Transform API

The source code of the Transform API can be found here. It leverages Azure Blob Storage to retrieve the required files. The Transform API must be configured with the required app settings that define the blob storage connection string and the containers where the artefacts will be uploaded.

The Transform API offers one Transform operation, that requires 3 parameters:

  • InputXml: the byte[] that needs to be transformed
  • MapName: the blob name of the XSLT map to be executed
  • ExtensionObjectName: the blob name of the extension object to be used


You can run this sample to test the Transform API with custom extension objects.
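A call to the Transform operation could be sketched as follows. The blob names and payload shape are assumptions based on the parameter list above (a byte[] parameter is typically serialized as a Base64 string in JSON):

```python
import base64
import json

input_xml = b"<ns0:Order xmlns:ns0='http://example.org/order'><Id>1</Id></ns0:Order>"

payload = {
    # byte[] parameters are commonly serialized as Base64 strings in JSON.
    "InputXml": base64.b64encode(input_xml).decode("ascii"),
    "MapName": "Order_to_Invoice.xslt",            # blob name of the XSLT map
    "ExtensionObjectName": "ExtensionObjects.xml", # blob name of the extension object
}
body = json.dumps(payload)

# Round-trip check: the API would decode the Base64 back to the original XML.
assert base64.b64decode(payload["InputXml"]) == input_xml
```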

Input XML

This is a sample input that can be provided as input for the Transform action.

Transformation XSLT

This XSLT must be uploaded to the right blob storage container and will be executed during the Transform action.

Extension Object XML

This extension object must be uploaded to the right blob storage container and will be used to load the required assemblies.
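For reference, an extension object file generally follows this shape; the namespace, assembly and class values below are placeholders matching the sample assembly described further on:

```xml
<ExtensionObjects>
  <ExtensionObject Namespace="http://schemas.example.org/extensions"
                   AssemblyName="TVH.Sample, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"
                   ClassName="TVH.Sample.Common" />
</ExtensionObjects>
```

The Namespace value must match the extension namespace declared in the XSLT.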

External Assembly

Create an assembly named TVH.Sample.dll that contains the class Common.cs. This class contains a simple method to generate a GUID. Upload this assembly to the right blob storage container, so it can be loaded at runtime.

Output XML

Deploy the Transform API, using the instructions on GitHub. You can easily test it using the Request / Response actions:

As a response, you should get the following output XML, that contains the generated GUID.

Important remark: Do not forget to add security to your Transform API (Step 7), as it is accessible on the public internet by default!


Thanks to the Logic Apps extensibility through API Apps and their VNET integration capabilities, we were able to build this solution in a very short time span. The solution offers an easy way to migrate BizTalk maps as-is towards Logic Apps, which is a big time saver! Access to resources that remain on premises is also a big plus nowadays, as many organizations have a hybrid application landscape.

Hope to see this functionality out-of-the-box in the future, as part of the Integration Account!

Thanks for reading. Sharing is caring!

Microsoft Integration Weekly Update: June 12

Microsoft Integration Weekly Update: June 12

Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

Integration weekly update can be your solution. It’s a weekly update on topics related to Integration: enterprise integration, robust & scalable messaging capabilities, and citizen integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:


Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.