Migrating BizTalk to Azure iPaaS – Part 1

By Bill Chesnut

This is the first of a multipart series of blog posts to help companies understand migrating BizTalk to Azure Integration Platform as a Service (iPaaS).

Should I be migrating from BizTalk to Azure iPaaS? This is a question I am being asked more and more lately, for a couple of different reasons.

BizTalk development has always been a very specialized skill set with a limited number of resources available in the market, so companies are now starting to look at Azure iPaaS as a viable alternative to BizTalk in terms of finding developers with the correct skill sets.

There are three other reasons many companies are looking at moving from BizTalk to Azure iPaaS:

  • Cost: consumption pricing instead of product licensing
  • Location of data: many companies are dealing with their consumers over the internet, so having their integration resources in Azure makes more sense
  • Infrastructure: companies are seeing the advantages of not having to maintain their own infrastructure, using Platform as a Service (PaaS) and Software as a Service (SaaS) offerings as a cost-effective alternative

Unlike the BizTalk middleware products that covered most features required for all integration scenarios, Azure iPaaS is a collection of Azure services that may or may not be utilised for each scenario.

The base components that make up Azure iPaaS are Service Bus, Logic Apps, API Management and Event Grid. For some integration scenarios you may also use services like (but not limited to): SQL Database, Storage, Function Apps, Web Apps (hosting your Web API or WebJobs), Cosmos DB and Virtual Machines. In this series of blog posts, I will look at the different scenarios and the Azure services required.

Instead of spending time here to explain each of the Azure iPaaS services, I have included links to Azure documentation on each product/service mentioned:

All Azure Product/Services can be found here: https://docs.microsoft.com/en-us/azure/#pivot=products

Before you jump full steam into migrating from your on-premises (or cloud-hosted) BizTalk to Azure iPaaS, there are a few things that you need to look at to see if the migration makes sense:

  • Location of your data
  • Sensitivity of your data
  • Security Policies for accessing cloud-based resources
  • Location of consumers/partners

Let’s look at each of these items in detail:

Location of your data: if most of the data currently used by BizTalk is located inside of your data center, it would be hard to justify moving all of that data to and from Azure to use Azure iPaaS.

If your data is from your consumers/partners and it is coming in and going out via the internet, iPaaS can be a great solution in terms of cost, scalability and availability.

Sensitivity of your data: again, this relates back to the first item. If the data is coming in and going out over the internet, your company has already accepted the risk, so iPaaS will be a viable solution; otherwise, a risk assessment will need to be undertaken.

Security policies for accessing cloud-based resources can sometimes be one of the hardest nuts to crack; it requires educating your management on the benefits and security safeguards of Azure. It also means designing your iPaaS solution to use some of the additional security features available in Azure. The final point to look at is the location of consumers/partners: if everyone using the integration services of BizTalk is located inside your data center, having them traverse to Azure and back for data does not make sense, but if many of your consumers/partners are outside of your offices or are mobile users, Azure makes perfect sense for availability and scale.

Once a company has decided that moving to Azure iPaaS makes sense, the next item to consider is the structure of the existing BizTalk integration solution. Do the existing BizTalk solutions follow best practice:

  • having incoming data mapped to an internal/canonical format
  • processed in BizTalk using the internal/canonical format
  • then mapped back to the outbound format before leaving BizTalk

If so, this will make the migration to Azure iPaaS a less disruptive and staged process.
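
As a simple illustration of the canonical pattern (the element names below are invented for the example), an inbound partner order and its internal equivalent might look like:

    <!-- Inbound partner format (hypothetical) -->
    <PartnerOrder Id="1234" Sku="BH-01" Qty="5" />

    <!-- Internal/canonical format used inside BizTalk (hypothetical) -->
    <ns0:Order xmlns:ns0="http://yourcompany/schemas/canonical">
      <OrderId>1234</OrderId>
      <Items>
        <Item Sku="BH-01" Quantity="5" />
      </Items>
    </ns0:Order>

Because only the maps at the edges know about the partner formats, each interface can be moved to Azure iPaaS one at a time while the canonical core remains unchanged.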

This series will continue with the following blog posts:

  • Migrating inbound messaging to Azure using Logic Apps, Event Grid, Service Bus and the BizTalk Claim Check Pipeline component that I have built.
  • Migrating outbound messaging to Azure, using the same tooling as inbound
  • Choices for message archiving in Azure
  • End to End Monitoring of Azure iPaaS
  • EDI (EDIFACT, X12 & AS2) Capabilities of Azure iPaaS
  • Azure iPaaS CI/CD

Thanks for taking the time to view this blog post.

Cross Posted on: http://www.sixpivot.com.au

Integration Down Under Webcast Launched

Martin Abbott, Dan Toomey, Wagner Silveira, Rene Brauwers and I have decided we need an Australian/New Zealand time zone integration-focused webcast similar to Integration Mondays from the UK.

Our first webcast will be February 8th at 8pm ADST. To register: https://register.gotowebinar.com/register/4469184202097473281

Thanks to SixPivot for providing the GoToWebinar

We will spend some time introducing the leaders of the group, and then each person will present a short presentation.

Bill Chesnut – API Management REST to SOAP

Martin Abbott – In this session, Martin will walk through the new visual tooling available in Azure Data Factory v2. He’ll look at what you can do, set up source control, install an Integration Runtime for on premises fun, and do a simple data copy to give a flavour of how quick and easy it is to get going.

Wagner Silveira – A Lap around Azure Functions Proxy: Azure Functions Proxy is a simple API toolkit embedded in Azure Functions, enabling quick composition of APIs from various sources. In this lightning talk, Wagner Silveira will show the main features and how to quickly compose an API from various sources.

Dan Toomey – Microsoft recently released the public preview of Azure Event Grid – a hyper-scalable serverless platform for routing events with intelligent filtering. No more polling for events – Event Grid is a reactive programming platform for pushing events out to interested subscribers. Support for massive scalability and minimal latency makes this an ideal solution for a number of scenarios, including monitoring, governance, IoT, or general integration. This talk will demonstrate how easy it is to configure the capture of events with Azure Event Grid.

Rene Brauwers – A Reactive Integration Primer

Melbourne Global Integration Bootcamp–Live Webinar

For those that cannot make it to a Global Integration Bootcamp event live, we will be doing a live webinar for the Melbourne, Australia event (recordings will also be available after the event)

Thanks to SixPivot for providing the facility to present the Live Webinars

You need to register for each session you would like to join using the link below.

Welcome & Session 1 : API Management & API Apps

https://attendee.gotowebinar.com/register/6751857195964514305

Join Bill Chesnut, Cloud Platform & API Evangelist for SixPivot, for the opening of the 2017 Global Integration Bootcamp in Melbourne, Australia and his session on API Management + API Apps

Session 2 : Service Bus, Enterprise Integration Pack & On-Premises Gateway

https://attendee.gotowebinar.com/register/7355755161427574787

Join Nathan Fernandez, Director for Interprit, for Service Bus + Enterprise Integration Pack + On-Premises Gateway

Session 3 : Logic Apps, Flow & BizTalk 2016 (Hybrid Integration)

https://attendee.gotowebinar.com/register/7705137075781361667

Join Tanvir Chowdhury, Principal Consultant for Mexia, for Logic Apps + Flow + BizTalk 2016 (Hybrid Integration)

Session 4 : Logic Apps (cloud adapters), Azure Functions & Azure Storage

https://attendee.gotowebinar.com/register/8961793104422407170

Join Paco de la Cruz, Senior Consultant for Mexia, for Logic Apps (cloud adapters) + Azure Functions + Azure Storage

Session 5 : IoT Hub, Stream Analytics, DocumentDB & PowerBI (IoT)

https://attendee.gotowebinar.com/register/9197003930371197954

Join Sonja Bijou, Solution Designer for Interprit, for IoT Hub + Stream Analytics + DocumentDB + PowerBI (IoT)

Creating BizTalk Server 2016 Developer from Azure Gallery Image

Published by: Bill Chesnut

The process for creating a BizTalk 2016 developer machine has changed from the previous versions: the Azure Gallery image does not have SQL Server, Visual Studio or BizTalk installed, so I will walk you through those steps here.

Start with either the Classic or ARM Azure Gallery image (ARM image shown below)

Log on to your newly created machine

You might want to turn off the IE Enhanced Security Configuration, or you will be prompted for every website you visit

The image does not contain Visual Studio or SQL Server, so you will need to download those:

You might need to sign up for a free Visual Studio Dev Essentials subscription at http://my.visualstudio.com

SQL Server 2016 Developer:

Visual Studio 2015 (Community is free; use Professional or Enterprise if you have an MSDN subscription)

SQL Server Management Tools – http://go.microsoft.com/fwlink/?LinkID=822301

You should now have these files in your download directory

Installing SQL 2016 Developer Edition

We are now going to install SQL 2016 Developer Edition on the image

Note: for this installation I will be using the Administrator account (bizadmin) for all services; you can use the accounts dictated by your company's security policies

Installing SQL Management Tools

We are now going to install SQL 2016 Management Tools on the image

Double-click on the setup application
Click Install
Installation complete, click Close

Installing Visual Studio 2015 Community Edition

We are now going to install Visual Studio 2015 Community Edition on the image

Note: if you have an MSDN subscription, feel free to install Professional or Enterprise

Installing BizTalk Server 2016 Developer Edition

We are now going to install BizTalk Server 2016 Developer Edition on the image

Configuring BizTalk Server 2016 Developer Edition

We are now going to configure BizTalk Server 2016 Developer Edition on the image

Enter the “User name” and “Password” that you want to use for the BizTalk Server services; I am using the administrator account, but feel free to create and use another account

Click Configure

Note: Ignore the warning about the administrator account; this is a developer machine, not production

Click Next
Configuration Complete

Click Finish

Installing LOB Adapters

We are now going to Install the LOB Adapters & SDK on the image

Double-click on Setup
Click “Install Microsoft BizTalk Adapters”
Click Step 1… to install SDK

Follow Setup Wizard

Note: I typically do a Complete install of everything in the Adapter Pack on a developer machine

Click Step 2… to install Adapter Pack

Follow Setup Wizard

Note: I typically do a Complete install of everything in the Adapter Pack on a developer machine

Click Step 3… to install Adapter Pack (x64)

Follow Setup Wizard

Note: I typically do a Complete install of everything in the Adapter Pack on a developer machine

(Optional)
Click Step 4… to install Adapter for Enterprise Applications

Follow Setup Wizard

Note: I typically do a Complete install of everything in the Adapter Pack on a developer machine

Open the BizTalk Server Administration Console

Expand Tree, Right Click Adapters, Click New, Click Adapter

Type the name WCF-SQL, and select WCF-SQL from the Adapter drop-down list

Click OK

Note: this is the only adapter I typically add, but feel free to add the rest of the adapters if your development project is going to need them.

Installing & Configuring ESB Toolkit (Optional)

We are now going to Install and Configure ESB Toolkit on the image

Final Configuration for Developer Image

We are now going to do the final configuration on the image

Open Microsoft SQL Server Management Studio from the menu
Connect to your database server

Click Connect

Note: a full stop (period) as the server name means the local machine

Click “New Query”
For a developer image of BizTalk it is not necessary to keep transaction logs for the BizTalk databases, so the command below will set all databases into simple recovery mode so that no transaction logs are created and no database backups are required

Execute the Query

select 'alter database [' + name + '] set recovery simple' from master.sys.databases where database_id > 4 and state_desc = 'online'
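
On a typical BizTalk developer machine the generated script will look something like this (the exact list of databases will vary with your configuration):

alter database [BizTalkDTADb] set recovery simple
alter database [BizTalkMgmtDb] set recovery simple
alter database [BizTalkMsgBoxDb] set recovery simple
alter database [BizTalkRuleEngineDb] set recovery simple
alter database [SSODB] set recovery simple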

Copy the output of the previous command into a new query window
Click Execute

Note: all the databases are now in simple recovery mode, so there is no need to configure the BizTalk Backup job

To keep the tracking database from becoming large on the developer image, we need to set up the DTA Purge job

Right click on the “DTA Purge and Archive (BizTalkDTADb)”

Click Properties

Change the name to “DTA Purge (BizTalkDTADb)”

Click Steps

Edit the Step
Change the step name to “Purge”

Update the Command to the text below

Click Ok, Click Ok

declare @dtLastBackup datetime
set @dtLastBackup = GetUTCDate()
exec dtasp_PurgeTrackingDatabase 1, 10, 11, @dtLastBackup
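
For reference, my understanding of the arguments being passed to the purge stored procedure:

-- dtasp_PurgeTrackingDatabase @nLiveHours, @nLiveDays, @nHardDeleteDays, @dtLastBackup
-- 1, 10: completed instances older than 1 hour plus 10 days are purged
-- 11: anything older than 11 days is hard-deleted, completed or not
-- passing GetUTCDate() as the last backup time tells the job to treat the
-- database as freshly backed up, which is fine on a developer machine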

Right click on the “DTA Purge (BizTalkDTADb)”

Click Enable

Note: you can then do the same and select “Start Job at Step” to make sure you don’t have any syntax errors.

Conclusion

This finishes the basic configuration, with a few extras. If you are using BizUnit, BTDF, SpecFlow or any other tools, you will need to install them now.

I would also recommend running Windows Update to get the latest updates installed

I hope these instructions and screen shots help you get your BizTalk Server 2016 Developer Azure Gallery image up and running smoothly

How application infrastructure is evolving to support digital business

After a few customer engagements recently this topic has come up several times, so lean in closer and let's have a chat about it.

How is application infrastructure evolving to support digital business?

The great thing in this modern era is that businesses are placing great pressure on traditional IT, integrators and solution architects to innovate and look for that next disruptive edge.

With a pea-sized idea and access to the rapidly evolving cloud technologies, businesses can disrupt even the most cemented industries. In fact many of my customer meetings are around disruption: “What/how can we disrupt today?”. The business climate has never been better, hence the birth of the phrase ‘digital business’.

Many companies that ‘push tin’ are scrambling to offer a range of other services, as they also see the future is not in tin, and not so much in technical expertise (while I do love being part of a great team, cloud templates can make short work of our previous ‘cluster’ expert), but in being agile! Taking an idea, whether it's IoT or anything else, and realising the solution. In my opinion, this is the skill that will be most sought after in the market.

Integration is key here.

Figure 1 – How to integrate

What about the application landscape? How has it changed?

Great question!

Figure 2 – Depicting the role of middleware

From what I've experienced over the past 20-odd years, application integration ‘layers’ (or middleware) were large, monolithic and usually cost a fair bit of $$. Associated with a software platform purchase was a good 9–18 month evaluation of the platform to see if it was ‘fit for purpose’.

(There was a whole series of VANs in this space as well – Value Added Networks – which made it simple to move data/messages through a particular industry vertical. This was partly because it was made complex by proprietary interfaces, and also the fact that the VAN providers wanted customer lock-in. So naturally everything to do with these environments was ‘hard’ and we let the ‘experts’ deal with it.)

In those times many of the systems and applications had custom ways to communicate. Software vendors reluctantly opened up ways of getting data in/out of their systems. Communications standards were lacking, as were message formats and protocols.

The fact that the middleware platforms communicated with a large number of systems – from DB2, SAP, Oracle EBS and JD Edwards through to Pronto (ERPs) – made them attractive choices for businesses that saw the value in getting ‘end-to-end integration’ and a full 360-degree customer view.

Fast forward to the present day…

Software vendors are exploding at a rate of knots all over the web. Applications are more about functionality than specifically where they run – on-premises, cloud, phone, or wherever. Applications today are modular and connected by well-known interfaces – although the resiliency and interface workflows are lagging behind a little in this realm.

The software industry realises that mobile devices are king and builds applications accordingly – for example, the pseudo-modern accepted message/data exchange of JSON/HTTP (aka REST) based APIs. I had one of my team members complain when an interface they were talking to was SOAP. I felt like I was talking about tape backups 🙂

Given that Mobile is taking charge and software is providing better foundations for ease of mobile development and operation, you could call it the mobile evolution….

REST-based APIs are accepted as the norm, with OpenID/OAuth being token-based authentication standards, allowing the beauty of delegated authentication (something that plagued many previous integration implementations in the quest for the elusive single sign-on capability).

Figure 3 – Illustrates the business need to maximise and provide a comprehensive set of APIs to monetise.

Businesses today are realising they don't need mobile, they don't need a website… they need an API! An API:

  • to expose their business to public consumers
  • to expose their business to downstream consumers
  • to expose their business to upstream consumers
  • to commercialise their offerings.

They now realise they can get away from building ‘yet another mobile app’ and focus squarely on turning data into information.

Software today needs to produce analytics, because gone are the days when we got excited at just being ‘connected’.

Integrate 2016 Talk In Text: API Apps 101 for BizTalk Server Developers

In this post, I am going to try to capture in text the presentation that I gave at the Integrate 2016 conference over in London. Text is likely the worst medium in which to capture such a session, but, alas, I do realize that sometimes it is the best medium for proper digestion of such content. If you’d rather see it in video form, click here.

So with that, let’s pretend that you are sitting among fellow professionals in a beautiful room on the 3rd floor of ExCeL London – complete with bright colored lights to set the mood. A wild American then appears, flailing his arms and babbling about how it’s actually 3 AM, and we’ve all been deceived. Then he starts talking about food.

Getting Our Priorities Straight

The world of software development might be a better place if we approached our tasks in that world the way that we approach each meal. We don’t really start each meal by a trip to the racks or shelves that hold our appliances – thinking, “Well, I have a vegetable peeler, and a fondue pot, I guess that means we’re eating some melted Gruyère and Emmentaler mixed with white wine, and carrot strings for every meal.”

Usually, the way it actually works out is that I’m thinking about what I’m craving, the ingredients that I have on hand and their flavor/nutritional value relative to my needs. From there, I look to proven recipes that satisfy those things, and finally, reach for the specific tools needed to do the job. If I don’t have them, I acquire them, or fashion a workable approximation.

We have to be really careful that, when approaching software development and integration, we take the same approach as we would when crafting an excellent meal: an approach that looks first to the needs and constraints, then to proven patterns/recipes, and allows the tools used to flow from the rest – even to the point of crafting/buying new tools that we haven't used before if necessary.

Business Challenges / Constraints

So, let’s imagine that we all work together now. We want to take the approach outlined above – one in which we have to consider the business need and the constraints that we may very well simply be stuck with. From there, we can consider proven patterns that might help us overcome, and then finally identify/acquire/create the tools required to get the job done.

Our company makes custom bobbleheads.

The way that it works is that a customer uploads a 3D model of their face, and then selects a pre-built body from the gallery. The 3D print of their face starts immediately so that the order can be shipped as soon as possible. The customer is permitted to take as long as they need to select the body from the gallery of pre-built bodies. Once they select a body, we attach the printed face to the chosen body and ship the assembled dolls to the customer.

So what happens behind the scenes that we can’t escape?

Well, sadly we don’t do greenfield development here. It’s not really brownfield development either. It’s more like a house haunted by ghost IT – shadow IT that has left.

Whenever a customer uploads their 3D face model, an XML notification message is created that contains the order id and a reference to their face model. At the time it was built, our developers emulated the BizTalk Server demos of the day and built distributed, fault-tolerant XML file copy operations whilst applying the wisdom of Chris Maden who has been quoted saying, “XML is like violence. If it doesn’t solve your problem, you’re not using enough of it.”

These same developers learned while attending conferences over the past few years that Dropbox is the next big thing in enterprise integration. They may have been wrong, but it’s now up to us to Make Dropbox Great Again™.

Once the customer selects a body for their doll, another XML message is created and dropped into the same folder in Dropbox with the same file naming conventions. Ultimately, we need to pair up the two components of the order – the head and body – in order to complete the processing.

Proven Patterns

It’s at this point that we would consult the great oracles of all wisdom and knowledge in the world of integration, Gregor Hohpe and Bobby Woolf. We will search through the patterns to find pieces that help solve each piece of the puzzle.

Which patterns might we find? Well, to handle the communication with Dropbox, we might utilize the adapter pattern in hopes that one day the data will be sourced from a different system. We could apply the pipes and filters pattern and build a reusable translation/transformation pipeline made up of reusable independent steps organized in the proper order to provide the required translation/transformation/message enrichment for each interface.

From there, we could apply the publish and subscribe pattern to enable loosely coupled communication between the source system and any number of downstream subscribers – maybe routing the message in a content-based fashion. We could also layer on top of this a process manager to enable content-based correlation.

Tools

How would we use these patterns in concert with the tools we have/don’t have? BizTalk Server might seem like a natural fit.

It already provides for us the concept of a port that begins with an adapter, which delivers a byte stream to a pipes-and-filters style pipeline responsible for translation of the message and promotion of context properties used for routing; this is followed by a transform before publishing to the message box. From there, we have process managers in the form of BizTalk Orchestrations that understand the concept of content-based correlation of published messages, allowing them to be reunited by the messaging engine. You get adapters, pipelines, maps, orchestration out of the box – and publish-subscribe whether you want it or not!

It’s already bringing to the table everything I need. Everything but a handy Dropbox adapter. Now, I know that we could always build our own, or use out of the box adapters with ungodly amounts of WCF extensions to make some magic happen, but maybe that won’t be our best bet here.

So, let’s set that aside for a moment, and consider what might become possible if we started using Logic Apps like this? It’s really the same question I posed before about MABS.

In this case, we’re marrying Logic Apps and Service Bus. We have some Logic Apps that act in a similar fashion as BizTalk Ports. They provide adaptation and message enrichment and transformation through the use of relevant API apps for those concerns. Others act more like BizTalk Server orchestrations, coordinating the sends and receives of messages and operating on the content.

The messages are routed to the “orchestration” style Logic Apps through Service Bus. Each flow is triggered by subscribing to messages that arrive on a given Service Bus topic subscription (pre-created). Correlation can then be enabled by subscriptions dynamically created mid-flow.

At this point, you may have the following thought (which I humbly share indeed):

Demo Walkthrough

This isn’t all just a pipe dream – it’s real. I’ve built it. So, let’s see how it can fit together. The flow kicks off with an XML message. For this message, I have created a BizTalk Server 2016 schema (i.e., a regular XML schema with special notations about properties that should be promoted to the message context for routing purposes). The message looks like this:
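
Since text is our medium here, a minimal sketch of the shape of that message (the element names and namespace are my own placeholders, not the exact schema from the demo):

    <ns0:PrintJob xmlns:ns0="http://bobbleheads.example/schemas/printjob">
      <!-- OrderId is promoted to the message context for routing/correlation -->
      <OrderId>1234</OrderId>
      <!-- reference to the customer's uploaded 3D face model -->
      <HeadModelRef>https://storage.example/faces/1234-head.stl</HeadModelRef>
    </ns0:PrintJob>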

The message contains a promoted OrderId property that we should be able to correlate on. In other words, the second message that will show up in Dropbox for the order should also contain the same OrderId value – which allows us to determine that they are indeed related messages. The first message also contains a reference to the head for the bobblehead doll that we will be printing.

When this message is uploaded to Dropbox, it will be picked up by our “Port” style Logic App that looks like this:

The first API App after the Dropbox receive is a custom API app that essentially builds a context property bag when it is passed an XML payload. It does this by comparing the document to a BizTalk schema, and using the instructions in the schema to “promote” properties by extracting the relevant content. It takes two inputs to operate.

The first input is a URL to the root of an Azure Blob Storage container that contains BizTalk schemas. It will use these schemas to perform message type resolution and property promotion. The second input is a string containing the entirety of an XML message. Not exactly the screaming performance of a forward-only streaming pipeline component, but it gets the job done, considering we’re already taking on latency to get to the cloud in the first place.
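
In essence, the API app is doing something like the following rough C# sketch (the names are invented; the real implementation first resolves the message type and reads the promotion annotations from the schemas in blob storage):

    using System.Collections.Generic;
    using System.Xml.Linq;
    using System.Xml.XPath;

    public class PromotedProperty { public string Name; public string XPath; }

    public static class PropertyPromoter
    {
        public static IDictionary<string, string> Promote(
            XDocument message, IEnumerable<PromotedProperty> schemaProperties)
        {
            var propertyBag = new Dictionary<string, string>();
            foreach (var property in schemaProperties)
            {
                // the XPath for each entry comes from the schema's promotion annotations
                var element = message.XPathSelectElement(property.XPath);
                if (element != null) propertyBag[property.Name] = element.Value;
            }
            return propertyBag;
        }
    }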

The output of that API app looks like this:

The next API app takes the payload, along with the property bag (which it treats as a set of brokered message properties) and publishes the message to an Azure Service Bus Topic. This is just the out of the box connector using the outputs of our custom API app. The call out to that app looks like this:

This published message is picked up by our Logic App that is acting like an Orchestration. That Logic App has a pre-defined subscription on the same service bus topic for any message with a Message Type of Print Job.

After the message is received, the Logic App must quickly setup a subscription for any related messages that come in for that order. Unfortunately, the out of the box connector for Service Bus doesn’t yet have a way to create a new subscription – only ways to subscribe for messages on an existing subscription.

Thus, we will have to use a custom API app to create a subscription unique to this running instance of the Logic App – one that is based on the OrderId property of the received message. To provide this capability, we have a custom API app called CreateInstanceSubscription.

It requires quite a few inputs to function since we don’t yet have the capability of reading details from a stored API connection in a custom API app.

The API takes in a Correlation Property property, which contains the name of the property that is shared in common between the message that triggered the Logic App instance and the message that will be correlated with this running instance.

It also takes in the Message Type (in the namespace + # + root node name format) of the next expected message. Both of these properties will be used to create a new subscription on the service bus topic referenced by the last two configuration properties (Service Bus Topic, and Service Bus Connection String).
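
For the curious, creating such an instance subscription with the Service Bus SDK of the day might look roughly like this – a sketch under assumed names, not the actual CreateInstanceSubscription code:

    using System;
    using Microsoft.ServiceBus;
    using Microsoft.ServiceBus.Messaging;

    public static class InstanceSubscriptions
    {
        public static string Create(
            string connectionString, string topic,
            string correlationProperty, string correlationValue, string messageType)
        {
            var ns = NamespaceManager.CreateFromConnectionString(connectionString);
            // a unique name ties the subscription to this running Logic App instance
            var subscriptionName = Guid.NewGuid().ToString("N");
            var filter = new SqlFilter(string.Format(
                "{0} = '{1}' AND MessageType = '{2}'",
                correlationProperty, correlationValue, messageType));
            ns.CreateSubscription(new SubscriptionDescription(topic, subscriptionName), filter);
            return subscriptionName;
        }
    }

The matching cleanup later in the flow would then be a call along the lines of ns.DeleteSubscription(topic, subscriptionName).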

After it executes, we might expect to see a subscription like the following:

 

Now that we have the subscription created, we can take our time with the rest of the process until we absolutely require the second message. In this case, we're calling another custom API which provides a visualization of the received messages. In order to read the content, we can either use the xpath() function of Logic Apps to read the XML directly, or we can convert it to JSON first using the json() function, and then simply dot into it. I decided to use the json() function since I hadn't attempted to use it in a situation like this yet. It was okay (it was pretty darn verbose). xpath() would have been a better choice here – and the more natural choice given an XML payload.
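
For illustration, the two approaches look roughly like this in Logic Apps expression syntax (the action and element names are placeholders):

    xpath(xml(body('Receive_Message')), 'string(//*[local-name()="OrderId"])')
    json(xml(body('Receive_Message')))?['Order']?['OrderId']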

This yields the following visualization (a body-less bobblehead awaiting its correlated message containing information about which body to use):

And we would expect that there is both an instance subscription in service bus and a Logic App that is still actively processing – waiting for Service Bus to re-activate it with a new message.

At this point, the new message can arrive at any moment in time. It will land in the same Dropbox folder, and process through the same Logic App serving as a “Port” – with the same XML property promotion, and Service Bus publishing action. It will land in the same Service Bus topic as before, and with a matching order id to the originally submitted message.

This second message carries some new information, however. In this case, it contains the body that the customer selected for their custom bobblehead doll.

Once published to the topic, the instance subscription previously created by the second Logic App in the process will be matched by a listening Service Bus connector.

 

The connector uses the topic name and instance subscription name passed to it from the Create Instance Subscription API App. The name of the subscription will be a randomly generated id for that running instance of the Logic App.

Now that we have the message, it’s time to ensure that we don’t bring the problem of zombies into the world of Logic Apps. There is a step that follows which will clean up the instance subscription for the Logic App before continuing with the final bits of the process.

Again, since the OOTB Service Bus connector does not contain any operations for managing subscriptions, the custom API App is called to delete the instance subscription using the details returned from the original call to the API which created it.

After that, it’s back off to the bobblehead visualizer with the details from the correlated message received.

Call to Action

So that’s pretty cool. We can now stand with confidence and proclaim that content-based correlation is possible with Logic Apps! However, it was built out of necessity, and required custom crafted components – as is often the case with anything worth doing.

You may be wondering why this talk was titled API Apps 101 for BizTalk Developers. I didn’t really tell you how to create API apps. Instead, I showed that API apps behave in a fashion similar to different components within BizTalk Server (adapters, pipeline components, orchestration shapes, etc…). I don’t want to leave you hanging though, because we are at a point in time where there is a golden opportunity to make your mark in the foundations of this new world.

This is the ground floor of Logic Apps and API Apps. As BizTalk Server developers, we know the required ingredients of enterprise integrations. We know the recipes for success. It’s just a matter of crafting some additional tools for use in the world of Logic Apps, and for the first time we have a unified marketplace to share and even sell these components.

From working on BizTalk Server integrations, we know that we will need custom API apps that can serve as adapters, pipeline components, and pattern utility apps (e.g., content-based correlators). In fact, you may have built such things before. It’s honestly not that difficult to port those things over into this new world of integration (where it makes sense) and reap the rewards. If you need inspiration, check out the listings of such components that have already been created for BizTalk Server. Each component represents a solution to a specific integration challenge – many of which are timeless challenges.

We write BizTalk components and API Apps in the same languages, though with different techniques, and targeting a different runtime.

How do we make that all happen? Well, today we are providing the world with “the goods”. All of the slides from this talk, a sample module from the February 2016 version of our Cloud-based Integration Using Azure App Service course, and all of the code involved in the demo. With those combined resources, you should be set on the right track to start building custom API Apps for use in Logic Apps – leveraging skills and work you’ve already accomplished.

If you’re ready to get started, click the image below to download the resources:

Until Next Time

That’s all for now! Again, go forth and create API apps and come visit us in our Cloud-based Integration Using Azure App Service course if you’d like to learn more.

I’ll leave Simon Young with the final word – and dining tip!

Exposing a service bus topic using Azure API Management

Introduction

Microsoft released a new service to Azure called API Management. This service was released on May 12th, 2014.

Currently Azure API Management is still in preview, but it already enables us to easily create an API façade over a diverse set of currently available Azure services like Cloud Services, Mobile Services and Service Bus, as well as on-premises web services.

Instead of listing all the features and writing an extensive introduction about Azure API Management, I'd rather supply you with a few links which contain more information about the Azure API Management service:

Microsoft:

http://azure.microsoft.com/en-us/services/api-management/

http://azure.microsoft.com/en-us/documentation/services/api-management/

Blogs:

http://trinityordestiny.blogspot.in/2014/05/azure-api-management.html

http://blog.codit.eu/post/2014/05/14/Microsoft-Azure-API-Management-Getting-started.aspx

Some more background

As most of my day-to-day work revolves around the Microsoft integration space, in which I am mainly focusing on BizTalk Server, BizTalk Services and Azure in general, I was looking to find a scenario in which I, as an integration person, could and would use Azure API Management.

The first thing which popped into my mind: wouldn't it be great to virtualize my existing BizTalk Services bridges using Azure API Management? Well, currently this is not possible, as the only authentication and authorization method on a BizTalk Services bridge is ACS (Access Control Service), and this is not supported in the Azure API Management service.

Luckily the last feature release of BizTalk Services included support for Azure Service Bus topics/queues as a source 🙂 and luckily for me Azure Service Bus supports SAS (Shared Access Signatures); using such a signature I am able to generate a token and use this token in the Authorization HTTP header of my request.

Knowing the above, I should be able to define APIs which virtualize my Service Bus endpoints, create a product combining the defined APIs, and assign policies to my API operations.

Sounds easy, doesn't it? Well, it actually is. So without further ado, let's dive into a scenario which involves exposing an Azure Service Bus topic using Azure API Management.

Getting started

Before we actually start, please note that the sample data (topic names, user accounts, topic subscriptions, topic subscription rules etc.) I use for this ‘step by step’ guide is meant for a future blog post 😉 extending an article I posted a while back on the TechNet Wiki

Other points you should keep in mind are:

  • Messages sent to a topic may not exceed 256 KB in size; if a message is larger you will receive an error from Service Bus telling you that the message is too large.
  • Communication with Service Bus is asynchronous: we send a message and all we get back is an HTTP status code telling us the status of the submission (200 OK, 202 Accepted etc.) or an error indicating that something went wrong (401 Access Denied, etc.). So our scenario is effectively using the fire-and-forget principle (see the sketch after this list).
  • You will have to create a SAS token, which needs to be sent in the header of the message in order to authenticate to Service Bus.
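
As a concrete sketch of that fire-and-forget interaction (the namespace, topic and token below are placeholders), sending a message to the topic's REST endpoint boils down to:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    public static class TopicSender
    {
        public static async Task SendAsync(string sasToken)
        {
            using (var client = new HttpClient())
            {
                // SAS token generated for a policy with Send rights (see "Generate a SAS Token")
                client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", sasToken);
                var body = new StringContent("<Order><OrderId>1234</OrderId></Order>",
                    Encoding.UTF8, "application/xml");
                // POST to https://{namespace}.servicebus.windows.net/{topic}/messages
                var response = await client.PostAsync(
                    "https://yournamespace.servicebus.windows.net/managementapi_requests/messages", body);
                // all we get back is a status code: 2xx on success, 401 on a bad token
                Console.WriteLine(response.StatusCode);
            }
        }
    }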

Enough said, let's get started.

For the benefit of the reader I’ve added hyperlinks below such that you can skip those sections involved which you might already know.

Sections Involved

Create a new Azure Account

If you don’t already have an Azure account, you can sign up for a free trial here

Provision an Azure Service Bus entity

Once you’ve signed up for an Azure account, login to the Azure Portal and create a new Azure Service Bus Topic by following the steps listed below.

  1. If you have logged in to the preview portal, click on the ‘tile’ Azure Portal. This will redirect you to an alternative portal which allows for more complete management of your Azure services.

  2. In the ‘traditional’ portal click on Service Bus

  3. Create a new Namespace for your Service bus Entity

  4. Enter a name for your Service Bus namespace, select the region to which it should be deployed, and select the checkmark which starts the provisioning.

  5. Once the provisioning has finished, select the Service Bus Entity and click on Connection Information

  6. A window will appear with access connection information; in this screen copy the ACS Connection String to your clipboard (we will need this connection string later on) and then click on the checkmark to close the window

Create a new Azure Service Bus TOPIC

Now that a new Service Bus entity has been provisioned, we can proceed with creating a topic within the created Service Bus entity; for this we will use the Service Bus Explorer from Paolo Salvatori, which you can download here. Once you've downloaded this tool, extract it, execute the ServiceBusExplorer.exe file and follow the steps below.

  1. Press CTRL+N which will open the “Connect to Service Bus Namespace” window
  2. In the Service Bus Namespaces box, select the following option from the dropdown: “Enter Connection String”
  3. Copy and paste the ACS Connection String (you copied earlier, see previous section step 6) and once done press “OK”

  4. The Service Bus Explorer should now have made a connection to your Service Bus Entity

  5. In the menu on your left, select the TOPIC node, right click and select “Create Topic”

  6. In the window which now appears enter the topic name “managementapi_requests” in the “Path” box and leave all other fields blank (we will use the defaults). Once done press the “Create” button

  7. Your new Topic should now have been created

Add Subscriptions and filters to your newly created TOPIC

Now that we have created a topic it is time to add some subscriptions. The individual subscriptions we will create will contain a filter such that messages which are eventually posted to this topic end up in a subscription based on values set in the HTTP header of the submitted messages. In order to set up some subscriptions follow the steps below:

  1. Go back to your newly created TOPIC in the Service Bus Explorer
  2. Right click on the TOPIC and select “Create Subscription”

  3. The Create Subscription windows will now show in which you should execute the following steps
    • A) Subscription Name: BusinessActivityMonitoring
    • B) Filter: MessageAction='BusinessActivityMonitoring'
    • C) Click on the Create button
  4. Now repeat Steps 2 and 3 in order to create the following subscription
    • Subscription Name: Archive
    • Filter: 1 = 1

Assign Shared Access Policies to your TOPIC

At this point we have set up our topic and added some subscriptions. The next step consists of adding a shared access policy to our topic. This policy then allows us to generate a SAS token which later on will be used to authenticate against our newly created topic. So first things first: let's assign a shared access policy. The next steps will guide you through this.

  1. Go to the Service Bus menu item and select the Service Bus Service you created earlier by clicking on it.
  2. Now select TOPICS from the tab menu

  3. Select the Connection Information Icon on the bottom

  4. A new window will pop up; in this window click on the link “Click here to configure”

  5. Now under the shared access policies:
    • A) Create a new policy named ‘API_Send’
    • B) Assign a Send permission to this policy
    • C) Create a new policy named ‘API_Receive’
    • D) Assign the Listen permission to this policy
    • E) Create a new policy named ‘API_Manage’
    • F) Assign the Manage permission to this policy
    • G) Click on the SAVE icon on the bottom
  6. At this point for each policy a Primary and Secondary key should be generated.

Generate a SAS Token

Once we've added the policies to our topic we can generate a token. In order to generate a token, I've built a small forms application which uses part of the code originally published by Santosh Chandwani. Click the following link to start downloading the application: “Sas Token Generator“. Using the SAS Token Generator application we will now generate the token signatures.

  1. Fill out the following form data
    • A) Resource Uri = HTTPs Endpoint to your TOPIC
    • B) Policy Name = API_Send
    • C) Key = The Primary Key as previously generated
    • D) Expiry Date = Select the date you want the Sas token to expire
    • E) Click on generate
  2. After you have clicked GENERATE, by default a file will be created on your desktop containing all generated SAS tokens; the file is named SAS_tokens.txt. Once saved you will be asked if you want to copy the generated token to your clipboard. Below are 2 images depicting the message prompt as well as the contents stored in the generated file. Perform step 2 for the other 2 policies as well (API_Receive and API_Manage)
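
If you would rather generate the token yourself, the underlying algorithm is the standard Service Bus SAS scheme: an HMAC-SHA256 signature over the URL-encoded resource URI plus an expiry timestamp. A minimal sketch (not the exact code from the generator application):

    using System;
    using System.Net;
    using System.Security.Cryptography;
    using System.Text;

    public static class SasTokenGenerator
    {
        public static string CreateToken(string resourceUri, string policyName, string key, TimeSpan ttl)
        {
            // expiry is expressed in seconds since the Unix epoch
            var expiry = Convert.ToString(
                (int)DateTime.UtcNow.Add(ttl).Subtract(new DateTime(1970, 1, 1)).TotalSeconds);
            var stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;
            using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
            {
                var signature = Convert.ToBase64String(
                    hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
                return string.Format("SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                    WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry, policyName);
            }
        }
    }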

Create a new API Management Instance

At this point we have set up our Service Bus topic and subscriptions and have generated our SAS tokens, so we are all set to start exposing the newly created Service Bus topic using Azure API Management. But before we can start we need to create a new API Management instance. The steps below detail how to do this.

  1. Click on API Management, in the right menu-bar, and click on the link “Create an API Management Service”

  2. A menu will pop up in which you need to select CREATE once more

  3. At this point a new window will appear:
    • A) Url: Fill out a unique name
    • B) Pricing Tier: Select Developer
    • C) Subscription: Check your subscription Id (Currently it does not show the subscription name, which I expect to be fixed pretty soon)
    • D) Region: Select a region close to you
    • E) Click on the right arrow
  4. You now will end up at step 2:
    • A) Organization Name: Fill out your organization name
    • B) Administration E-Mail: Enter your email address
    • C) Click on the ‘Checkmark’ icon
  5. In about 15 minutes your API Management service will have been created and you will be able to log in.

Now that we have provisioned our Azure API Management service it is time to create and configure an API which exposes the previously defined Azure Service Bus topic such that we can send messages to it. The API which we are about to create will expose one operation pointing to the Azure Service Bus topic and will accept both XML and JSON messages. Later on we will define a policy which will ensure that if a JSON message is received it is converted to XML, and that the actual calls to the Service Bus REST API are properly authenticated using the SAS token created earlier.

So let’s get started with the creation of our API by Clicking the Manage Icon which should be visible in the menu bar at the bottom of your window.

Once you’ve clicked the Manage icon, you should automatically be redirected to the API Management portal.

Create and configure a new API

Now that you are in the Azure API Management Administration Portal you can start with creating and configuring a new API, which will virtualize your Service Bus topic REST endpoints. In order to do so, follow these steps.

  1. Click on the API’s menu item on your left

  2. Click on ADD API

  3. A new window will appear, fill out the following details
    • A) Web API title
      • Public name of the API as it would appear on the developer and admin portals.
    • B) Web Service Uri
      • This should point to your Azure Service Bus REST endpoint. Click on this link to get more information. The format would be: http{s}://{serviceNamespace}.servicebus.windows.net/{topic path}/messages
    • C) Web API Uri suffix
      • Last part of the API’s public URL. This URL will be used by API consumers for sending requests to the web service.
    • D) Once done, press Save
  4. Once the API has been created you will end up at the API configuration page

  5. Now click on the Settings Tab
    • A) Enter a description
    • B) Ensure to set authentication to None (we will use the SAS token later on, to authenticate)
    • C) Press Save
  6. Now click on the Operations Tab, and click on ADD Operation.
    • Note: detailed information on how to configure an operation can be found here
  7. A form will appear which will allow you to configure and add an operation to service. By default the Signature menu item is selected, so we start with configuring the signature of our operation.
    • A) HTTP verb: Choose POST as we will POST messages to our Service Bus Topic
    • B) URL Template: We will not use a URL template, so as a default enter a forward slash ” / “
    • C) Display Name: Enter a name, which will be used to identify the operation
    • D) Description: Describe what your operation does
  8. Now click on the Body item in the menu bar on the left underneath REQUESTS (we will skip caching as we don’t want to cache anything), and fill out the following fields
    • A) Description: add a description detailing how the request body should be represented
  9. Now click on the ADD REPRESENTATION item just underneath the description part and enter application/xml
  10. Once you’ve added the representation type, you can add a representation example.
  11. Now once again click on the ADD REPRESENTATION item just underneath the description part and enter application/json
  12. Once you’ve added the representation type, you can add a representation example.
  13. Now click on the ADD item in the menu bar on the left underneath RESPONSES (we will skip Caching, Parameters and Body as we don’t need them)

  14. Start typing and select the response code you wish to return once the message has been sent to the service operation.

  15. Now you could add a description and add a REPRESENTATION; however, in our case we will skip this as a response code 202 Accepted is all we will return.
  16. Press Save

Create a new Product

Now that we have defined our API we need to make it part of a product. Within Azure API Management this concept has been introduced as a container holding one or more API definitions to which consumers (developers) can subscribe. In short, if your API is not part of a product definition, consumers cannot subscribe to it and use it. More information regarding the ‘Product’ definition can be found here.

In order to create a new product, we need to perform the following steps:

  1. Select the Products menu item from within the API Management Administration Portal and click on it.

  2. A new window will appear listing all products currently available, in this window click on the ADD PRODUCT item

  3. Fill out the following form items in order to create a product.
    • A) Title: Enter the name of the product
    • B) Description: Add a description of the product
    • C) Require subscription approval: Ensure to check this, as this will require any subscription requests to this product to be approved first
    • D) Press Save
  4. Now that we have created our product it is time to see if there are any other things we need to configure before we add policies to it and publish it. Check the settings by clicking on the newly created product

  5. On the summary tab, click on the link ADD API TO PRODUCT

  6. A new window will pop up, select the API you want to add to the product and once done click Save

Create a new Policy for your defined API operations

At this point we have created a product but have not yet published it. We will publish it in a bit, but first we need to set up some policies for the API and the operation we've created earlier. In order to do this follow these steps:

  1. From the menu bar on your left select the Policies item and in the main window in the policy scope section make the following selections
    • A) For the API select the API you created earlier
    • B) For the Operation select the operation you created earlier
  2. Now in the Policy definition section, click on ADD Policy
  3. At this point the empty Policy definition is visible

Setting up your Policy definition

For our API operation to function correctly we are going to have to add a few policies. These policies should take care of the following functionality:

  • Authenticate to our Service Bus Topic using our previously created SAS Token
  • Automatically convert potential JSON messages to their equivalent XML counterpart
  • Add some additional context information to the inbound message, which is converted to brokered message properties when passed on to Azure Service Bus.

General information on which policies are available to you within the Azure API Management Administration Portal and how to use them can be found here

The next few steps will show you how we can add policy statements which will ensure the above mentioned functionality is added.

  1. In the Policy definition section, ensure to place your cursor after the <inbound> tag

  2. From the Policy statements, select and click on the “Set HTTP Header” statement.

  3. A “Set Header” Tag will be added to the Policy Definition area, which will leverage the Authorization Header containing our SAS token we created earlier. The steps required are listed below:
    • A) Put in the value “Authorization” for the attribute “name”
    • B) Put in the following value “skip” for the attribute “exists-action”
    • C) Now get the SAS token you created earlier, wrap the token string in a CDATA element and put all of this in between the “value” tags

    Textual example:

    <set-header name="Authorization" exists-action="skip">
      <value><![CDATA[YOUR SAS TOKEN STRING]]></value>
    </set-header>

  4. Place your cursor just after the closing tag </set-header>
  5. From the Policy statements, select and click on the “Convert JSON to XML” statement.

  6. A “json-to-xml” tag will be added to the Policy Definition area, containing the instructions that cause JSON messages to be converted to XML. Ensure that the tag is configured as mentioned below:
    • A) Put in the value “content-type-json” for the attribute “apply”
    • B) Put in the value “false” for the attribute “consider-accept-header”

    Textual example:

    <json-to-xml apply="content-type-json" consider-accept-header="false" />

  7. Now add “set-header” policy statements adding the following headers:
    • A) Header name: MessageId
      • exists-action: “skip”
      • value: “00000000-0000-0000-0000-000000000000”
    • B) Header name: MessageAction
      • exists-action: “skip”
      • value: “Undefined”

    Textual example:

    <set-header name="MessageId" exists-action="skip">
      <value>00000000-0000-0000-0000-000000000000</value>
    </set-header>

    <set-header name="MessageAction" exists-action="skip">
      <value>Undefined</value>
    </set-header>

  8. Once you have added all the policy statements, press the Save button. The complete inbound policy, assembled from the snippets above, is shown below for reference.
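After saving, the inbound section of your policy definition should look roughly like the sketch below, assembled from the individual snippets above (the CDATA placeholder stands in for your actual SAS token):

<inbound>
  <set-header name="Authorization" exists-action="skip">
    <value><![CDATA[YOUR SAS TOKEN STRING]]></value>
  </set-header>
  <json-to-xml apply="content-type-json" consider-accept-header="false" />
  <set-header name="MessageId" exists-action="skip">
    <value>00000000-0000-0000-0000-000000000000</value>
  </set-header>
  <set-header name="MessageAction" exists-action="skip">
    <value>Undefined</value>
  </set-header>
</inbound>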

Create a Group and a User, and subscribe the User to a Product

Now that we have created a new product and assigned policies, we need to perform some group/user related actions. This way we can set up a dedicated group of users that is allowed to use our Product and its containing API.

The steps below will guide you through this process

  1. Now select the Visibility tab, and click on the MANAGE GROUPS link

  2. You will be redirected to the GROUPS page; on this page, click on the ADD GROUP link

  3. A new window will pop up; fill out the following fields:
    • A) Name: Unique name of the group
    • B) Description: General description of the group and its purpose
    • C) Click on Save
  4. After you’ve created the new Group, select the Developers menu item, and in the main window click on ADD USER

  5. Once again a new window will pop up. In this window fill out the following fields:
    • A) Email
    • B) Password
    • C) First and last name
    • D) Press Save
  6. Now that we have created a new user, we need to make it a member of our group. To do so, follow these steps:
    • A) Make sure to select the new user
    • B) Click on the ADD TO GROUP item and add the user to the earlier created Group
  7. Now go back to the PRODUCTS menu item and select the product you created earlier

  8. In the main window follow these steps
    • A) Click on the Visibility Tab
    • B) Allow the new group to subscribe to your product
    • C) Click Save
  9. Now click on the Summary tab and click on the PUBLISH link

  10. Now select the Developers menu item and click on the user you created earlier

  11. The main window will now change; in this window, click on ADD SUBSCRIPTION

  12. A window will pop up; in this window, put a checkmark in front of the product you want the user to subscribe to. Once done, press the Subscribe button

Test your API

At this point you have set up your API and can proceed with testing it. To do so, we will use the Azure API Management Developer Portal, logging on with the user account we set up previously.

The steps involved are listed below:

  1. First log out of the Azure API Management Administration Portal
  2. Now log in using the email and password of the user you defined earlier

  3. In the top menu, select APIS

  4. Click on the API you created earlier

  5. Click on the button “Open Console”

  6. A form will appear which allows you to send a message to the API. Follow the steps below to send a JSON-formatted message to the API (a code sketch reproducing this test from C# follows these test steps).
    • A) From the dropdown, select a subscription key (used to authenticate to the API)
    • B) Add the following HTTP headers:
      • Content-Type: application/json [indicates that the message you are about to send is formatted as JSON]
      • MessageAction: NonExisting [this will ensure that the message ends up in our Azure Service Bus subscription named Archive, as this subscription is our catch-all]
      • MessageId: 11111111-1111-1111-1111-111111111111
      • MessageBatchId: SingleMessageID
    • C) Add some sample JSON
    • D) Press HTTP POST
  7. Now open up the Service Bus Explorer and connect to your service bus instance
  8. Right-click on the Archive subscription and select the option “Receive all messages”
  9. One message should be received, which should meet the following test criteria:
    • The message should be formatted as XML, although the original message we sent was formatted as JSON
    • The following custom message properties should be available:
      • MessageAction: NonExisting
      • MessageId: 11111111-1111-1111-1111-111111111111
      • MessageBatchId: SingleMessageID

    MY TEST RESULT: PASSED

  10. Now we will perform another test, but this time we will send an XML-formatted message to the API.
    • A) From the dropdown, select a subscription key (used to authenticate to the API)
    • B) Add two HTTP headers:
      • Content-Type: application/xml [indicates that the message you are about to send is XML]
      • MessageAction: BusinessActivityMonitoring [this will ensure that the message ends up in our Azure Service Bus subscription named BusinessActivityMonitoring, and it will also end up in our Archive subscription (as it is our catch-all)]
    • C) Add some sample XML
    • D) Press HTTP POST
  11. Right-click on the BusinessActivityMonitoring subscription and select the option “Receive all messages”. One message should be received, which should meet the following test criteria:
    • The message should be formatted in XML
    • The following custom message properties should be available:
      • MessageAction: BusinessActivityMonitoring
      • MessageId: 00000000-0000-0000-0000-000000000000
      • MessageBatchId: SingleMessageID
  12. MY TEST RESULT: PASSED

  13. Now let’s receive all messages from our Archive subscription (it should contain a copy of the previous message). The reason for this is that the Archive subscription is our catch-all subscription, and thus all messages sent to the topic end up in this subscription as well.

    MY TEST RESULT: PASSED
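For completeness: the JSON test above can also be reproduced from code instead of the Developer Portal console. Below is a minimal sketch (not part of the original walkthrough) that assumes a placeholder endpoint URL and subscription key; the Ocp-Apim-Subscription-Key header carries the same subscription key the console’s dropdown selects for you.

using System;
using System.Net;
using System.Text;

class ApiSmokeTest
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Authenticate to API Management with your subscription key (placeholder value)
            client.Headers["Ocp-Apim-Subscription-Key"] = "YOUR-SUBSCRIPTION-KEY";
            client.Headers["Content-Type"] = "application/json";

            // These custom headers are turned into brokered message properties
            // by the set-header policies we configured earlier
            client.Headers["MessageAction"] = "NonExisting";
            client.Headers["MessageId"] = "11111111-1111-1111-1111-111111111111";
            client.Headers["MessageBatchId"] = "SingleMessageID";

            // Placeholder endpoint; use the URL shown for your API in the Developer Portal
            var endpoint = new Uri("https://your-namespace.azure-api.net/your-api/your-operation");

            byte[] response = client.UploadData(endpoint, "POST",
                Encoding.UTF8.GetBytes("{ \"sample\": \"payload\" }"));

            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}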

To wrap things up, a few tips and takeaways:

  • Ensure to document your API well; this makes life easier for the consumers
  • Using SAS tokens, you can fairly easily integrate with Azure Service Bus Queues / Topics
  • If using a policy to set custom headers (which you use for setting the Authorization header), ensure to enclose the SAS token within a <![CDATA[ …… ]]> tag
  • The Azure API Management key can be found on the developer page of the Azure API Developer Portal (e.g. https://{namespace}.portal.azure-api.net/developer)
  • Code examples on how to consume an API are available from the Azure API Developer Portal, by clicking on the menu item APIS and then clicking on the API you want to test
  • Logging into the Azure API Management Administration Portal must be done using the Azure Management Portal
  • Azure API Management, in my opinion, could easily be used to virtualize your on-premises internet-facing web services (e.g. BizTalk-generated web services). This way you have one central place to govern and manage them.

I hope this walkthrough contributed to a better understanding of how we as integrators can leverage the Azure API Management Service to expose Service Bus entities. Once you’ve grasped the concepts, you could easily take it a step further and, for example, involve Azure BizTalk Services to process messages from certain subscriptions, do some transformations, and deliver them to, say, another Azure Service Bus Topic; that topic endpoint could then be incorporated into a new API, allowing your API consumers to retrieve their processed messages.

Ah well, you get the idea; the possibilities are almost endless, as Azure delivers all these building blocks (services) which enable you to create some amazing stuff for your customers.

I hope to publish a new post in the coming weeks; I’ve already worked out a scenario on paper which involves Azure API Management, Azure Service Bus, Azure BizTalk Services, Azure File Services, and an Azure Website. However, implementing it and writing it down might take some time, and currently my spare time is a bit on the shallow side. Ah well, just stay tuned, and check my Twitter and this blog.

Until next time!

Cheers

René

Resolved issue: Sending message from Windows Azure BizTalk Services Bridge to another Bridge

Next week I will be doing a talk on Windows Azure BizTalk Services with regard to how one can add BAM functionality. During this talk, a demo will be the ‘pièce de résistance’. This demo is based on an article I have written earlier, which can be found on TechNet. Well, to cut to the chase: I would not be who I am if I had not taken this article to the next level, and while doing so I encountered a nice challenge.

In my demo there is a scenario in which I have a custom MessageInspector which can be configured such that it can deliver messages to either a Service Bus Queue/Topic/Relay or a BizTalk Services Bridge endpoint. While testing the various scenarios, I encountered a particular issue when I tried to send a message to another bridge. The error message which was reported back stated:

Component xpathExtractor. Pipeline Runtime Error Microsoft.ApplicationServer.Integration.Pipeline.PipelineException: An error occurred while parsing EntityName. Line 5, position 99.
at Microsoft.ApplicationServer.Integration.Pipeline.Activities.XmlActivitiesExceptionHelper.Throw(String message, XmlActivityErrorCode errorCode, String activityName, String stageName, IEnumerable`1 messageProperties)

The above-mentioned error was caused by the error message which was sent back, and it resulted in a failure which I could not easily debug any further without a complete rewrite. As I had no time for this, I decided to use a different approach to sending the messages. Up to that point I had used the WebClient class in combination with the UploadDataAsync method, but I decided to give the OpenWrite method a go. This decision proved to be very useful, as my existing exception handling was not able to ‘kick in’ without causing the parent Task to go into a faulted state. By using the OpenWrite method, I was able to extract the ‘real’ exception. This exception message stated:

“The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.”  “The remote certificate is invalid according to the validation procedure.”

At that point I reached the Eureka moment and it all started to make sense. It all boils down to the fact that for development/test purposes we all use a self-signed certificate (created for us when we provisioned our Windows Azure BizTalk Service). This certificate is installed on our client machines, thus allowing us to make requests to the Windows Azure BizTalk Service. This explained why my test console application did not throw any exceptions when calling the Windows Azure BizTalk Service Bridge in question. However, when this code is invoked from within the message inspector, which is hosted in Windows Azure, it runs into the fact that ‘there is a problem with the security certificate’; this makes sense, as it is a self-signed certificate.

So now that I had figured out the why, I needed a way to tell my code to ignore these kinds of warnings when encountered. Well, luckily for us, a little gem is available in the System.Net.ServicePointManager class: ServerCertificateValidationCallback. By default, validation fails when the server certificate does not comply with the policies (in our case it is not trusted, as it is self-signed), so all we need to do is supply a callback that always returns true, thus bypassing the security check.

Please do not do this in production environments! You might otherwise end up sending data to an untrusted source (e.g. a spoofed server).

Below is the piece of code which implements the SSL validation-check bypass:

using (var client = new WebClient())
{
    Uri endPointAddress = new Uri(this.Endpoint);

    // Add the WRAP access token (note the escaped quotes around the token value)
    client.Headers[HttpRequestHeader.Authorization] =
        String.Format("WRAP access_token=\"{0}\"", this.AcsToken);

    // Add the content type
    client.Headers["Content-Type"] = "application/xml";

    // Publish
    try
    {
        // The 'GEM': ignore the server certificate validation errors
        // that would otherwise cause the request to fail
        ServicePointManager.ServerCertificateValidationCallback +=
            (sender, certificate, chain, sslPolicyErrors) => true;

        using (var stream = client.OpenWrite(endPointAddress, "POST"))
        {
            byte[] data = System.Text.Encoding.UTF8.GetBytes(messagePayload);
            stream.Write(data, 0, data.Length);
        }
    }
    catch (System.Net.WebException wex)
    {
        AppendExceptionsToLog(wex);
    }
    catch (ArgumentNullException anex)
    {
        AppendExceptionsToLog(anex);
    }
    catch (Exception ex)
    {
        AppendExceptionsToLog(ex);
    }
}

Cheers, and I hope to see you soon! If you are going to attend the BizTalk Summit 2014 in London this March, don’t hesitate to say Hi 🙂