Instead of solving the problem, I try to remove the cause
Blog Post by: AxonOlympus
Give customers what they need, not what they want
Blog Post by: AxonOlympus
The Service Bus preview does not display Azure Service Bus Relays for me. Service Bus is in preview in the new Azure Portal, and I used it to create this service bus. I created an Azure Service Bus relay but could not find it anywhere in the new portal. Without this it is […]
Blog Post by: mbrimble
One of the great features of the BizTalk Deployment Framework is the ability to use a SettingsFileGenerator file to set your environment-specific settings in an Excel file, and use this in your other files, so you can have generic files like port bindings and BRE rules updated with the correct settings for the environment we're working on, like DEV, TEST or PROD. If you are like me, you will probably also have placed a lot of common settings which are used across all your applications in this file, like SSO user groups, host instance names, common endpoints, web service users, etc. This means we end up with a lot of duplicate settings across our environment settings files, which becomes cumbersome to maintain. Fortunately, there is a way to work around this.
The BTDF has a nice option which we can use to have a single SettingsFileGenerator file for all our applications. In this example we have two applications, with a couple of common settings, as well as some application-specific settings. The applications were already set up with BTDF, so we already have all the necessary placeholders in the PortBindingsMaster file. Let's start by creating a CommonSettingsFileGenerator file which has all these settings in one place. To do this, copy the SettingsFileGenerator from one of our projects to a general Tools directory, rename it, and update it with all the common and application-specific settings.
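For illustration, the shared spreadsheet might contain rows like these (the setting names and values below are made up, not taken from the original post):

```
Setting Name        | DEV                   | TEST                   | PROD
SsoAppUserGroup     | DEV BizTalk Users     | TEST BizTalk Users     | PROD BizTalk Users
BatchHostInstance   | BizTalkServerHostDev  | BizTalkServerHostTest  | BizTalkServerHost
CrmEndpointAddress  | http://dev-crm/svc    | http://test-crm/svc    | http://crm/svc
```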
For easy access to this common file, we can add it to our solution by going to Add -> Existing Item and selecting the file we just created. Also, if you still have the old SettingsFileGenerator file in your solution, make sure to delete it, so we don’t use that by accident when updating our settings at a later time.
Now open your btdfproj file, and in the first PropertyGroup add the next line at the bottom.
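The property line itself did not survive in this copy of the post. As a sketch of the idea only (the property name below is a placeholder assumption, so verify the exact name against the BTDF documentation for your version), you point BTDF at the shared spreadsheet instead of the application-local one:

```xml
<PropertyGroup>
  <!-- ...existing properties... -->
  <!-- Placeholder property name; check your BTDF version's documentation -->
  <SettingsFilePath>C:\Projects\Tools\CommonSettingsFileGenerator.xml</SettingsFilePath>
</PropertyGroup>
```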
Now when you use BTDF to deploy your solution locally, or you build an MSI for remote deployment, our CommonSettingsFileGenerator will be used for our settings.
Now this is really nice, as we only have a single place to maintain our settings, but if you have a lot of applications with a lot of application-specific settings, this does make our SettingsFileGenerator quite large. It would be even better if we could have one settings file for our common settings, and another settings file for our application-specific settings. Luckily, we also have this option, thanks to Giulio Vian. On my search for an easy way to merge two settings files, I came across this post about a SettingsMerger project, where Giulio mentions the tool has been merged into Environment Settings Exporter, the tool which the BTDF uses to do all its goodness with the environment settings. At this moment, Giulio's changes are not yet part of the Environment Settings Exporter which is shipped with BTDF, but we can download and build the code ourselves (or you can download the built exe from here) and use it from a custom target in our btdfproj file. The BTDF comes with a couple of custom targets out of the box, which give us the ability to run custom commands at certain stages of the deployment or build of the MSI.
For this example we are going to use another application which has already been set up with the BizTalk Deployment Framework, so it already has the PortBindingsMaster file configured, as well as a SettingsFileGenerator with the application's settings as well as the common settings. Make sure you have the new version of Environment Settings Exporter which we downloaded or built earlier in your Tools directory. Now copy the application's SettingsFileGenerator file to our Tools directory as well, and rename it to CommonSettingsFileGeneratorMerge. Now in the application's SettingsFileGenerator remove all the common settings, so it only has the settings intended for that application, and in CommonSettingsFileGeneratorMerge remove all the application-specific settings, so it now has the common settings which we want to use across multiple applications.
For easy access to the common file, we can also add it to our solution by going to Add -> Existing Item and selecting the file we just created, placing it alongside SettingsFileGenerator in our solution.
Now open your btdfproj file, and in the first PropertyGroup add the next line at the bottom (if you already added this line in the previous part, just update its value).
Next place the following code in the btdfproj file after <Import Project="$(DeploymentFrameworkTargetsPath)BizTalkDeploymentFramework.targets" />.
<!-- Merge the settings file in case of local deployment -->
<Target Name="CustomPreExportSettings" Condition="$(Configuration) != 'Server'">
  <Exec Command='C:\Projects\Tools\EnvironmentSettingsUtil.exe Merge /input:C:\Projects\Tools\CommonSettingsFileGeneratorMerge.xml /input:.\EnvironmentSettings\SettingsFileGenerator.xml /output:.\EnvironmentSettings\MergedSettingsFileGenerator.xml' />
</Target>

<!-- Merge the settings file in case of building MSI for remote deployment -->
<Target Name="CustomPreRedist">
  <Exec Command='C:\Projects\Tools\EnvironmentSettingsUtil.exe Merge /input:C:\Projects\Tools\CommonSettingsFileGeneratorMerge.xml /input:.\EnvironmentSettings\SettingsFileGenerator.xml /output:.\EnvironmentSettings\MergedSettingsFileGenerator.xml' />
</Target>
This will use the EnvironmentSettingsUtil to merge the common settings and application specific settings into a new file, which we earlier told BTDF to use in our deployments.
Now when you use BTDF to deploy your solution locally, or you build an MSI for remote deployment, you will find that the application will have access to all the common settings, as well as its application-specific settings.
The code for the examples can be downloaded here.
At the moment, I'm supporting different development teams implementing a big integration solution using different technologies such as BizTalk Server, Web API, Azure, Java and others, and involving many other actors and countries.
In a situation like that it is quite normal to find many different problems in many different areas like integration, security, messaging, performance, SLAs and more.
One of the most important aspects that I like to consider is productivity: writing code we spend time, and time is money.
Many times a developer needs to find a solution, or needs to decide on a specific pattern to solve a problem, and we need to be ready to provide the most productive solution for it.
For instance, a .NET developer can decide to use a plain function to solve a specific loop instead of using a quicker lambda approach, and a BizTalk developer can decide to use a pure BizTalk approach instead of a quicker and faster approach using code.
To better understand this concept, I want to show you a classic and famous sample from the BizTalk planet.
A usual requirement in any solution is, for example, the possibility to pick up data in a composite pattern and cycle through each instance inside the composite batch; in BizTalk Server this is the classic situation where we are able to understand how BizTalk-oriented the developer is :)
By nature, the BizTalk development approach is closer to a RAD approach than to a pure code approach.
To cycle through a composite message in BizTalk we can use different ways, and looking on the internet we can find many solutions. One of the most used by BizTalk developers is creating an orchestration: create the composite schema, execute the mediation, receive the composite schema in the orchestration, and in the orchestration cycle through the messages using a custom pipeline called from the orchestration.
Quite an expensive approach in terms of productivity and performance.
Another way is to use, for example, a System.Collections.Specialized.ListDictionary in the orchestration.
Create a variable in the orchestration of type System.Collections.Specialized.ListDictionary
Create a static class and a method named, for example, GetAllMessages
Inside your method write the code to pick up your messages from the stream and load them into the ListDictionary
Create an expression shape and execute the method to retrieve the ListDictionary
xdListTransfers = new System.Collections.Specialized.ListDictionary();
Use a variable by ref to manage the result; the variable numberOfMessages
is used to cycle in the orchestration loop.
Because we use a ListDictionary we can easily get our item using a key in the list, as below using the numberOfMessagesDone variable.
xmlDoc = new System.Xml.XmlDocument();
transferMessage = new MYNAMESPACE.Common.Services.TransferMessage();
transferMessage = (MYNAMESPACE.Common.Services.TransferMessage)xdListTransfers[numberOfMessagesDone.ToString()];
numberOfMessagesDone is the increment value in the loop.
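The helper method from the steps above can be sketched in plain C# as follows. This is a minimal, self-contained illustration only: in a real orchestration you would receive the composite message as an XLANGMessage and read it as a stream, which is omitted here, and the class name MessageHelper is my own choice while the variable names mirror the ones used above.

```csharp
using System.Collections.Specialized;
using System.Xml;

public static class MessageHelper
{
    // Splits each child element of a composite message into its own
    // XmlDocument and loads it into a ListDictionary, keyed by its index.
    // numberOfMessages is passed by ref so the orchestration loop knows
    // how many iterations to run.
    public static ListDictionary GetAllMessages(XmlDocument composite, ref int numberOfMessages)
    {
        ListDictionary xdListTransfers = new ListDictionary();
        numberOfMessages = 0;

        foreach (XmlNode node in composite.DocumentElement.ChildNodes)
        {
            if (node.NodeType != XmlNodeType.Element) continue;

            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.LoadXml(node.OuterXml);

            // Key by position, so the loop can fetch each item later
            // with numberOfMessagesDone.ToString()
            xdListTransfers.Add(numberOfMessages.ToString(), xmlDoc);
            numberOfMessages++;
        }
        return xdListTransfers;
    }
}
```

Because the items are keyed by their index as a string, the expression shapes shown above can retrieve each message with `xdListTransfers[numberOfMessagesDone.ToString()]`.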
Using this method, we keep our orchestration very light and it’s very easy and quick to develop.
I have many other samples like that; this is a topic I really care about, because it is able to improve our productivity and performance and, first of all, reduce our costs.
Is the Microsoft integration team “back”? It might be premature to say that Microsoft has finally figured out its app integration story, but the signs are very positive. There’s been a fresh influx of talent like Jon Fancey, Tord Glad Nordahl, and Jim Harrer, some welcome forethought into the overall Microsoft integration story, better community engagement, and a noticeable uptick in the amount of software released by these teams.
One area that's been getting tons of focus is Azure Logic Apps. Logic Apps are a potential successor to classic on-premises application integration tools, but with a cloud-first bent. Users can visually model flows made up of built-in, or custom, activities. The initial integrations supported by Logic Apps were focused on cloud endpoints, but with the recent beta release of the Enterprise Integration Pack, Microsoft is making its move to more traditional use cases. I hadn't messed around with Logic Apps for a few months, and lots of things have changed, so I tested out both the standard and enterprise templates.
One nice thing about things like Logic Apps is that anyone can get started with just a browser. If you're building a standard workflow (read: doesn't require extra services or the "enterprise integration" bits), then you don't have to install a single thing. To start with, I went to the Azure Portal (the new one, not the classic one), and created a new "Logic App."
I was then presented with a choice for how to populate the app itself. There’s the default “blank” template, or, I can start off with a few pre-canned options. Some of these are a bit contrived (“save my tweets to a SharePoint list” makes me sad), but they give you a good idea of what’s possible with the many built-in connectors.
I chose the HTTP Request-Response template since my goal was to build a simple synchronous web service. The portal showed me what this template does, and dropped me into the design canvas with the HTTP Request and HTTP Response activities in place.
I have a birthday coming and am feeling old, so I decided to build a simple service that would tell me if I was old or not. In order to easily use the fields of an inbound JSON message, I had to define a simple JSON schema inside the HTTP Request shape. This schema defines a string for the “name” and an integer for the “age.”
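The schema itself is not reproduced in this copy of the post; a minimal JSON schema matching that description (a string "name" and an integer "age") would look something like this:

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" }
  }
}
```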
Before sending a response, I want to actually do something! So, I added an if-then condition to the canvas. There are other conditionals available, such as for-each and do-until. I put this if-then shape in between the Request and Response elements, and was able to choose the “age” value for my conditional check.
Here, I checked to see if “age” is greater than 40. Notice that I also had access to the “name” field, as well as the whole request body or HTTP headers. Next, I wanted to send a different HTTP response for over-40, and under-40. The brand new “compose” activity is the answer. With this, I could create a new message to send back in the HTTP response.
I simply typed a new JSON message into the Compose activity, using the variable for the “name”, and adding some text to categorize the requestor’s age.
I then did the same thing for the “no” path of the if-then and had a complete flow!
Quick and easy! The topmost HTTP Receive activity has the URL for this particular Logic App, and since I didn’t apply any security policies, it was super simple to invoke. From within my favorite API testing tool, Postman, I submitted a JSON message to the endpoint. Sure enough, I got back a response that corresponded to the provided age.
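As an illustration (the exact payloads are my own wording, not copied from the post), the exchange looked something like this:

```
POST <Logic App request URL>
{ "name": "Richard", "age": 41 }

Response (composed on the over-40 branch of the if-then):
{ "name": "Richard", "message": "You are old" }
```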
Great. But what about doing all the Enterprisey stuff? I built another new Logic App, and this time wanted to send a comma-separated payload to an HTTP endpoint and get back XML. There's a Logic Apps template for that, and when I selected it, I was told I needed an "integration account."
So I got out of Logic Apps, and went off to create an Integration Account in the Portal. Integration Accounts are a preview service from Microsoft. These accounts hold all the integration artifacts used in enterprise integration scenarios: schemas, maps, certificates, partners, and trading agreements.
How do I get these artifacts, you ask? This is where client-side development comes in. I downloaded the Enterprise Integration Tools–which is really just Visual Studio extensions that give you the BizTalk schema editor and mapper–and fired up Visual Studio. This adds an “integration” project type to Visual Studio, and also let me add XML schemas, flat file schemas, and maps to a project.
I then set out to build some enterprise-class schemas defining a “person” (one flat file schema, one XML schema) and a map converting one format to another. I built the flat file schema using a sample comma-separated file and the provided Flat File Wizard. Hello, my old friend.
The map is super simple. It just concatenates the inbound fields into a single outbound field in the XML schema. Note that the destination field has a “max occurs” of “*” to make sure that it adds one “name” element for each set of source elements. And yes, the mapper includes the Functoids for basic calculations, logical conditions, and string manipulation.
The Azure Integration Account doesn’t take in DLLs, so I loaded in the raw XSD and map files. Note that you need to build the project to get the XSLT version of the map. The Azure portal doesn’t take the raw .btm map.
Back in my Logic App, I found the Properties page for the app and made sure to set the “integration account” property so that it saw my schemas and maps.
I then went back and spun up the VETER Logic Apps template. Because there seemed to be a lot of places where things could go wrong, I removed all the other shapes from the design canvas and just started with the flat file decoding. Let’s get that working first! Since I associated my “Integration Account” with this Logic App, it was easy to select my schema from the drop-down list. With that, I tested.
Shoot. The first call failed. Fortunately, Logic Apps comes with a pretty sweet dashboard and tracing interface. I noticed that the flat file decoding failed, and it looked like it got angry with my schema defining a carriage-return-plus-line-feed delimiter for records, when all I sent it was a line feed (via my API testing tool). So, I went back to my schema, changed the record delimiter, updated my schema (and map) in the Integration Account, and tested again.
Success! Notice that it turned my input flat file into an XML representation.
Feeling irrationally confident, I went to the Logic Apps design surface, clicked the “templates” button at the top and re-selected the VETER template to get all the activities back that I needed. However, I forgot that the “mapping” activity requires that I have an Azure Functions container set up. Apparently the maps are executed inside Microsoft’s serverless framework, Azure Functions. Microsoft’s docs are pretty cryptic about what to do here, but if you follow the links in this KB (“create container”, “add function”), you get the default mapper template as an Azure Function.
Ok, now I was set. My final Logic App configuration looked like this.
The app takes in a flat file, validates the flat file using the flat file (really, XML) schema, uses a built-in check to see that it’s a decoded flat file, executes my map within an Azure Function, and finally returns the result back. I then called the Logic App from Postman.
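To make that concrete, a hypothetical input and output for this flow (the real field layout of the "person" schemas isn't shown in this copy of the post) might be:

```
Input (flat file, comma-separated):
Richard,Seroter

Output (XML, after the map concatenates the source fields into one element):
<People>
  <name>Richard Seroter</name>
</People>
```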
BAM! It worked. That’s … awesome. While some of you may have fainted in horror at the idea of using flat files and XML in a shiny new Logic App, this does show that Microsoft is trying to cater to some of the existing constraints of their customers.
Overall, I thought the Logic Apps experience was pretty darn good. The tooling has a few rough edges, but was fairly intuitive. The biggest gap is the documentation and number of public samples, but that’s to be expected with such new technology. I’d definitely recommend giving the Enterprise Integration Pack a try and see what sort of unholy flows you can come up with!