Azure Cost Optimizations: Logic App skipped triggers are billable

Azure cost optimization is a complex topic, and depending on the services you are using, it requires different skills and knowledge. Today, I’m going to talk about Logic App Consumption and skipped triggers.

Of course, when you think about optimizing Logic App Consumption costs, it is very important to understand all the cost calculations and how your flow design affects those costs. Once you understand those topics, you can choose between different ways to optimize the costs. Some of them require redesigning your workflows; others are a matter of monitoring, automation, and configuration.

But first, let’s add some context with a common example that many customers will find familiar:

  • One or more systems send messages to a Service Bus topic or queue;
  • A Logic App polls for messages every X amount of time;
  • It processes them and sends them to other systems.

It is very common to open the configuration of the Service Bus trigger and find the polling interval set to 30 seconds.
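
For reference, this is roughly what such a trigger looks like in the Consumption code view. This is a simplified sketch: the trigger name, connection reference, and queue name (hr-queue) are illustrative, but the recurrence block is the part that drives the 30-second polling:

"triggers": {
    "When_a_message_is_received_in_a_queue_(auto-complete)": {
        "type": "ApiConnection",
        "recurrence": {
            "frequency": "Second",
            "interval": 30
        },
        "inputs": {
            "host": {
                "connection": {
                    "name": "@parameters('$connections')['servicebus']['connectionId']"
                }
            },
            "method": "get",
            "path": "/@{encodeURIComponent(encodeURIComponent('hr-queue'))}/messages/head",
            "queries": {
                "queueType": "Main"
            }
        }
    }
}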

Don’t get me wrong: this configuration is not bad in itself, and in many scenarios it is exactly what we want and need. But in some cases, it may be too aggressive.

The ideal scenario is to analyze the integration run history, speak to all parties involved, and adjust these values. While reviewing one of these cases at one of my clients, I noticed that every day, messages only arrived between 5:00 AM and 5:01 AM. In fact, on many days it took only 9 seconds to process all the messages.

At first, I didn’t give it much importance, but when I checked the polling configuration and the trigger history, I realized that every 30 seconds there was a skipped trigger occurrence.

Why is this important?

As I mentioned in the beginning, it is essential to understand all the cost calculations inside Logic Apps correctly. For example:

  • Per subscription, you will have the first 4,000 actions for free each day!
    • Notice that this is not 4,000 per Logic App. This is the combination of all action executions of all Logic Apps within a subscription.
    • A good benefit of Logic Apps is that action executions are quite cheap. For example, 600,000 action executions per day will cost you an average of $14.90 per month.
  • On the other hand, despite not being too expensive, connector executions are a little more costly, and they have no free executions.

Another thing you need to be aware of is that Azure Logic Apps meters all successful and unsuccessful actions as executions. However, Logic Apps doesn’t meter these actions:

  • Actions that get skipped due to unmet conditions

However, it is important for you to be aware that skipped triggers are billable.

Now, you may be wondering: why should I care? That’s just a few dollars a month!

Let’s do a small exercise:

  • One day has 86,400 seconds, so a single Logic App with the polling interval set to 30 seconds produces a total of 2,880 trigger executions per day.
    • That makes a total of $0.36 per month, or $4.32 per year. That’s nothing, right?
    • Assuming we have 4 environments (Development, Test, QA, and Production), that makes a total of $17.28 per year. It’s still very decent.
    • But it could also be $0.48 per year if properly configured!
  • Now, we know we will have many of these scenarios, so let’s assume we have 10 Logic Apps in this same situation.
    • 10 Logic Apps x 2,880 trigger executions each = 28,800 trigger executions per day, making a total of $3.60 per month, or $43.20 per year.
    • Once again, assuming 4 environments (Development, Test, QA, and Production), that makes a total of $172.80 per year. Well, it is already some money. It is not really expensive, but it is not a few dollars anymore.
    • But what if I tell you that the total cost could be $4.80 per year if properly configured? That’s a $168 saving each year across 10 Logic Apps.
  • Now, if we go to more enterprise scenarios, where we find 100 Logic Apps with this behavior (and trust me, that is a small number), then you will have a different picture:
    • 100 Logic Apps x 2,880 trigger executions each = 288,000 trigger executions per day, making a total of $36.00 per month, or $432.00 per year.
    • If we assume the same number of environments, that makes a total of $1,728.00 per year.
    • That could easily be $48.00 per year if properly configured! That’s a huge saving.

So, just controlling the polling strategy can make a real difference.

How can we improve this and optimize the costs?

Depending on the scenarios and requirements, you may find several approaches:

  • The first and most straightforward approach is to use a less aggressive polling interval, going, for example, from every 30 seconds to every 10 minutes or more.
    • This doesn’t require any redesign of the solution, only a different configuration.
  • The second approach is to replace the Service Bus trigger with a Recurrence trigger on a fixed schedule and explicitly get the messages from the Service Bus (see the first sketch after this list).
    • In this case, the Logic App will run every minute from 5:01 to 5:14 each day and try to pull 25 messages on each run.
  • The third approach is to keep non-production environments disabled and enable them only when needed. This is something that rarely happens in practice; normally, all environments are left running.
  • The fourth approach, and one of the most elegant ones, is to set up a trigger condition. These conditions act as gatekeepers, allowing your Logic App to fire only when specific criteria are met (see the second sketch after this list).
    • On the trigger, click on the ellipsis (…) and then select the Settings option.
    • In the settings panel for the trigger, go down to the Trigger Conditions section and add your condition, for example:
    • @and(greaterOrEquals(formatDateTime(utcNow(), 'HH:mm'), '15:00'), less(formatDateTime(utcNow(), 'HH:mm'), '16:00'))
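
For the second approach, here is a minimal sketch of what that Recurrence trigger could look like in code view, assuming we only want to fire every minute between 5:01 and 5:14 (the schedule values are illustrative and should come from your own run-history analysis):

"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "schedule": {
                "hours": [ 5 ],
                "minutes": [ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14 ]
            }
        }
    }
}

After the trigger fires, the workflow would then call the Service Bus "Get messages from a queue (peek-lock)" action to fetch a batch of messages (for example, 25) and process them.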
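
For the fourth approach, this is roughly how the same condition appears inside the trigger definition in code view once it has been added through the Settings panel; the time window is just the example above:

"conditions": [
    {
        "expression": "@and(greaterOrEquals(formatDateTime(utcNow(), 'HH:mm'), '15:00'), less(formatDateTime(utcNow(), 'HH:mm'), '16:00'))"
    }
]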

How can Serverless360 help you save costs in these scenarios?

Among many other features available in Serverless360 to analyze and optimize the costs of your applications, Serverless360 brings you a feature to configure and automate the uptime of specific resources, like Logic Apps. In this particular configuration, I’m saying that the Logic App will only be enabled daily from 8 PM to 9 PM, a total of 7 hours weekly.

The tool also shows the total estimated savings for this particular configuration: 95.83%! For me, this is absolutely awesome. And I’m not even in charge of controlling costs for my customers; I only help them find ways to optimize their costs and, hopefully, save some money.

I hope you enjoy this cost reduction tip, and stay tuned for more content.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

BizTalk Server to Azure Integration Services: How do we migrate Dynamic Ports?

Welcome again to another BizTalk Server to Azure Integration Services blog post. In my previous blog post, I discussed how to send zipped messages or files. Today, we will discuss a classic requirement in BizTalk Server solutions: How do we migrate Dynamic Ports?

It is not hard to find BizTalk Server solutions or processes that need to send messages to partners and where, based on the type or content of those messages, we need to define the port configuration or communication channel at runtime. Classic examples of those scenarios are:

  • Sending an email notification or sending the message itself through email, where at runtime we specify all the properties of that channel, like To, CC, From, SMTP server, and authentication.
  • Sending a file to a partner FTP server, where at runtime we determine who the partner is and set up all the properties of that channel, like FTP server, authentication, folder, and file name.

In BizTalk Server, we can achieve that with all adapters. Basically, in BizTalk Server, there are two ways to define at runtime what communication channel we will be using:

  • The first option, and the most common option, is to use BizTalk Server Dynamic ports.
  • The second, and less used, is a combination of Message Box direct bound ports and filters on send ports, basically routing.

Implementing the BizTalk Server solutions

Using dynamic ports.

A BizTalk Server dynamic port is a type of port used in Microsoft BizTalk Server to send messages to various destination endpoints without having to pre-configure the specific address details in the port configuration. Unlike static ports, where the address is fixed and known at design time, dynamic ports allow BizTalk to decide at runtime where to send the message based on the message context or other runtime considerations.

Key aspects of BizTalk Server dynamic ports include:

  • Runtime Resolution: The destination address and transport properties of a dynamic port are set at runtime using message context properties. This allows for a high degree of flexibility in message routing.
  • Adaptability: Dynamic ports are particularly useful in scenarios where the destination endpoints may change frequently or when messages need to be routed to multiple endpoints based on the content of the message or business rules.
  • Orchestration Support: Normally, this type of port is used inside BizTalk orchestrations, where the orchestration can set the destination of the message dynamically based on logic implemented within the orchestration.

Dynamic ports are an essential feature for complex integration scenarios where the destination of messages cannot be determined upfront and may vary based on the message itself or the outcome of business processes.

And basically, this is how our BizTalk Server solution looks inside the orchestration:

We will configure the adapter properties in the context of the message. And, of course, those properties will change based on the adapter we use. For example, this is for connecting to a SQL database:

Request2 = Request1;

// Set the WCF-SQL adapter properties in the message context
Request2(WCF.Action) = "TableOp/Insert/dbo/CustomerTable";
Request2(WCF.BindingType) = "sqlBinding";
Request2(WCF.UserName) = "myuser";
Request2(WCF.Password) = "mypass";

// Set the destination address and transport type on the dynamic send port
SendPort(Microsoft.XLANGs.BaseTypes.Address) = "mssql://SQL/INST/DB";
SendPort(Microsoft.XLANGs.BaseTypes.TransportType) = "WCF-Custom";

There are also ways to create a static port with dynamic behavior, but I will leave that for another day.

Using a combination of Message Box direct bound ports and filters.

Now, some adapters can be quite difficult to configure at runtime, like the SQL adapter. So, in order to minimize that development effort and to better maintain the port configuration and security, we can apply the same principles without using dynamic ports.

In this approach, we will replace that Dynamic port by:

  • Several physical send ports, one configured for each partner or system with which we want to exchange messages.
  • On the orchestration, the logical port, instead of using a dynamic port binding, will use Message Box direct binding, which means we will be publishing the messages directly to the BizTalk Server Message Box – this is the database where all the pub-sub magic of BizTalk Server happens.
  • In the orchestration, it will be important to promote some kind of metadata to the context of the message that will allow each partner’s or system’s send port to identify and subscribe to its messages.
  • Finally, on each physical send port, we need to apply a filter to subscribe to those messages.

The real-case scenario where I had to implement this approach was at a client where we had several processes running and, depending on which internal company inside the organization the messages were related to, we had to communicate with a different database, although all those databases had the same structure.

The challenge

The biggest challenge in implementing this type of requirement in Azure Integration Services is that, with the exception of a few connectors like the HTTP connector, connectors don’t allow us to set up the channel configuration dynamically at runtime. Instead, we need to have an API Connection already established. In BizTalk nomenclature, we can only use physical ports without dynamic behavior!

You can say: Hey Sandro, we can dynamically configure the TO, email body, CC, and other properties in the Outlook connector, for example.

Yes, you can. But you cannot dynamically define the FROM! You can use the SQL connector, but you cannot define at runtime which SQL Server you will be using.

So, how can we migrate this to Azure Integration Services?

Building the Logic App Consumption Solution

As I always say, there are many ways to achieve the same goal. Here, I’m going to present one that I think is elegant. But depending on the requirements, and on whether this communication is one-way (doesn’t require a response) or two-way (request-response), you will find different approaches.

If it is a one-way send communication, you will find a solution in one of my previous blog posts: Migrating BizTalk Platform one-way routing solutions. But today, we are going to address a more complex scenario, the one I presented above, in which we need to communicate with different databases that share the same structure. How do we migrate those scenarios if the SQL connector doesn’t allow us to specify the server dynamically?

The solution I will present is inspired by the SQL Server send ports in BizTalk Server, where we specify the SQL Server, the database, authentication, and many other properties of that channel in order to perform that communication, as well as the operations we will be doing on that database.

So, to start this migration, we will create two or more Logic Apps that will act as the BizTalk Server SQL send ports. I have called them:

  • LA-NC-SQL-Connector-POC
  • And LA-SP-SQL-Connector-POC

These Logic Apps will have:

  • A Request – When a HTTP request is received trigger.
  • And, based on the operation defined in the request, they will perform the corresponding SQL operation.

Of course, each Logic App is configured with a different SQL Server database, but they will implement the same business logic.
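
As a rough sketch (the names, request schema, and placeholder action are illustrative, and the real SQL actions and the Response action that returns the result to the caller are omitted), the skeleton of one of these child Logic Apps could look like this in code view:

"definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
        "manual": {
            "type": "Request",
            "kind": "Http",
            "inputs": {
                "schema": {
                    "type": "object",
                    "properties": {
                        "Company": { "type": "string" },
                        "Operation": { "type": "string" },
                        "Data": { "type": "object" }
                    }
                }
            }
        }
    },
    "actions": {
        "Switch_on_Operation": {
            "type": "Switch",
            "runAfter": {},
            "expression": "@triggerBody()?['Operation']",
            "cases": {
                "Case_Insert": {
                    "case": "Insert",
                    "actions": {
                        "Insert_row_placeholder": {
                            "type": "Compose",
                            "inputs": "Placeholder: call the SQL connector action bound to this Logic App's database here"
                        }
                    }
                }
            },
            "default": {
                "actions": {}
            }
        }
    }
}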

Now that we have our “SQL Server custom connector” or “SQL Server ports,” we need to create the main process to use them dynamically based on the context of the message.

I call this process LA-ProcessOrderRequests-POC. Now, knowing that:

  • The only connector that really allows us to configure the destination dynamically is the HTTP connector.
  • This is a request-response communication, so we cannot use Service Bus to route these messages (we could, but it would be complex and expensive).
  • And that our “SQL Server custom connector” or “SQL Server ports” Logic Apps are triggered by HTTP calls.

We need to find a place to store those trigger URLs so that we can fetch them at runtime, based on some metadata in the context or body of the message, and route the requests inside the Logic App. Because those URLs contain secrets (a SAS signature), we decided to use Azure Key Vault to store that information and increase security.

In this sample, we decided that the following payload will be received by the main Logic App:

{
     "Company": "SP",
     "Operation": "Insert",
     "Data": {…} 
}

Of course, the Data property is dynamic.

And our main process will be something like this:

  • Based on the company, it will read a secret from Key Vault that contains the URL of the child Logic App.
    • The secret naming convention combines static and dynamic data and, in our case, will always be “LA-<dynamic part>-URL”, where the dynamic part comes from the company (see the sketch after this list).
    • If we successfully retrieve the URL from Key Vault, we will route the request to the correct database.
    • Otherwise, we will send an error message back to the calling system.
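
A minimal sketch of those two steps in the main Logic App (the exact secret naming convention, the connection name, and the action names are assumptions based on the payload above, and error handling is omitted):

"Get_child_Logic_App_URL": {
    "type": "ApiConnection",
    "runAfter": {},
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['keyvault']['connectionId']"
            }
        },
        "method": "get",
        "path": "/secrets/@{encodeURIComponent(concat('LA-', triggerBody()?['Company'], '-URL'))}/value"
    }
},
"Route_request_to_child_Logic_App": {
    "type": "Http",
    "runAfter": {
        "Get_child_Logic_App_URL": [ "Succeeded" ]
    },
    "inputs": {
        "method": "POST",
        "uri": "@body('Get_child_Logic_App_URL')?['value']",
        "body": "@triggerBody()"
    }
}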

And this is how we bring some BizTalk Server out-of-the-box capabilities to Azure, and also why it is important to have good knowledge of BizTalk Server to perform these migrations.

I hope you find these architecture samples useful, and stay tuned for more BizTalk Server to Azure Integration Services.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 


Azure Cost Optimizations: Push vs Pull may significantly impact the final solution cost

It is really important for companies to have strategies and policies defined before starting to implement integration solutions on Azure or before giving design decision power to external consulting companies, such as:

  • Which services are part of the middleware solution in Azure?
  • What are they for, and when can we use them?
  • How can we handle requirement exceptions?
  • Naming conventions, CI/CD, and tag policies
  • And so on.

I’m about to tell a real story about how I saved more than $20K a year for one of my clients and why the points above are so important. I should have asked for a bonus since they have already saved more than $100K because of me.

One of my clients decided to do a POC on Azure Integration using Logic Apps and Service Bus, and during that period, I was busy working on a critical application for that client. So, they decided not to initially involve me and delegated this POC to another partner. The requirements were quite simple:

  • They wanted an async process to integrate HR information between the two systems.
  • They wanted to use a no-code, low-code approach, using Azure Integration Services as much as possible.

And they did it! Looking only at the technical implementation, they actually built a very good solution. They decided to design a push architecture that allowed almost real-time integration.

When my client contacted me to help them deploy it to production, I only knew that this was a POC and that, if everything went well, it would be used in production as a real solution. So, if they were asking me to help them deploy it to production, that meant that:

  • Everything worked well and as expected.
  • The requirements were accomplished.
  • And they already accepted that solution.

However, I asked if I could do a minor assessment of the solution, mainly to understand how the pieces were built and how we could deploy them to production, and, since I was doing it, to validate whether everything was according to best practices.

And after analyzing the solution, I became curious, so I went to my client and asked the following questions:

  • Is this a mission-critical application?
    • Client: No, this is not a mission-critical application.
  • Do you need almost real-time integration?
    • Client: No. Even if they are not integrated today, they will be resent tomorrow or in the next synchronization.
  • On average, how many messages do you send per day?
    • Client: Typically, close to 100 or fewer.
  • Are they big messages (more than 256 KB)?
    • Client: No, they are small.

Of course, I already expected this type of response and had done my homework, using tools like Serverless360 to analyze the cost of the application and the Azure Pricing Calculator to predict the costs. I had a rough estimate of how much this solution would cost my client monthly and yearly.

I know they are a big company, but I asked the most obvious and logical question in this situation:

  • Do you know how much this solution costs you?
    • Client: Not really, but I know that we have already exceeded the planned budget. Do you know how much?

They were shocked when I told them: ~$677 per month.

But it was worse than that because they have four environments (DEV, TEST, QA, and PROD), so we need to multiply that by 4, meaning ~$2.7K per month, which is close to $32.5K per year, to process 100 messages per day in a non-critical application!

Is there a better approach?

Well, I wouldn’t say that there is a better approach. An approach is good when it fulfills the technical and financial expectations. In many cases, the previous approach will be the better one, especially if you already have Service Bus Premium in your organization and several applications using it.

Now, if you are starting your journey with Azure or want to control the cost, you should opt for a significantly cheaper approach that also fulfills all the technical requirements.

So, how can we reduce the costs of this solution by 99%?

Yes, you are reading that correctly. Actually, it is more than 99%: by the time we finished refactoring, the estimated monthly cost was $2.68 per environment (don’t forget we had four environments), or $128.64 per year in total!

Basically, the only drastic difference in the design of the solution was to transform it into a pull architecture.

  • Because we only needed a queue, we downgraded our Service Bus from Premium to Basic.
  • We removed Event Grid from the equation, since it requires Service Bus Premium.
  • And finally, we changed the trigger of the Logic App to poll messages from the Service Bus queue every X minutes (a minimal sketch of that change follows).
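
To visualize how small that last change is, this is the only part of the trigger definition that needed to move from an aggressive polling interval to a relaxed one (the 5-minute value is illustrative; pick whatever fits your latency requirements):

"recurrence": {
    "frequency": "Minute",
    "interval": 5
}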

All the requirements were met with this new architecture. The costs were surprisingly low, which motivated the client, who was already concerned about Azure costs, to adopt more cloud integration solutions.

The solution is still running in production without any changes or improvements.

I hope you enjoy this cost reduction tip, and stay tuned for more content.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 


BizTalk Server to Azure Integration Services: Validate XML messages before executing the business logic

Welcome again to another BizTalk Server to Azure Integration Services blog post, this time on my personal blog. Today, we will address a common scenario and requirement in enterprise integration solutions that is quite simple to accomplish with BizTalk Server: How do we validate XML messages before executing the business logic or processing them?

For simplicity, let’s assume that BizTalk Server internally processes only XML messages, which is not completely true. That means that when we receive a message through an adapter specified in a specific receive location, the first thing we need to do is normalize the data from various formats to XML. For that, we will need to use a BizTalk Server artifact called a Pipeline.

Pipelines are software components that can process messages, either as the messages are received or just before they are sent out through a send port. A pipeline divides processing into categories of work called processing stages and specifies the sequence in which each stage of work is performed. Each stage of a pipeline contains one or more pipeline components (Microsoft .NET objects or COM objects) that can be configured to work with the specific requirements of the messaging solution or orchestrated business process.

In this particular case, we are interested in using a Receive Pipeline to validate messages against known schema(s). The Receive Pipeline is composed of four stages. Each of the four stages in a receive pipeline performs a specific function and can contain only components specified for use in that stage. Each receive pipeline stage can contain up to 255 components, which will all be executed in order, with the exception of the disassemble stage, in which only one component will execute. The four stages are as follows:

  • Decode: This stage is used for components that decode or decrypt messages. For example, there is a built-in MIME/SMIME decoder pipeline component that can be used to decode MIME-encoded messages. Custom components for this stage could include a component to decode a compressed (zipped) file before further processing.
  • Disassemble: Use this stage if you need to parse or disassemble the inbound message. The components within this stage probe the message to see if the message format is recognized, and then, if the message format is recognized, one of the components disassembles the message. Tasks performed in this stage include conversions of flat-file messages to XML format and splitting of messages. In order for property promotion to occur, an appropriate (flat-file or XML) disassembler must be specified at this stage.
  • Validate: In this stage, messages are validated against a collection of schemas. Pipelines process only messages that conform to the schemas specified in this component, if present. If a message whose schema is not associated with any component in the pipeline is received by the pipeline, the message is not processed. Depending on the adapter, the message is either suspended or an error is issued to the sender. This stage runs once per message created by the Disassemble stage. The built-in validate component can be used in this stage as well as in other stages.
  • Resolve Party: In this stage, the certificate associated with the sender’s security identifier (SID) is mapped to the corresponding configured BizTalk Server party. If the message was digitally signed, the component uses the signature to look up a Microsoft Windows® identity in the BizTalk Server 2010 Configuration database. If the message carries the authenticated SID of a Windows user, this identity is used. If neither mechanism succeeds, the sender is assigned a default anonymous identity. Party resolution is an important feature for managing trading partner relationships. Not all adapters support party resolution.

In simpler terms, BizTalk Pipelines are the helpful assistants that take care of the stuff coming into BizTalk, tidy it up, and hand it over to other parts of the system so everything runs smoothly. They’re like the backstage crew making sure the show (or data flow) goes on without a hitch.

BizTalk Server solution: Create a custom receive pipeline to validate messages against known schema(s)

Of course, as always, there are multiple ways to accomplish this. But one of the simplest and most elegant ways to achieve it is to:

  • Create a custom receive pipeline that contains at least these two out-of-the-box components:
    • The XML Disassembler pipeline component in the Disassemble stage. In this scenario, its primary function will be to promote the content properties from individual document levels to the message context.
    • Then, the XML Validator pipeline component in the Validate stage. This component validates the message against the specified schema or schemas, and if the message does not conform to these schemas, the component raises an error, and Messaging Engine places the message in the suspended queue.

Assuming that we already have created our BizTalk Server project and our schema representing this simple message:


<Person>
   <FirstName>Sandro</FirstName>
   <LastName>Pereira</LastName>
   <DateOfBirth>1978-04-04</DateOfBirth>
   <Country>Portugal</Country>
   <ZipCode>4415</ZipCode>
</Person>

All fields are mandatory, with the exception of ZipCode, which is optional. Also, DateOfBirth is an xs:date; the rest are simple strings.

In order to accomplish our requirement inside BizTalk Server, we just need to create a custom receive pipeline in our solution. To do so, we need to:

  • Right-click on the project name and select the option Add > New Item…
  • On the Add New Item window, on the left tree, select BizTalk Project Items and then select Receive Pipeline. Give it a proper name and click Add.
  • From the Toolbox, drag and drop the XML Disassembler pipeline component in the Disassemble stage and the XML Validator pipeline component in the Validate stage.
  • In this scenario, we will leave the default configuration of the XML Disassembler pipeline component.
  • Now, select the XML Validator pipeline component, and in the Properties window (normally present on the right side of the Visual Studio editor), click on the ellipsis (…) of the Document schemas property.
  • On the Schema Collection Property Editor window, in the Available schemas panel, select your schema, in this case, Person, and then click Add. Click OK.
  • Save your pipeline. Build your project and deploy it. That’s it!

Now, when you deploy the solution to your BizTalk Server environment and configure a physical receive port and location associated with this receive pipeline, what happens is that:

  • If the port receives a Person XML message, it will deeply validate the instance of that message against the schema, and if the message does not conform to the schema, the component will raise an error.
  • If the port receives any other type of XML message, it will consume it without deeply validating the instance against the schema, since it is not a Person message.

The challenge

This is a common request that we will find in many solutions, and it is fully supported inside Logic Apps. It doesn’t work the same way as in BizTalk Server, since we don’t have the concept of receive ports and receive locations in Logic Apps or any other Azure service, but it is still fairly easy to accomplish.

The main challenge here is that, depending on which type of Logic App you use, we will have different approaches:

  • If we use Consumption, to have XML “premium” capabilities we need an Integration Account to store the schemas and perform validation (and many other features). However, this will have an additional cost.
  • If we use Standard, we already have built-in XML capabilities, and for these requirements, we will not need an Integration Account.

The good news is that BizTalk Server Schemas are fully supported in Logic Apps, both Consumption and Standard, so we don’t need to worry about generating or recreating them. We can just copy them from our BizTalk Server solution and use them in our Azure Integration Service (AIS) solution.

Building the Logic App Consumption Solution

In this Proof-of-Concept (POC), first, we need to copy our schema to our Integration Account. If you don’t have one, you can create it by:

  • In the Azure portal search box, enter Integration accounts, and select Integration accounts.
  • Under Integration accounts, select Create.
  • On the Create an integration account pane, provide the following information about your integration account: Subscription, Resource Group, Integration Account Name, Pricing Tier, Storage account, and so on.
  • When you’re done, select Review + create.

To import the schema into our Integration Account, we need to:

  • Access the integration account, and under the Settings section, select Schemas and then click + Add.
  • On the Add Schema panel, browse for our XML Schema, leave the default settings, and click OK.

Now that we have our schema, the next thing we need to do is create our Logic App that will act as our BizTalk Server Receive Pipeline. To accomplish that, you need to:

  • On the Azure Portal, create a new Logic App Consumption and give it a proper name, in our case: LA-SA-XMLValidation-POC.
  • Before we start to create our logic, we first need to associate the Integration account with this Logic App by selecting Workflow settings under the Settings section. In the Integration account property, select your Integration account.
  • Click Save.
  • Now, click on the Logic App designer under the Development Tools section. From the Templates, select Blank Logic App.

Note: We could use any connector as input, like File or OneDrive, but for simplicity of this POC, we will use the Request.

  • For the trigger, select Request > When a HTTP request is received trigger and leave it as default.
  • Next, click on + New step, select the XML connector, and then the XML Validation action.
  • On the XML Validation action, perform the following configurations (see the code-view sketch after this list):
    • On the Content property, set it to be the body of the When a HTTP request is received trigger.
    • On the Schema name property, select the correct schema from your integration account. In our case, Person.
  • And, of course, now you need to implement your business logic or call a child Logic App.
  • I ended up creating a try-catch pattern just to prove this functionality.
  • Now, you just need to test your solution.
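
In code view, the resulting XML Validation action should look roughly like this (a sketch; the action name is whatever you gave it in the designer, and the schema name follows the steps above):

"XML_Validation": {
    "type": "XmlValidation",
    "runAfter": {},
    "inputs": {
        "content": "@triggerBody()",
        "integrationAccount": {
            "schema": {
                "name": "Person"
            }
        }
    }
}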

Building the Logic App Standard Solution

Making the same solution in Standard will be easier since Logic App Standard already has out-of-the-box support for XML Schemas and maps. We just need to add them to our logic app resource. Once again, the BizTalk Server XML Schemas will be fully supported, so we just need to copy them from our BizTalk Server solution.

Let’s create our Logic App Standard Solution. To do that, you need the following:

  • In Visual Studio Code, close all open folders.
  • In the Azure window, on the Workspace section toolbar, from the Azure Logic Apps menu, select Create New Project.
  • Define the folder of the project or browse to the location where you created your project folder, select that folder, and continue.
  • From the templates list that appears, select either Stateful Workflow or Stateless Workflow. This example selects Stateful Workflow.
  • Provide a name for your workflow and press Enter. This example uses LA-XMLValidation-POC as the name.
  • From the Visual Studio Code Activity Bar, open the Explorer pane if it is not already open.

Now, the first thing we will do is add our Schema to our project. To do that, you need to:

  • Access the project folder using File Explorer, enter the Artifacts > Schemas folder, and copy the schema into this folder. And you will automatically see it on the Visual Studio Code Explorer.

Now, we need to create the business logic of our Logic App.

  • Right-click on our LA-XMLValidation-POC workflow.json file and select Open Designer.
  • And we are going to implement the same business logic we did with Consumption.
  • Start adding a Request > When a HTTP request is received trigger and leave it as default.
  • Next, click on + Add an action, select the XML connector, and then the XML Validation action.
  • On the XML validation action, perform the following configurations:
    • On the Content property, set it to be the body of the When a HTTP request is received trigger.
    • On the Source property, leave LogicApp.
    • On the Schema name property, select the correct schema from your project. In our case, Person.
  • And, of course, now you need to implement your business logic or call a child Logic App.
  • I ended up creating a try-catch pattern just to prove this functionality.
  • Now, you just need to test your solution.

I hope you find these architecture samples useful, and stay tuned for more BizTalk Server to Azure Integration Services.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 
