BizTalk Server Visual Studio project: attempted re-targeting of the project has been canceled.

Yesterday, while I was configuring a new BizTalk Server RosettaNet project at a client, I ran into a curious new error/issue while copying the BizTalk Server solution into the client's development environment. After I copied the project to the development environment, I tried to open the BizTalk Server Visual Studio solution and got the following warning:

The C# project “<project name>” is targeting “.NETFramework,Version=v4.7.2”, which is not installed on this machine. To proceed, select an option below.

Knowing that we didn’t have .NET Framework 4.7.2, I chose the first option, but we ended up getting the following error:

Attempted re-targeting of the project has been canceled. BizTalk Server 2016 Developer Tools only supports targeting ‘.Net.Framework 4.6.x’ and above.

Cause

This error/issue happened because I thought the client environment was already using BizTalk Server 2020, so I developed the project locally with that version. BizTalk Server 2020 targets .NET Framework 4.7.2 or above by default. That was not the case: the environment was still running BizTalk Server 2016, which is why, when I tried to open the BizTalk Server Visual Studio solution, it asked me to re-target the project.

Normally, this is not a problem when we move from a previous version to a higher one, for example, from BizTalk Server 2016 to BizTalk Server 2020. In this case, however, it was a downgrade, and I don’t really know why Visual Studio presented me with 4.5.2 instead of 4.6, which is the “official” version for BizTalk Server 2016.

Nevertheless, I was not able to re-target or work around this directly in Visual Studio.

Solution

Solving this issue is quite simple:

  • Go to the project folder and open the *.btproj file with Notepad or Notepad++
  • And change the TargetFrameworkVersion from v4.7.2 to v4.6 (see the snippet below).
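As a reference, in a typical .btproj file this is a regular MSBuild property inside a PropertyGroup, so the edited element should look something like this minimal sketch (surrounding elements omitted):

<PropertyGroup>
  ...
  <TargetFrameworkVersion>v4.6</TargetFrameworkVersion>
  ...
</PropertyGroup>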

After these steps, I was able to open the BizTalk Server Visual Studio solution in our dev environment.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

BizTalk Oracle Adapter error: The assembly required for type mapping not found.

Today, while I was developing a new BizTalk Server solution that communicates with an Oracle database, I encountered a familiar issue that I had forgotten to document in the past:

Microsoft.ServiceModel.Channels.Common.MetadataException: The assembly required for type mapping not found.

That forced me, once again, to lose time not only remembering the cause but also finding out how to solve it.

Cause

This error happens when you try to call an Oracle Procedure, function, or package that contains User-Defined Types or UDTs. The UDTs can be present in the following artifacts:

  • Interface tables and interface views containing UDT columns.
  • Database tables and views containing UDT columns.
  • Packages, stored procedures, and functions containing UDT parameters.

Oracle UDTs help represent complex entities as a “single” object that can be shared among the applications.

BizTalk Server supports Oracle UDTs, but unlike SQL Server, whose types are handled natively, with Oracle we need to configure a few extra properties and generate a UDT DLL.

When this error occurs, one of two things is usually the reason:

  • You forgot to configure, in the Schema generation, the following properties:
    • GeneratedUserTypesAssemblyFilePath
    • GeneratedUserTypesAssemblyKeyFilePath
  • Or you forgot to configure, at runtime (that is, on the receive location or send port), the following property:
    • userAssembliesLoadPath

Solution

To solve this issue, we need to make sure we perform the following steps:

  • It is necessary to create a signed assembly (DLL) for the User-Defined Types (UDTs) created in Oracle, corresponding to the types interpreted by the WCF-based Oracle adapter. To do this, when generating the schemas from the Consume Adapter Service option, this assembly must be created by specifying:
    • On the GeneratedUserTypesAssemblyFilePath property, the full path and name of the DLL that the wizard will create for us.
    • And on the GeneratedUserTypesAssemblyKeyFilePath property, the path of the strong name key (.snk) file that the wizard will use to sign the DLL.

Both these properties are present in the UDT .NET Type Generation – Design Time section of the Binding Properties.
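For illustration only, the design-time values could look like this (the folder and file names here are just an example, not required values):

  GeneratedUserTypesAssemblyFilePath    : C:\OracleUDTs\OracleUDTs.dll
  GeneratedUserTypesAssemblyKeyFilePath : C:\OracleUDTs\OracleUDTs.snk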

Once again, this will create a UDT DLL on the path we defined, which we then need to use at runtime.

Once we deploy our schemas and create the receive location or send port, we then need to make sure that we configure the following property with the path to the UDT DLL:

  • userAssembliesLoadPath 

Notice: even if you deploy the UDT DLL to the GAC (which is advisable), you still need to configure the path to the UDT DLL in this property.

After these steps, you can successfully communicate with Oracle using an Oracle Procedure, function, or package that contains User-Defined Types.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

BizTalk WCF-SAP Adapter error: Loading property information list by namespace failed or property not found in the list. Verify that the schema is deployed properly.

A client reported to me an error that we were getting in our BizTalk Server production environment this week. We had deployed a new orchestration and a new SAP receive port to handle a new partner, something we had done several times with success, but this time, we were getting the following warning each time we sent a message from SAP to BizTalk Server:

The adapter “WCF-Custom” raised an error message. Details “System.Exception: Loading property information list by namespace failed or property not found in the list. Verify that the schema is deployed properly.

   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkAsyncResult.End()
   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkServiceInstance.EndOperation(IAsyncResult result)
   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkServiceInstance.Microsoft.BizTalk.Adapter.Wcf.Runtime.ITwoWayAsyncVoid.EndTwoWayMethod(IAsyncResult result)
   at AsyncInvokeEndEndTwoWayMethod(Object , Object[] , IAsyncResult )
   at System.ServiceModel.Dispatcher.AsyncMethodInvoker.InvokeEnd(Object instance, Object[]& outputs, IAsyncResult result)
   at System.ServiceModel.Dispatcher.DispatchOperationRuntime.InvokeEnd(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage7(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.MessageRpc.Process(Boolean isOperationContextSet)”.

First, it was strange that this was logged as a warning, since BizTalk Server did not receive or process the message.

Also, the message can be misleading because the “error” message says Verify that the schema is deployed properly. These types of errors typically indicate that the schema or specific schema versions are not deployed in the environment, although when that happens, the error message clearly specifies the schema name and version, which is not the case in this error message.

Cause

The official documentation states that this exception is encountered while receiving an IDOC with the EnableBizTalkCompatibilityMode binding property set to true. In that case, you must add the BizTalk property schema DLL for the SAP adapter as a resource in your BizTalk application, that is, the application in which your project is deployed.

However, our receive location didn’t have the EnableBizTalkCompatibilityMode set to true. Instead, it was set to false.

For lack of better ideas, we decided to apply the same solution and add the BizTalk property schema DLL for the SAP adapter as a resource in our BizTalk application, and it solved all of our problems.

Solution

So, to solve this issue, we need to add the BizTalk property schema DLL for the SAP adapter, called Microsoft.Adapters.SAP.BiztalkPropertySchema.dll, as a resource in our BizTalk application. This DLL can be found in the Microsoft BizTalk Adapter Pack folder, which is normally under:

  • <drive letter>:\Program Files\Microsoft BizTalk Adapter Pack\bin

or in BizTalk Server 2020 under:

  • <drive letter>:\Program Files (x86)\Microsoft BizTalk Server

You must perform the following tasks to add this assembly as a resource in your BizTalk application (an equivalent command-line sketch follows these steps):

  • Start the BizTalk Server Administration console.
  • In the console tree, expand BizTalk Group, expand Applications, and then expand the application to which you want to add a BizTalk assembly.
  • Right-click Resources, point to Add, and then click BizTalk Assemblies.
  • Click Add, navigate to the folder containing the BizTalk assembly file, select the BizTalk assembly file, and then click Open.
  • In Options, specify the options for installing the BizTalk assembly to the GAC, and then click OK.
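Alternatively, as a minimal sketch, the same resource could be added from the command line with BTSTask (the application name and source path below are assumptions for this example):

BTSTask AddResource /ApplicationName:"MyApplication" /Type:System.BizTalk:BizTalkAssembly /Overwrite /Source:"C:\Program Files\Microsoft BizTalk Adapter Pack\bin\Microsoft.Adapters.SAP.BiztalkPropertySchema.dll" /Options:GacOnAdd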

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Remove XML Empty Nodes Pipeline Component

Another day, another BizTalk Server Pipeline Component! Today, I decided to release a brand new component called the Remove XML Empty Nodes Pipeline Component.

For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.

Remove XML Empty Nodes Pipeline Component

As the name mentions, the Remove XML Empty Nodes Pipeline Component is a pipeline component that can be used to remove empty nodes present in an XML message. You can use this component in any stage of a receive or send pipeline.

This component has a single property that you need to set up:

  • DisableRemoveBOM (boolean): This allows you to enable or disable the process of removing empty nodes from an XML message.

How to install it

As always, you just need to add the DLL to the Pipeline Components folder, which in BizTalk Server 2020 is by default:

  • C:\Program Files (x86)\Microsoft BizTalk Server\Pipeline Components

For this particular component, we need this DLL:

  • BizTalk.PipelineComponents.RemoveXmlEmptyNodes.dll

How to use it

Like all the previous components, to use this pipeline component, I recommend that you create one or several generic pipelines that can be reused by all your applications and add this pipeline component to any required stage of a send or receive pipeline.

Download

THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download the Remove XML Empty Nodes Pipeline Component from GitHub here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 

Send File To a Date-Based Structure Encoder Pipeline Component

Time to get back to BizTalk Server and publish new resources on this amazing product and also return to one of my old pet projects: the BizTalk Pipeline Components Extensions Utility Pack.

Today, I decided to create a brand new component called the Send File To a Date-Based Structure Encoder Pipeline Component.

For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.

Send File To a Date-Based Structure Encoder Pipeline Component

As the name suggests, the Send File To a Date-Based Structure Encoder Pipeline Component is a pipeline component that can be used in a send pipeline, inside the Encode stage, and it allows you to send an outbound file to a dynamic folder path organized as a date tree:

  • yyyy\MM\dd

In other words, you will define the base path on the adapter URI, and then this component will use that base path to add a dynamic structure inside that path based on the date.

This component doesn’t require any property configuration.

How to install it

As always, you just need to add the DLL to the Pipeline Components folder, which in BizTalk Server 2020 is by default:

  • C:\Program Files (x86)\Microsoft BizTalk Server\Pipeline Components

For this particular component, we need this DLL:

  • BizTalk.PipelineComponents.SendFileToDateBasedStructure.dll

How to use it

Like all the previous components, to use this pipeline component, I recommend that you create one or several generic pipelines that can be reused by all your applications and add the pipeline component to the Encode stage. This component can be used only in send pipelines.

Download

THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download Send File To a Date-Based Structure Encoder Pipeline Component from GitHub here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 

BizTalk Server to Azure Integration Services: How do we migrate Dynamic Ports?

Welcome again to another BizTalk Server to Azure Integration Services blog post. In my previous blog post, I discussed how to send zipped messages or files. Today, we will discuss a classic requirement in BizTalk Server solutions: How do we migrate Dynamic Ports?

It is not hard to find BizTalk Server solutions/processes that need to send messages to partners, but in some cases, based on the type or content of those messages, we need to define the port configuration or communication channel at runtime. Classic examples of those scenarios are:

  • Sending an email notification or the message itself through email, where at runtime we specify all the properties of that channel, like To, CC, From, SMTP server, and authentication.
  • Sending a file to a partner FTP server, where at runtime we determine who the partner is and set up all the properties of that channel, like FTP server, authentication, folder, and file name.

In BizTalk Server, we can achieve that with all adapters. Basically, in BizTalk Server, there are two ways to define, at runtime, which communication channel we will be using:

  • The first option, and the most common option, is to use BizTalk Server Dynamic ports.
  • The second, and less used, is a combination of Message Box direct bound ports and filters on send ports, basically routing.

Implementing the BizTalk Server solutions

Using dynamic ports.

A BizTalk Server dynamic port is a type of port used in Microsoft BizTalk Server to send messages to various destination endpoints without having to pre-configure the specific address details in the port configuration. Unlike static ports, where the address is fixed and known at design time, dynamic ports allow BizTalk to decide at runtime where to send the message based on the message context or other runtime considerations.

Key aspects of BizTalk Server dynamic ports include:

  • Runtime Resolution: The destination address and transport properties of a dynamic port are set at runtime using message context properties. This allows for a high degree of flexibility in message routing.
  • Adaptability: Dynamic ports are particularly useful in scenarios where the destination endpoints may change frequently or when messages need to be routed to multiple endpoints based on the content of the message or business rules.
  • Orchestration Support: Normally, this type of port is used inside BizTalk orchestrations, where the orchestration can set the destination of the message dynamically based on logic implemented within the orchestration.

Dynamic ports are an essential feature for complex integration scenarios where the destination of messages cannot be determined upfront and may vary based on the message itself or the outcome of business processes.

And basically, this is how our BizTalk Server solution looks inside the orchestration:

We will configure the adapter properties in the context of the message. And, of course, those properties will change based on the adapter we use. For example, this is for connecting to a SQL database:

Request2=Request1;  
Request2(WCF.Action)="TableOp/Insert/dbo/CustomerTable";  
Request2(WCF.BindingType)="sqlBinding";  
Request2(WCF.UserName)="myuser";  
Request2(WCF.Password)="mypass";  

SendPort(Microsoft.XLANGs.BaseTypes.Address)="mssql://SQL/INST/DB";  
SendPort(Microsoft.XLANGs.BaseTypes.TransportType)="WCF-Custom";
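Only the context properties and the address change from adapter to adapter. For example, a dynamic FILE send port could be configured with something like this (the output path is illustrative):

SendPort(Microsoft.XLANGs.BaseTypes.Address)="FILE://C:\\Out\\%MessageID%.xml";  
SendPort(Microsoft.XLANGs.BaseTypes.TransportType)="FILE";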

There are also ways to create a static port with dynamic behavior, but I will leave that for another day.

Using a combination of Message Box direct bound ports and filters.

Now, some adapters can be quite difficult to configure at runtime, like the SQL adapter. So, in order to minimize that developer effort and to better maintain the port configuration and security, we can apply the same principles without using dynamic ports.

In this approach, we will replace the dynamic port with:

  • Several physical send ports, one configured for each partner or system with which we want to exchange messages.
  • In the orchestration, the logical port, instead of using dynamic port binding, will use MessageBox direct binding, which means we will be publishing the messages directly to the BizTalk Server MessageBox – the database where all the pub-sub magic of BizTalk Server happens.
  • In the orchestration, it will be important to promote some kind of metadata to the context of the message that will allow us to identify the partner or system each message belongs to.
  • Finally, on each physical send port, we need to apply a filter to subscribe to those messages (see the example below).
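As an illustration, a send port filter on a hypothetical promoted property could look like this (the property namespace and value are assumptions for this example):

  MyCompany.PropertySchemas.CompanyCode == SP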

The real-case scenario where I had to implement this approach was at a client where we had several processes running, and depending on which internal company inside the organization the messages were related to, we had to communicate with a different database, although all those databases had the same structure.

The challenge

The biggest challenge in implementing this type of requirement in Azure Integration Services is that, with the exception of a few connectors like the HTTP connector, connectors don’t allow us to set up the channel configuration dynamically at runtime. Instead, we need to have an API Connection already established. In BizTalk nomenclature, we can only use physical ports without dynamic behavior!

You can say: Hey Sandro, we can dynamically configure the TO, email body, CC, and other properties in the Outlook connector, for example.

Yes, you can. But you cannot dynamically define the FROM! You can use the SQL connector, but you cannot, at runtime, define which SQL Server you will be using.

So, how can we migrate this to Azure Integration Services?

Building the Logic App Consumption Solution

As I always say, there are many ways to achieve the same goal. Here, I’m going to present one that I think is elegant. But depending on the requirements, and on whether the communication is one-way (doesn’t require a response) or two-way (request-response), you will find different approaches.

If it is a one-way send communication, you will find a solution in one of my previous blog posts: Migrating BizTalk Platform one-way routing solutions. But today, we are going to address a more complex scenario, the one I presented above, in which we need to communicate with different databases that share the same structure. How do we migrate those scenarios if the SQL connector doesn’t allow us to specify the server dynamically?

The solution I will present is inspired by the SQL Server send ports in BizTalk Server, where we specify the SQL Server, the database, authentication, and many other properties of that channel in order to perform the communication, but also the operations we will be doing on that database.

So, to start this solution migration, we will create two or more Logic Apps that will act as the BizTalk Server SQL send ports. I have called them:

  • LA-NC-SQL-Connector-POC
  • And LA-SP-SQL-Connector-POC

These Logic Apps will have:

  • A Request – When a HTTP request is received trigger.
  • And, based on the operation defined in the request, they will perform the corresponding SQL operation.

Of course, each Logic App is configured with a different SQL Server database, but they will implement the same business logic.

Now that we have our “SQL Server custom connector” or “SQL Server ports,” we need to create the main process to use them dynamically based on the context of the message.

I call this process LA-ProcessOrderRequests-POC. Now, knowing that:

  • The only connector that really allows us to configure dynamically is the HTTP connector.
  • This is a request-response communication, so we cannot use the Service Bus to route these messages (we could, but it would be complex and expensive).
  • And that our “SQL Server custom connector” or “SQL Server ports” Logic Apps are triggered by HTTP calls.

We need to find a place to store those trigger URLs so that we can fetch them at runtime, based on some metadata of the message (in its context or in its body), and route the requests inside the Logic App. Because those URLs contain some secrets, to increase security, we decided to use Azure Key Vault to store that information.

In this sample, we decided that the following payload will be received by the main Logic App:

{
     "Company": "SP",
     "Operation": "Insert",
     "Data": {…} 
}

Of course, the data property is dynamic.

And our main process will be something like this:

  • Based on the company, it will read a secret from Key Vault that contains the URL of the child Logic App.
    • The secret name convention is composed of static and dynamic data and, in our case, will always be “LA-<Company>-URL” (see the expression sketch after this list).
  • If we successfully retrieve the URL from Key Vault, we will route the request to the correct database.
    • Otherwise, we will send an error message back to the system.
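As a minimal sketch, assuming the payload shown earlier, the secret name could be composed inside the Logic App with an expression like this (the naming convention is just the one used in this example):

  concat('LA-', triggerBody()?['Company'], '-URL')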

And this is how we bring some BizTalk Server out-of-the-box capabilities to Azure, and also why it is important to have good knowledge of BizTalk Server to perform these migrations.

I hope you find these architecture samples useful, and stay tuned for more BizTalk Server to Azure Integration Services.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

BizTalk Server to Azure Integration Services: Validate XML messages before executing the business logic

Welcome again to another BizTalk Server to Azure Integration Services blog post, this time on my personal blog. Today, we will address a common scenario and requirement in Enterprise Integration solutions that is quite simple to accomplish with BizTalk Server: How do we validate XML messages before executing the business logic or processing them?

For simplicity, let’s assume that BizTalk Server internally processes only XML messages, which is not completely true. That means that when we receive a message through an adapter specified in a specific receive location, the first thing we need to do is normalize data from various formats to XML. For that, we will need to use a BizTalk Server artifact called a Pipeline.

Pipelines are software components that can process messages, either as the messages are received or just before they are sent out through a send port. A pipeline divides processing into categories of work called processing stages and specifies the sequence in which each stage of work is performed. Each stage of a pipeline contains one or more pipeline components (Microsoft .NET objects or COM objects) that can be configured to work with the specific requirements of the messaging solution or orchestrated business process.

In this particular case, we are interested in using a Receive Pipeline to validate messages against known schema(s). The Receive Pipeline is composed of four stages. Each of the four stages in a receive pipeline performs a specific function and can contain only components specified for use in that stage. Each receive pipeline stage can contain up to 255 components, which will all be executed in order, with the exception of the disassemble stage, in which only one component will execute. The four stages are as follows:

  • Decode: This stage is used for components that decode or decrypt messages. For example, there is a built-in MIME/SMIME decoder pipeline component that can be used to decode MIME-encoded messages. Custom components for this stage could include a component to decode a compressed (zipped) file before further processing.
  • Disassemble: Use this stage if you need to parse or disassemble the inbound message. The components within this stage probe the message to see if the message format is recognized, and then, if the message format is recognized, one of the components disassembles the message. Tasks performed in this stage include conversions of flat-file messages to XML format and splitting of messages. In order for property promotion to occur, an appropriate (flat-file or XML) disassembler must be specified at this stage.
  • Validate: In this stage, messages are validated against a collection of schemas. Pipelines process only messages that conform to the schemas specified in this component, if present. If a message whose schema is not associated with any component in the pipeline is received by the pipeline, the message is not processed. Depending on the adapter, the message is either suspended or an error is issued to the sender. This stage runs once per message created by the Disassemble stage. The built-in validate component can be used in this stage as well as in other stages.
  • Resolve Party: In this stage, the certificate associated with the sender’s security identifier (SID) is mapped to the corresponding configured BizTalk Server party. If the message was digitally signed, the component uses the signature to look up a Microsoft Windows® identity in the BizTalk Server 2010 Configuration database. If the message carries the authenticated SID of a Windows user, this identity is used. If neither mechanism succeeds, the sender is assigned a default anonymous identity. Party resolution is an important feature for managing trading partner relationships. Not all adapters support party resolution.

In simpler terms, BizTalk Pipelines are the helpful assistants that take care of the stuff coming into BizTalk, tidy it up, and hand it over to other parts of the system so everything runs smoothly. They’re like the backstage crew making sure the show (or data flow) goes on without a hitch.

BizTalk Server solution: Create a custom receive pipeline to validate messages against known schema(s)

Of course, as always, there are multiple ways to accomplish this. But one of the simplest and most elegant ways to achieve it is to:

  • Create a custom receive pipeline that contains at least these two out-of-the-box components:
    • The XML Disassembler pipeline component in the Disassemble stage. In this scenario, its primary function will be to promote the content properties from individual document levels to the message context.
    • Then, the XML Validator pipeline component in the Validate stage. This component validates the message against the specified schema or schemas, and if the message does not conform to these schemas, the component raises an error, and Messaging Engine places the message in the suspended queue.

Assuming that we have already created our BizTalk Server project and our schema representing this simple Person message (the element names below are illustrative):

<Person>
  <FirstName>Sandro</FirstName>
  <LastName>Pereira</LastName>
  <DateOfBirth>1978-04-04</DateOfBirth>
  <Address>
    <Country>Portugal</Country>
    <ZipCode>4415</ZipCode>
  </Address>
</Person>

All fields are mandatory, with the exception of ZipCode, which is optional. Also, DateOfBirth is an xs:date; the rest are simple strings. A sketch of what such a schema could look like follows below.
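As a minimal sketch only (the target namespace and element names are assumptions, not the ones from the real project), the XSD could be something like this:

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://POC.Person" xmlns="http://POC.Person"
           elementFormDefault="qualified">
  <xs:element name="Person">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FirstName" type="xs:string" />
        <xs:element name="LastName" type="xs:string" />
        <xs:element name="DateOfBirth" type="xs:date" />
        <xs:element name="Address">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Country" type="xs:string" />
              <!-- ZipCode is the only optional field -->
              <xs:element name="ZipCode" type="xs:string" minOccurs="0" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>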

To accomplish our requirements inside BizTalk Server, we just need to create a custom receive pipeline in our solution. To do so, we need to:

  • Right-click on the project name and select the option Add > New Item…
  • On the Add New Item window, on the left tree, select BizTalk Project Items and then select Receive Pipeline. Give it a proper name and click Add.
  • From the Toolbox, drag and drop the XML Disassembler pipeline component in the Disassemble stage and the XML Validator pipeline component in the Validate stage.
  • In this scenario, we will leave the default configuration of the XML Disassembler pipeline component.
  • Now, select the XML Validator pipeline component, and in the Properties window (normally present on the right side of the Visual Studio editor), click on the three dots (…) of the Document schemas property.
  • On the Schema Collection Property Editor window, in the Available schemas panel, select your schema, in this case, Person, and then click Add. Click OK.
  • Save your pipeline. Build your project and deploy it. That’s it!

Now, when you deploy the solution into your BizTalk Server environment and configure a physical receive port and receive location associated with this receive pipeline, what happens is that:

  • If the port receives a Person XML, it will deeply validate the instance of that message against the schema, and if the message does not conform to the schema, the component will raise an error.
  • If we receive any other type of XML message, the port will consume it without deeply validating the instance against the schema, since it is not a Person.

The challenge

This is a common request that we will find in many solutions, and it is fully supported in Logic Apps. It does not work in exactly the same way as in BizTalk Server, since we do not have the concept of receive ports and receive locations in Logic Apps or any other Azure service, but it is still fairly easy to accomplish.

The main challenge here is that depending on which type of Logic App you will use, we will have different approaches:

  • If we use Consumption, to have XML “premium” capabilities we need an Integration Account to store the schemas and perform validation (and many other features). However, this has an additional cost.
  • If we use Standard, we already have built-in XML capabilities, and for these requirements, we will not need an Integration Account.

The good news is that BizTalk Server Schemas are fully supported in Logic Apps, both Consumption and Standard, so we don’t need to worry about generating or recreating them. We can just copy them from our BizTalk Server solution and use them in our Azure Integration Service (AIS) solution.

Building the Logic App Consumption Solution

In this Proof-of-Concept (POC), first, we need to copy our schema to our Integration Account. If you don’t have one, you can create it by:

  • In the Azure portal search box, enter Integration accounts, and select Integration accounts.
  • Under Integration accounts, select Create.
  • On the Create an integration account pane, provide the following information about your integration account like, Subscription, Resource Group, Integration Account Name, Pricing Tier, Storage account, and so on.
  • When you’re done, select Review + create.

To import the schema into our Integration Account, we need to:

  • Access the integration account, and under the Settings section, select Schemas and then click + Add.
  • On the Add Schema panel, browse for our XML Schema, leave the default settings, and click OK.

Now that we have our schema, the next thing we need to do is create our Logic App that will act as our BizTalk Server Receive Pipeline. To accomplish that, you need to:

  • On the Azure Portal, create a new Logic App Consumption and give it a proper name, in our case: LA-SA-XMLValidation-POC.
  • Before we start to create our logic, we first need to associate the Integration account with this Logic App by selecting Workflow settings under the Settings section. In the Integration account property, select your Integration account.
  • Click Save.
  • Now, click on the Logic App designer under the Development Tools section. From the Templates, select Blank Logic App.

Note: We could use any connector as input, like File or OneDrive, but for simplicity of this POC, we will use the Request.

  • For the trigger, select Request > When a HTTP request is received trigger and leave it as default.
  • Next, click on + New step, select the XML connector, and then the XML Validation action.
  • On the XML validation action, perform the following configurations:
    • On the Content property, set it to be the body of the When a HTTP request is received trigger.
    • On the Schema name property, select the correct schema from your integration account. In our case, Person.
  • And, of course, now you need to implement your business logic or call a child Logic App.
  • I ended up creating a try-catch pattern just to prove this functionality.
  • Now, you just need to test your solution.

Building the Logic App Standard Solution

Making the same solution in Standard will be easier since Logic App Standard already has out-of-the-box support for XML Schemas and maps. We just need to add them to our logic app resource. Once again, the BizTalk Server XML Schemas will be fully supported, so we just need to copy them from our BizTalk Server solution.

Let’s create our Logic App Standard Solution. To do that, you need the following:

  • In Visual Studio Code, close all open folders.
  • In the Azure window, on the Workspace section toolbar, from the Azure Logic Apps menu, select Create New Project.
  • Define the folder of the project or browse to the location where you created your project folder, select that folder, and continue.
  • From the templates list that appears, select either Stateful Workflow or Stateless Workflow. This example selects Stateful Workflow.
  • Provide a name for your workflow and press Enter. This example uses LA-XMLValidation-POC as the name.
  • From the Visual Studio Code Activity Bar, open the Explorer pane if it is not already open.

Now, the first thing we will do is add our Schema to our project. To do that, you need to:

  • Access the project folder using File Explorer, enter the Artifacts > Schemas folder, and copy the schema into this folder. And you will automatically see it on the Visual Studio Code Explorer.

Now, we need to create the business logic of our Logic App.

  • Right-click on our LA-XMLValidation-POC workflow.json file and select Open Designer.
  • And we are going to implement the same business logic we did with Consumption.
  • Start adding a Request > When a HTTP request is received trigger and leave it as default.
  • Next, click on + Add an action, select the XML connector, and then the XML Validation action.
  • On the XML validation action, perform the following configurations:
    • On the Content property, set it to be the body of the When a HTTP request is received trigger.
    • On the Source property, leave LogicApp.
    • On the Schema name property, select the correct schema from your project. In our case, Person.
  • And, of course, now you need to implement your business logic or call a child Logic App.
  • I ended up creating a try-catch pattern just to prove this functionality.
  • Now, you just need to test your solution.

I hope you find these architecture samples useful, and stay tuned for more BizTalk Server to Azure Integration Services.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

My Journey at the INTEGRATE Conference and why should you attend this event

INTEGRATE 2024 was recently announced, and although we still don’t have a list of speakers, I’m super excited about it. The event will take place on 10 & 11 June 2024 at Kings Place, London. You can register early and learn more about the event here: https://www.serverless360.com/events/integrate-2024.

This is one of my favorite tech conferences, and talking about it is like taking a trip down memory lane full of great memories, and it all started because of a pizza! At a time when there weren’t too many international tech events, a group of 5 tech guys from different countries in Europe – Nino Crudele (Italy), Saravana Kumar (UK), Steef-Jan Wiggers (Netherlands), Tord Glad Nordahl (Norway) and me (Portugal) – decided to gather in Piacenza to eat a pizza, and since we were there, why not do an event about integration? At least this way, we had an excuse/reason to go. And that, my friends, was the premise and the real reason behind one of the world’s biggest events on integration!

That pizza reunion gave birth to what we call “The BizTalk Crew” and our first event on May 24, 2012, in the OVERNET office in Milan, Italy (check this blog post). Four months later, we were traveling again, this time to Stavanger, Norway, for our second event, where we had our first guest speaker, Lex Hegt (see blog post). We called these events BIDs (BizTalk Innovation Days)!

As the UK was a more central and strategic country, Saravana convinced us to do a major event that we would call the BizTalk Summit (now known as INTEGRATE), and on January 16, 2013, we did our first BizTalk Summit at the Microsoft office in London, Victoria, where we had 128 attendees from more than 70 different companies across 16 countries (Austria, Belgium, Denmark, France, Germany, Italy, India, Ireland, Netherlands, Norway, Portugal, Spain, Sweden, Switzerland, UK, USA), more than 10 Microsoft Integration MVPs present (speakers, Q&A members, and attendees), and 3 members of the Microsoft product group.

Fact: I was the first speaker to talk about cloud integration technologies at INTEGRATE, in a session about Azure BizTalk Services EAI/EDI capabilities.

We did this roadshow in several countries (Portugal, Italy, the UK, and Norway) for a few years until we decided to dedicate ourselves to this single event: INTEGRATE. Saravana and his team took charge of this event and took it to another level! The rest is the story that most of us already know.

Here is a glimpse of my story on the INTEGRATE conference:

Why should you attend this event?

Learn from the experts in the field.

The #1 reason people should come to this event is to make themselves more valuable to their companies by learning new ideas and techniques from experts in the field. I’m not only talking about the speakers but also the attendees, your peers with real-world implementation experience from around the world, with similar or different approaches and needs. This conference will provide valuable information on integration topics, new trends, and technologies that will benefit your firm and your clients.

As I usually say, this conference has 360-degree coverage of enterprise integration topics! There is something for everyone regardless of job role: developers, administrators, architects, decision-makers, and so on, covering both brand new topics on the cloud and more legacy components on-premises, like BizTalk Server and Host Integration Server.

Last year, over 700 people worldwide joined INTEGRATE, so get insight and answers to your questions from these real-world experts.

The knowledge and experience of all the attendees, speakers, and product group members at these events was unreal!!! You will not find an opportunity like this every day.

At these events, speakers are usually available to chat and answer questions. And if you are bold, remember to pull out your camera and ask if you can take a photo with them.

Take this opportunity to connect face-to-face with experts and peers who will give you valuable insight into industry trends, meet influential colleagues, and generate new solutions to common problems, like increasing profitability and solving staffing challenges.

I love being around great thinkers…it can really get your mind flowing. We always have something new to learn from one another and take back with us.

Interact with the Microsoft Product Group and Most Valuable Professionals.

There is no other integration conference in the world with 12+ Microsoft product group members and 13+ Microsoft Most Valuable Professionals (MVPs) focused on Integration in a single conference.

DO NOT BE SHY and take this opportunity to have one-on-one inspiring conversations with the product group – they are eager to get your feedback – and with the MVPs.

Network, Connect, and Reconnect with Colleagues.

Take this fantastic opportunity to meet the people you have been following on Twitter and blogs and network with others interested in the same things you are. This is another chance for return attendees to hang out with the most intelligent people you know – and I’m not talking about the speakers! – and meet new ones.

Please don’t be afraid or shy; don’t wait for them, take your chances. Engage the people you want to meet by simply saying, “Hi, my name is…”. This experience can be a great morale booster for you. Lifelong friendships and connections have evolved from such conferences – I’m speaking from personal experience!

The next big thing…

What we can expect for the future… Connect directly with the Microsoft Product Team to gain exclusive insights into Microsoft’s upcoming plans. Engage in conversation and ask your burning questions. Discover how Microsoft is shaping the future.

Refresh and recharge… and have some fun.

These will be two days away from busy work schedules to learn, refresh, and recharge. And most importantly, have some fun!

Once you’ve soaked in all the knowledge from the speakers and breakout sessions, your day will be far from over. Use the evenings to connect with colleagues and experts in a relaxed, fun environment like a Bar, speaker dinner, or restaurant. Sometimes, the most interesting conversations happen at the end of each day in the Lobby Bar.

And then return to work reinvigorated.

This is one conference you can’t afford to miss! What are you waiting for? Go ahead and book it!

I will be there… and don’t hesitate to reach out and say hi! It doesn’t matter whether it is just to say hi and get to know each other, to ask a question, or to discuss a business opportunity.

See you there!

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 

Friday Fact: Is BizTalk Server dead? Hell, no!

This will probably be one of the most polemic Friday Facts, as this topic brings mixed feelings depending on who you are. For my part, I work in both worlds, enjoy this mix of capabilities, and always try to give my clients the real picture. But I will never say something untrue, simply because it is neither polite nor correct. We already live in a world with too much fake news! Let’s try to avoid a new “old” one.

No, BizTalk Server is not dead!

While there has been a shift towards cloud-based integration solutions, such as Azure Logic Apps and Azure Service Bus, BizTalk Server continues to be actively used and maintained by organizations worldwide.

BizTalk Server remains a reliable and mature integration platform that offers a wide range of capabilities for connecting systems, orchestrating business processes, and managing enterprise-level integrations. Many organizations still rely on BizTalk Server for complex integration requirements, especially in industries with prevalent legacy systems and on-premises integrations.

As with everything in life, at some point, everything will have an end, but don’t try to bury something that is still very much alive!

This rumor that BizTalk Server is dead is spread mainly by:

  • Other Microsoft competitors, who want to increase their business and gain opportunities with customers by unduly influencing them not to use BizTalk Server or to migrate from BizTalk Server to other middleware products.
  • Consulting companies that don’t have, never had, or have minimal technical expertise in BizTalk Server and take this opportunity to push Azure services or other technologies instead.

All this is combined with Microsoft’s marketing strategy, which clearly has Azure as its primary target, which is OK. All vendors these days are cloud-driven.

The first time I heard BizTalk Server was dead was approximately in 2006! Since then, many new products/services have been the future of integration and BizTalk Server replacements. Where are they?

I can give some samples:

  • Project OSLO, status: dead.
  • Windows Workflow Foundation, status: dead.
  • Azure BizTalk Services (aka Windows Azure Service Bus EAI and EDI Labs), status: dead.
  • Integration Service Environment (ISE) in Azure Logic Apps, status: in-life support – On August 31, 2024, the ISE resource will retire due to its dependency on Azure Cloud Services (classic), which retires at the same time.

Ok, but this is the last product version, and Microsoft is not investing in it!

First, the current version of BizTalk Server still has mainstream support for more than 4 years! Maybe more than many products you know!

Second, there has been no official announcement from Microsoft that this is the last version of the product. There simply isn’t one!

Third, I haven’t seen much investment from Microsoft in Logic Apps Consumption these last few years; does that mean Logic Apps Consumption is dead? Hell, NO! It just means they are investing heavily in Logic Apps Standard to close the missing feature gaps.

The time of products having a long roadmap that companies could rely on is over; get used to it. But that doesn’t mean that products are dead or deprecated.

The future is the cloud, and the cloud is where Microsoft is investing, with new features released almost daily. Microsoft has provided a clear migration path for customers who want to take this approach or for whom it makes sense to move, but it is not forcing them to shift away from BizTalk Server. The effort required for this migration journey will be reduced as the capabilities of Microsoft Azure increase.

And as I wrote in my book, maybe, or maybe not, the core issue is that BizTalk Server has already survived a decade longer than people intended or expected. In 2010, the word was clearly shared that the “End of BizTalk Server” was nigh and that the community should prepare for this event. Here we stand in 2024 with no clear guidance defining the end of BizTalk Server, and it is still a very valuable product… get used to it!

BizTalk Server boasts nearly 24 years of resilience. Despite numerous attempts to bring it down, it has proven to be an unstoppable force. The secret lies in the fact that BizTalk Server is nothing short of a powerhouse! In the spirit of T-800, also known as The Terminator, it echoes the iconic words:

I’ll be back

Too lazy to read? We’ve got you covered! Check out our video version of this content!

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 

BizTalk Server error: BizTalk Server cannot access SQL Server

A client called me this week to help with their BizTalk Server production environment. BizTalk was not running, and they needed to know why. While we were investigating the issue, we quickly saw the following error in the BizTalk Server Administration console:

BizTalk Server cannot access SQL Server. This could be due to one of the following reasons:

  1. Access permissions have been denied to the current user. Either log on as a user that has been granted permissions to SQL and try again, or grant the current user permission to access SQL Server.
  2. The SQL Server does not exist, or an invalid database name has been specified. Check the name entered for the SQL Server and database to make sure they are correct as provided during SQL Server installation.
  3. The SQL Server exists, but is not currently running. Use the Windows Service Control Manager or SQL Enterprise Manager to start SQL Server, and try again.
  4. A SQL database file with the same name as the specified database already exists in the Microsoft SQL Server data folder.

Internal error from OLEDB provider: “A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 – Could not open a connection to SQL Server)” (WinMgmt)

Cause

In this case, the error message clearly lays out good paths to troubleshoot and fix the issue. We knew that the first two and the last one didn’t fit our case because the SQL Server exists and no one had changed access permissions.

So, we immediately focused on point number three: The SQL Server exists, but is not currently running. We had SQL Server Management Studio open, and the instance appeared to be running, but when we checked the services, we realized that the SQL Server (BIZTALK) service was not running but stuck in Starting.

But any attempt on our part to quickly try to get the service running was futile. Even restarting the machine was unsuccessful.

This SQL Server behavior surprised me – to be clear, at this point we knew that this was not a BizTalk Server issue but a SQL Server issue that was affecting BizTalk Server – and it forced me to investigate one of the obvious causes that everyone says they monitor, but… the free space on the hard drive! And guess what? We had 0 free space on the C drive.

And that was the main reason for this issue in our case.

Solution

So, to solve this issue, we had to:

  • First, of course, the quick win was to free up some space on the hard drive – we were able to clean up 5 GB.
  • Then, start the SQL Server (BIZTALK) service and its dependencies again. After freeing up disk space, we didn’t find any issues in getting it started.
  • And, of course, we asked the IT team to extend the C drive with extra disk space.
  • Finally, we implemented a monitoring script to notify us about disk space issues: Monitoring disk spaces in your BizTalk environment with PowerShell (a minimal sketch of such a check follows this list).
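As a minimal sketch only (not the script from the post referenced above, and with a 10% threshold chosen just for this example), a PowerShell check for low free space could look like this:

# List local fixed drives and warn about the ones below a 10% free-space threshold
$threshold = 10
Get-CimInstance Win32_LogicalDisk -Filter "DriveType=3" | ForEach-Object {
    $freePercent = [math]::Round(($_.FreeSpace / $_.Size) * 100, 1)
    if ($freePercent -lt $threshold) {
        Write-Warning "Drive $($_.DeviceID) is low on space: $freePercent% free"
    }
}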

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.