Another day, another BizTalk Server Pipeline Component! Today, I decided to release a brand new component called the Remove XML Empty Nodes Pipeline Component.
For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.
Remove XML Empty Nodes Pipeline Component
As the name suggests, the Remove XML Empty Nodes Pipeline Component is a pipeline component that can be used to remove empty nodes present in an XML message. You can use this component in any stage of a receive or send pipeline.
This component has a single property that requires you to set up:
DisableRemoveBOM (boolean): This allows you to enable or disable the process of removing empty nodes from an XML message.
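For illustration only, here is a minimal sketch of the idea behind the component, using LINQ to XML. This is my own hypothetical example, not the component's actual implementation: it repeatedly removes elements that have no attributes, no child elements and no text, so parents that become empty in the process are removed as well.

using System.Linq;
using System.Xml.Linq;

public static class EmptyNodeRemoverSketch
{
    // Repeatedly removes elements with no attributes, no children and no text,
    // so that parents that become empty in the process are removed as well.
    public static XDocument Clean(XDocument doc)
    {
        bool removedSomething;
        do
        {
            var emptyElements = doc.Descendants()
                .Where(e => !e.HasAttributes
                         && !e.HasElements
                         && string.IsNullOrWhiteSpace(e.Value))
                .ToList();

            removedSomething = emptyElements.Count > 0;
            emptyElements.ForEach(e => e.Remove());
        } while (removedSomething);

        return doc;
    }
}

// Example:
//   var doc = XDocument.Parse("<Order><Id>1</Id><Notes></Notes><Extra><Empty/></Extra></Order>");
//   EmptyNodeRemoverSketch.Clean(doc);   // -> <Order><Id>1</Id></Order>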
How to install it
As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is, by default:
Like all the previous components, to use this pipeline component I recommend you create one or several generic pipelines that can be reused by all your applications and add this pipeline component in any required stage of a send or receive pipeline.
Download
THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.
You can download the Remove XML Empty Nodes Pipeline Component from GitHub here:
Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son!
Author: Sandro Pereira
Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing integration scenarios both on-premises and in the cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server, and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO, etc.
He is a regular blogger, international speaker, and technical reviewer of several BizTalk books, all focused on integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.
Time to get back to BizTalk Server and publish new resources on this amazing product and also return to one of my old pet projects: the BizTalk Pipeline Components Extensions Utility Pack.
Today, I decided to create a brand new component called the Send File To a Date-Based Structure Encoder Pipeline Component.
For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.
Send File To a Date-Based Structure Encoder Pipeline Component
The Send File To a Date-Based Structure Encoder Pipeline Component is a pipeline component that can be used in a send pipeline, as the name suggests, inside the Encode stage, and it allows you to send an outbound file to a dynamic folder path organized as a date tree:
yyyyMMdd
In other words, you will define the base path on the adapter URI, and then this component will use that base path to add a dynamic structure inside that path based on the date.
This component doesn’t require any property configuration.
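I have not inspected the component's source, so take this purely as a hedged sketch of one plausible mechanism, not the component's confirmed implementation: in the Encode stage, the outbound transport location could be read from the message context and a yyyy\MM\dd subfolder appended to it before the adapter writes the file (whether a static FILE send port honors this rewrite is itself an assumption):

using System;
using System.IO;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class DateBasedFolderEncoderSketch
{
    private const string SystemPropsNs =
        "http://schemas.microsoft.com/BizTalk/2003/system-properties";

    // Would be called from IComponent.Execute: rewrites the destination so the
    // file lands under <base path>\yyyy\MM\dd\<original file name>.
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        var location = pInMsg.Context.Read("OutboundTransportLocation", SystemPropsNs) as string;
        if (!string.IsNullOrEmpty(location))
        {
            DateTime now = DateTime.UtcNow;
            string newLocation = Path.Combine(
                Path.GetDirectoryName(location),
                now.ToString("yyyy"),
                now.ToString("MM"),
                now.ToString("dd"),
                Path.GetFileName(location));

            pInMsg.Context.Write("OutboundTransportLocation", SystemPropsNs, newLocation);
        }
        return pInMsg;
    }
}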
How to install it
As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is, by default:
Like all the previous components, to use this pipeline component I recommend you create one or several generic pipelines that can be reused by all your applications and add the pipeline component in the Encode stage. The component can only be used in send pipelines.
Download
THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.
You can download the Send File To a Date-Based Structure Encoder Pipeline Component from GitHub here:
Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son!
I’m back to another blog post about BizTalk Server! I know that my latest post was about Azure Logic Apps, and you can count on seeing many more in the future. The reason is that I work both on-premises with BizTalk Server and in the cloud with Azure Integration Services… but relax, I will continue to post many things about BizTalk Server. BizTalk Server is not dead. It is very much alive, contrary to what many think!
For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.
CSV Structure Validation Pipeline Component
The CSV Structure Validation Pipeline Component is a pipeline component that can be used to validate the structure of a basic CSV or flat-file document before it is converted to an XML message. Of course, this same strategy can be used in more complex scenarios.
This is the list of properties that you can set up on the CSV Structure Validation pipeline component:
DelimiterChar: Defines the delimiter character inside a line. Sample value: ;
NumberFieldsPerLine: Number of fields expected per line. Sample value: 3
Note that this component takes for granted that the line delimiter is CRLF (Carriage Return/Line Feed).
If we take this example:
one;two;t
1;2;2
Then we need to configure the port in the following way:
If we receive an invalid file, the component will raise an error, suspending the message in the BizTalk Server Administration Console with, for example, the following error message:
Invalid format data in the document. Line number 3 has 2 fields, and it should be expected 3 fields
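As a rough sketch of the kind of check involved, assuming CRLF line delimiters as the component does (this is illustrative code of mine, not the component's actual implementation), the validation could look like this:

using System;

public static class CsvStructureValidatorSketch
{
    // Throws when any non-empty line does not contain the expected number of fields.
    public static void Validate(string content, char delimiterChar, int numberFieldsPerLine)
    {
        string[] lines = content.Split(new[] { "\r\n" }, StringSplitOptions.None);

        for (int i = 0; i < lines.Length; i++)
        {
            if (lines[i].Length == 0) continue;   // tolerate a trailing empty line

            int fields = lines[i].Split(delimiterChar).Length;
            if (fields != numberFieldsPerLine)
            {
                throw new Exception(string.Format(
                    "Invalid format data in the document. Line number {0} has {1} fields, and it should be expected {2} fields",
                    i + 1, fields, numberFieldsPerLine));
            }
        }
    }
}

// Example: CsvStructureValidatorSketch.Validate(fileContent, ';', 3);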
If you are wondering why we would create a pipeline component to validate the structure of a CSV or flat-file document, and whether we could instead use a Flat-File schema to do this structure validation:
The answer is yes and no! In many cases and with many requirements, we don’t need to create a custom pipeline component; a Flat-File schema can address the goal. But in other scenarios, doing CSV validation with the Flat-File schema alone is not enough. However, I will leave that “discussion” to my next BizTalk Server best practices, tips and tricks.
How to install it
As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is, by default:
Like all the previous components, to use this pipeline component I recommend you create one or several generic pipelines that can be reused by all your applications and add the CSV Structure Validation Pipeline Component in the stage you desire. The component can be used in any stage of a receive or send pipeline.
Download
THIS COMPONENT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
You can download the CSV Structure Validation Pipeline Component from GitHub here:
And once again, the Azure Logic Apps team is looking to learn how you are using BizTalk Server artifacts, this time asking what kind of BizTalk Server Custom Pipeline Components you are using in your Enterprise Applications, to better understand how Microsoft can best address future integration needs within the Azure Integration Services platform and maybe bring identical capabilities into it.
No matter if you are considering migrating to Azure Integration Services in the future or staying on-premises for a couple more years (or forever), this is ANOTHER great opportunity to provide feedback to the Microsoft Integration team and positively influence the outcome of new features.
So, for customers that are using BizTalk Server or were using BizTalk Server in the past, this survey is for you! If you are a consulting company providing services to clients using BizTalk Server, this survey is also for you! The Azure Logic Apps team would love to hear from you!!! They want, AGAIN, to poke around your brain and your world and gather all your feedback regarding BizTalk Server:
How many Custom Pipeline Components do you have?
Why do you use Custom Pipeline Components?
Within your Custom Pipeline Components, do you leverage message streams?
For messages that are processed by your Custom Pipeline Components, what is the largest file you process?
What is your comfort level with the Custom Pipeline Components that your organization has?
As we consider how to best support Custom Pipeline Component functionality (capabilities) in Azure Integration Services, is there anything that you would like us to consider?
Once again, don’t complain in the future about a lack of features that would suit you better. The team is interested in how they can best support customers transitioning integration workloads to Azure Integration Services (AIS)… not going to move to Azure? The survey does not take that long to respond to, so if you are using BizTalk Server, try to respond nevertheless; by doing that, you can help other customers on their journey.
I did my part!
Please fill out the following survey to help Azure Logic Apps:
And ONCE AGAIN! For the bad mouths of this universe or for those who like to create rumors because they have nothing to say… this survey does not mean that BizTalk Server is dead! The goal is to help Microsoft prioritize upcoming investments in Azure Integration Services (AIS).
This time I decided to create a brand new component called the Message Archive Pipeline Component.
For those who aren’t familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk’s out-of-the-box pipeline capabilities.
Message Archive Pipeline Component
The Message Archive Pipeline Component is a pipeline component that can be used to archive incoming/outgoing messages from any adapter into a local or shared folder. It is very similar to, and provides the same capabilities as, the already existing BizTalk Server Local Archive pipeline component:
It can be used in any stage of a receive pipeline or send pipeline;
It can be used in multiple stages of a receive pipeline or send pipeline;
It provides an option for you to specify the location path where you want to save the message: local folder, shared folder, or network folder.
It can be used from any adapter:
If the adapter promotes the ReceivedFileName property, like the File or FTP adapters do, the component will take this value into consideration and save the message with the same name;
Otherwise, it will use the MessageID, saving the file with the MessageID as its name, without an extension.
So what are the differences between them?
The significant differences between these two components are that the Message Archive Pipeline Component allows you to:
Set the filename using macros like %datetime%, %ReceivePort%, %Day%, etc.
For example, %ReceivePort%_%MessageID%.xml
Set the archive file path once again using macros:
for example, C:\BizTalkPorts\Archive\ARCHIVE\%Year%\%Month%\%Day%
If you don’t want to overwrite existing files, you can specify an additional Macro to distinguish them.
For example _%time%
You can set up this component for high performance using forward-only streaming best practices.
In short, this means developing your pipeline components so that they do their logic either as a custom stream implementation or by reacting to the events available to you through the Microsoft.BizTalk.Streaming.dll stream classes, without ever keeping anything except the small stream buffer in memory and without ever seeking the original stream. This is a best practice from the perspective of resource utilization, both memory and processor cycles.
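To make that idea concrete, here is a minimal sketch of my own (not the component's actual code) of a forward-only, pass-through stream: every buffer that downstream components read from the body is also written to the archive file, so the message is archived without being buffered in memory and without seeking the original stream.

using System;
using System.IO;

public class ArchivingReadStreamSketch : Stream
{
    private readonly Stream _source;
    private readonly FileStream _archive;
    private long _position;

    public ArchivingReadStreamSketch(Stream source, string archivePath)
    {
        _source = source;
        _archive = new FileStream(archivePath, FileMode.Create, FileAccess.Write);
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int read = _source.Read(buffer, offset, count);
        if (read > 0)
        {
            _archive.Write(buffer, offset, read);   // tee the bytes as they flow by
            _position += read;
        }
        else
        {
            _archive.Flush();                       // end of stream: archive file is complete
        }
        return read;
    }

    // Forward-only: no seeking, no length, no writes.
    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position { get => _position; set => throw new NotSupportedException(); }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
    public override void Flush() => _archive.Flush();

    protected override void Dispose(bool disposing)
    {
        if (disposing) _archive.Dispose();
        base.Dispose(disposing);
    }
}

// Inside IComponent.Execute (sketch):
//   var wrapped = new ArchivingReadStreamSketch(pInMsg.BodyPart.GetOriginalDataStream(), archivePath);
//   pInMsg.BodyPart.Data = wrapped;
//   pContext.ResourceTracker.AddResource(wrapped);
//   return pInMsg;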
This is the list of properties that you can set up on the archive pipeline component:
OverwriteExistingFile: Defines whether the archive file is to be overwritten if it already exists. Sample values: true/false
ArchivingEnabled: Defines whether the archive capabilities are enabled or disabled. Sample values: true/false
ArchiveFilePath: Archive folder path. You can use macros to dynamically define the path. Sample value: C:\Archive\%Year%\%Month%\%Day%
ArchiveFilenameMacro: File name template. If empty, the source file name or MessageId will be used. You can use macros to dynamically define the filename. Sample value: %ReceivePort%_%MessageID%.xml
AdditionalMacroIfExists: Suffix added if a file already exists and OverwriteExistingFile is set to false. If empty, the MessageId will be used. You can use macros to dynamically define this suffix. Sample value: _%time%
OptimizeForPerformance: Setting to apply high performance on the archive. Sample values: true/false
Available macros
This is the list of macros that you can use on the archive pipeline component:
%datetime%: Coordinated Universal Time (UTC) date and time in the format YYYY-MM-DDThhmmss (for example, 1997-07-12T103508).
%MessageID%: Globally unique identifier (GUID) of the message in BizTalk Server. The value comes directly from the message context property BTS.MessageID.
%FileName%: Name of the file from which the File adapter read the message. The file name includes the extension and excludes the file path, for example, Sample.xml. When substituting this property, the component extracts the file name from the absolute file path stored in the FILE.ReceivedFileName context property. If the context property does not have a value, the MessageId will be used.
%FileNameWithoutExtension%: Same as %FileName%, but without the extension.
%FileNameExtension%: Same as %FileName%, but in this case only the extension, including the dot: .xml
%Day%: Current UTC day.
%Month%: Current UTC month.
%Year%: Current UTC year.
%time%: UTC time in the format hhmmss.
%ReceivePort%: Receive port name.
%ReceiveLocation%: Receive location name.
%SendPort%: Send port name.
%InboundTransportType%: Inbound transport type.
%InterchangeID%: InterchangeID.
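Purely as an illustration of how this kind of macro substitution can work (my own guess at the mechanics, not the component's actual code), the date/time macros boil down to string replacements driven by the current UTC time, while the remaining macros are fed from the message context:

using System;

public static class ArchiveMacroResolverSketch
{
    // messageId, receivePort and fileName would come from the message context
    // (BTS.MessageID, BTS.ReceivePortName, FILE.ReceivedFileName) in the real component.
    public static string Resolve(string template, string messageId, string receivePort, string fileName)
    {
        DateTime utcNow = DateTime.UtcNow;

        return template
            .Replace("%datetime%", utcNow.ToString("yyyy-MM-ddTHHmmss"))
            .Replace("%Year%", utcNow.ToString("yyyy"))
            .Replace("%Month%", utcNow.ToString("MM"))
            .Replace("%Day%", utcNow.ToString("dd"))
            .Replace("%time%", utcNow.ToString("HHmmss"))
            .Replace("%MessageID%", messageId)
            .Replace("%ReceivePort%", receivePort)
            .Replace("%FileName%", fileName);
    }
}

// Example:
//   Resolve(@"C:\Archive\%Year%\%Month%\%Day%\%ReceivePort%_%MessageID%.xml", msgId, "rcvOrders", "Sample.xml")
//   -> C:\Archive\<year>\<month>\<day>\rcvOrders_<guid>.xml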
How to install it
As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is, by default:
In this particular component, we need to have this DLL:
BizTalk.PipelineComponents.MessageArchive.dll
How to use it
Like all the previous components, to use this pipeline component I recommend you create one or several generic pipelines that can be reused by all your applications and add the Message Archive Pipeline Component in the stage you desire. The component can be used in any stage of a receive or send pipeline.
Download
THIS COMPONENT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
You can download the Message Archive Pipeline Component from GitHub here:
For those who are not familiar with it, this project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines, which will extend BizTalk’s out-of-the-box pipeline capabilities.
Receive Location Name Property Promotion Pipeline Component
Receive Location Name Property Promotion Pipeline Component is a simple pipeline component to promote the Receive Location Name (ReceiveLocationName) property to the context of the message. Several BizTalk Server context properties are not promoted by default, which means that they are not available for routing.
One such property is the ReceiveLocationName property. While the ReceivePortName property is available in the BTS namespace, the ReceiveLocationName property is not promoted. It cannot be used for routing, nor can it be accessed from inside an orchestration.
Knowing that behavior, my team and I created this project as a proof of concept to explain how you can promote properties to the context of the message.
Create a Property schema
To promote properties to the context of the message, we will need a Property Schema with these properties. In our case, we will add only one property, called ReceiveLocationName:
Property Name: ReceiveLocationName
Data Type: xs:string
Property Schema Base: MessageContextPropertyBase
Note: A MessageContextPropertyBase property means that the XPath may or may not exist.
Create a Pipeline Component
To actually promote the properties to the context of the message, we need to create a pipeline component to do the trick.
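As an illustration (a minimal sketch, not necessarily the project's exact code), the promotion inside IComponent.Execute could look like the following. The property-schema namespace shown is hypothetical, so replace it with the target namespace of the property schema you created above, and this assumes the receive location name is already written, unpromoted, to the context under the BTS system-properties namespace, as described earlier:

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class ReceiveLocationNamePromoterSketch
{
    private const string BtsSystemPropsNs =
        "http://schemas.microsoft.com/BizTalk/2003/system-properties";

    // Hypothetical namespace: use the target namespace of your own property schema.
    private const string PropertySchemaNs =
        "https://BizTalk.PipelineComponents.PropertySchemas.ReceiveLocationName";

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Read the value that is written (but not promoted) to the message context...
        object value = pInMsg.Context.Read("ReceiveLocationName", BtsSystemPropsNs);

        // ...and promote it against our own property schema so it becomes routable.
        if (value != null)
        {
            pInMsg.Context.Promote("ReceiveLocationName", PropertySchemaNs, value);
        }

        return pInMsg;
    }
}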
Once you deploy the property schema and a receive pipeline component containing the Receive Location Name Property Promotion Pipeline Component, you can start to apply Content-based routing using the Receive Location Name.
Download
THIS COMPONENT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
You can download the Receive Location Name Property Promotion Pipeline Component from GitHub here:
For those who are not familiar with it, this project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines, which will extend BizTalk’s out-of-the-box pipeline capabilities.
BizTalk PDF2Xml Pipeline Component
BizTalk PDF2Xml Pipeline Component is, as the name suggests, a decode component that transforms the content of a PDF document into an XML message that BizTalk can understand and process. The component uses the itextsharp library to extract the PDF content. The original source code was available on CodePlex (pdf2xmlbiztalk.codeplex.com), but I couldn’t validate who the original creator was. The component first transforms the PDF content to HTML, and then, using an external XSLT, applies a transformation to convert the HTML into a known XML document that BizTalk Server can process.
My team and I kept that behavior, but we extended this component and added the capability to also, by default, convert it to a well-known XML without the need for you to apply an XSLT transformation directly on the pipeline.
How does this component work?
This is the list of properties that you can set up on the PDF2XML pipeline component:
InternalProcessToHTML: Value to decide if you want the component to transform the PDF content to HTML or XML. Sample values: True/False
IsToApplyTrasnformation: Value to decide if you want to apply a transformation on the pipeline component or not. Sample values: True/False
XsltFilePath: Path to an XSLT transformation file. Sample value: C:\transf\mymap.xslt
Once you pass the PDF through this component, and depending on how you configure it, the outcome can be:
All PDF content in an HTML format;
All PDF content in an XML format;
Part of the PDF content in an XML format (if you apply a transformation).
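For context, the extraction side of this approach can be sketched roughly as follows: text extraction with iTextSharp into a simple XML structure, plus an optional XSLT pass mirroring the XsltFilePath property. This is a simplified illustration under my own assumptions, not the component's actual code (which goes through HTML first):

using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Xsl;
using iTextSharp.text.pdf;
using iTextSharp.text.pdf.parser;

public static class PdfToXmlSketch
{
    // Extracts the text of every page into a very simple XML document.
    public static string ExtractToXml(Stream pdfStream)
    {
        var sb = new StringBuilder("<PdfContent>");
        var reader = new PdfReader(pdfStream);
        try
        {
            for (int page = 1; page <= reader.NumberOfPages; page++)
            {
                string text = PdfTextExtractor.GetTextFromPage(reader, page);
                sb.AppendFormat("<Page number=\"{0}\"><![CDATA[{1}]]></Page>", page, text);
            }
        }
        finally
        {
            reader.Close();
        }
        sb.Append("</PdfContent>");
        return sb.ToString();
    }

    // Optional second step, mirroring the component's XsltFilePath property.
    public static string Transform(string xml, string xsltFilePath)
    {
        var xslt = new XslCompiledTransform();
        xslt.Load(xsltFilePath);

        var output = new StringBuilder();
        using (var input = XmlReader.Create(new StringReader(xml)))
        using (var writer = XmlWriter.Create(output))
        {
            xslt.Transform(input, writer);
        }
        return output.ToString();
    }
}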
Unfortunately, in my initial tests, this component worked well with some PDF files, but with others it simply ignored their content. Nevertheless, I am making it available as a proof of concept.
Download
THIS COMPONENT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
You can download the BizTalk PDF2Xml Pipeline Component from GitHub here:
For those who follow me, you may already know my BizTalk Pipeline Components Extensions Utility Pack project, available on GitHub. The project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines, which will extend BizTalk’s out-of-the-box pipeline capabilities.
This month, my team and I updated this project with another new component: the ODBC File Decoder Pipeline Component.
ODBC File Decoder Pipeline Component
ODBC File Decoder Pipeline Component is, as the name suggests, a decode component that you can use in a receive pipeline to process DBF or Excel files. Still, it may be possible to process other ODBC types (perhaps requiring minor adjustments). The component uses basic ADO.NET to parse the incoming DBF or Excel files into an XML document.
While consuming DBF files is not a typical scenario, we can’t say the same for Excel files. We often find these requirements, and there isn’t any out-of-the-box way to process these files.
Honestly, I don’t know the original creator of this custom component. I came across this old project that I found interesting while organizing my hard drives. However, when I tested it in BizTalk Server 2020, it wasn’t working correctly, so my team and I improved and organized the structure of the code of this component to work as expected.
How does this component work?
If we take as an example an Excel file (.xls) that has a table with the following columns:
FirstName
LastName
Address
Position
We can use the ODBC File Decoder Pipeline Component to process these documents. First, we need to create a custom pipeline and add this component to the Decode stage. Once we deploy this pipeline, we can configure it as follows to be able to process these types of Excel documents:
ConnectionString: ODBC Connection String
For Excel documents: Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=Excel 8.0;
For DBF: Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=dBASE IV;
DataNodeName: Rows node name for the generated XML message
For example: Line
Filter: Filter for Select Statement
Leave it empty
NameSpace: Namespace for the generated XML message
For example: http://ODBCTest.com
RootNodeName: Root node name for the generated XML message
For example: TesteXMLResult
SqlStatement: Select Statement to Read ODBC Files
For example: SELECT * FROM [Sheet1$]
TempDropFolderLocation: Support temp folder for processing the ODBC Files
For example: c:\Temp\odbcfiles
TypeToProcess: Type of file being Processed
0 to process Excel
1 to process DBF
The outcome will be an XML message similar to this:
<?xml version="1.0" encoding="utf-8"?><ns0:TesteXMLResult xmlns:ns0="http://ODBCTest.com">
<Line>
<FirstName>Fred</FirstName>
<LastName>Black</LastName>
<Address>187 Main Street</Address>
<Position>Sales Lead</Position>
</Line>
<Line>
<FirstName>John</FirstName>
<LastName>Smith</LastName>
<Address>182 Front Street</Address>
<Position>Marketing</Position>
</Line>
<Line>
<FirstName>Sally</FirstName>
<LastName>White</LastName>
<Address>183 Main Street</Address>
<Position>Marketing</Position>
</Line>
</ns0:TesteXMLResult>
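Since the component is described as using basic ADO.NET, the core of the processing is presumably something along these lines. This is a hedged sketch of mine with made-up names, not the component's exact code; a DataSet makes it easy to go from the SELECT result to XML with the configured root node name, row node name and namespace:

using System.Data;
using System.Data.OleDb;
using System.IO;

public static class OdbcFileDecoderSketch
{
    public static string ToXml(
        string filePath,          // temp copy of the incoming Excel/DBF file
        string connectionString,  // e.g. Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=Excel 8.0;
        string sqlStatement,      // e.g. SELECT * FROM [Sheet1$]
        string rootNodeName,      // e.g. TesteXMLResult
        string dataNodeName,      // e.g. Line
        string ns)                // e.g. http://ODBCTest.com
    {
        // The provider still needs to know which file to open.
        string fullConnectionString = connectionString + "Data Source=" + filePath + ";";

        var dataSet = new DataSet(rootNodeName) { Namespace = ns };

        using (var connection = new OleDbConnection(fullConnectionString))
        using (var adapter = new OleDbDataAdapter(sqlStatement, connection))
        {
            adapter.Fill(dataSet, dataNodeName);   // each row becomes a <Line> element
        }

        using (var writer = new StringWriter())
        {
            dataSet.WriteXml(writer);
            return writer.ToString();
        }
    }
}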
Does it work with Xlsx files?
Honestly, I didn’t try it yet. I didn’t have that requirement, and I only remember this scenario now that I’m writing this post, but it should be able to process it. The only thing I know is that we need to use a different connection string, something similar to this:
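For what it’s worth, and untested with this component: .xlsx files are typically read through the ACE OLE DB provider rather than Jet, with a connection string along the lines of Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties="Excel 12.0 Xml"; which also means the Access Database Engine redistributable would need to be installed on the BizTalk Server machine.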
Continuous Integration/Continuous Delivery is a development practice that enables you to accelerate your deployments and delivery time to the customer, by reliably releasing software at any time and without manual intervention.
For this post series, I will explain how to enable this practice, oriented to Logic Apps and Azure Pipelines.
We will start by building the Logic App, using Visual Studio. I will not cover Logic Apps Preview because, since it’s still a preview feature, many changes can happen and render all this useless.
As you may know, to create Logic Apps in Visual Studio, there are a few requirements, such as:
Visual Studio 2015, 2017, 2019 or greater, if available
Azure SDK
Azure Logic Apps Tools for Visual Studio Extension (if using VS)
An active Azure subscription
Time, will and patience.
After you have all this installed, you can begin to create and let your creativity flow!
We’ll start from scratch. Open your VS and start a new project by selecting the Azure Resource Group C# template and, after that, the Logic App template.
You will end up with a new project (and solution, if that’s the case) with 3 files. The PowerShell file is the deployment script that VS uses to automate the ARM deployment. Only in special cases do you need to fiddle with this file.
The other two files are the Logic App code and the Parameters file. You will need to create a new one to be used as a template for the Azure Pipeline. So go ahead, copy the Parameters file, and change the name to LogicApp.parameters.template.json.
You should end up with something like this.
This Parameters Template file will contain our Tokens, which will be replaced in the Pipeline using the “Replace Tokens” Task. In the coming posts, I will explain how it works and why we’re using it.
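For context, the community “Replace Tokens” task looks, by default, for tokens in the #{VariableName}# format (the prefix and suffix are configurable), so a tokenized value in this template file would look something like #{LogicAppName}#, with a matching variable defined in the pipeline.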
For the sake of simplicity, I’ll just use the Service Bus connector, where depending on the input, I’ll send a message to the Queue with the provided information.
After creating the connection, you will see that, in the back code, several parameters and a Resource node were created as well; they contain the link and inputs for this connection.
Even when working in a single Resource Group, it is a good practice to prepare this for CI/CD, because even though it’s static, connections change, and instead of having to redo all of it, you just need to re-deploy the pipeline with the new configurations.
We will not be making any changes to the Resource node, but to the action path and parameters. This will define that, instead of having a fixed value, it will point to the parameter itself, making it possible to have an ARM parameter configurable in the Pipeline.
BizTalk Server 2020 – 20 days, 20 posts – day 19. To finalize this topic about the BizTalk Pipeline Components Extensions Utility Pack project, here is another brand new component: the XML Namespace Stripper Pipeline Component. I actually created this component for a need in a recent RosettaNet project.
XML Namespace Management Pipeline Component
This custom XML Namespace Stripper Pipeline Component is a pipeline component for BizTalk Server which can be used in a send pipeline (Encode stage) to remove all namespaces and prefixes from an XML message.
Once again, the goal of this component is to clean up all namespaces and prefixes present in XML outbound messages, transforming the message from this:
To use this pipeline component in your projects, you just need to copy the NamespaceStripper.dll file into the Pipeline Components folder that exists in the BizTalk Server installation directory:
You do not need to add this custom pipeline component to be used by the BizTalk Runtime to the Global Assembly Cache (GAC).
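Although the before/after samples are not reproduced here, the idea can be sketched as follows; this is an illustrative approach of mine, not necessarily how NamespaceStripper.dll does it internally:

using System.Linq;
using System.Xml.Linq;

public static class NamespaceStripperSketch
{
    // Drops the namespace from every element and removes namespace declarations
    // and prefixes from every attribute.
    public static XDocument Strip(XDocument doc)
    {
        foreach (XElement element in doc.Descendants())
        {
            element.Name = element.Name.LocalName;

            var attributes = element.Attributes()
                .Where(a => !a.IsNamespaceDeclaration)
                .Select(a => new XAttribute(a.Name.LocalName, a.Value))
                .ToList();

            element.ReplaceAttributes(attributes);
        }
        return doc;
    }
}

// Example:
//   <ns0:Order xmlns:ns0="http://demo">...</ns0:Order>  becomes  <Order>...</Order>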
What is BizTalk Pipeline Components Extensions Utility Pack?
BizTalk Pipeline Components Extensions Utility Pack is a set of libraries with several custom pipeline components that can be used in receive and send pipelines, which will provide an extension of BizTalk’s out-of-the-box pipeline capabilities.