BizTalk Oracle Adapter error: The assembly required for type mapping not found.

Today while I was developing a new BizTalk Server solution that communicates with the Oracle database, I encountered a familiar issue that I forgot to document in the past:

Microsoft.ServiceModel.Channels.Common.MetadataException: The assembly required for type mapping not found.

That forced me, once again, to lose time not only remembering the cause but also finding out how to solve it.

Cause

This error happens when you try to call an Oracle procedure, function, or package that contains User-Defined Types (UDTs). The UDTs can be present in the following artifacts:

  • Interface tables and interface views containing UDT columns.
  • Database tables and views containing UDT columns.
  • Packages, stored procedures, and functions containing UDT parameters.

Oracle UDTs help represent complex entities as a “single” object that can be shared among the applications.

BizTalk Server supports Oracle UDTs but, unlike SQL Server, which natively supports these types, with Oracle we need to configure additional properties and generate a UDT DLL.

When this error occurs, one of two things is usually the reason:

  • You forgot to configure, in the Schema generation, the following properties:
    • GeneratedUserTypesAssemblyFilePath
    • GeneratedUserTypesAssemblyKeyFilePath
  • Or you forgot to configure, at runtime (that is, on the receive location or send port), the following property:
    • userAssembliesLoadPath

Solution

To solve this issue, we need to make sure we perform the following steps:

  • Create a signed assembly (DLL) for the User-Defined Types (UDTs) defined in Oracle, matching the types interpreted by the WCF-Oracle Adapter. To do this, when generating the schemas from the Consume Adapter Service option, the assembly must be created by specifying:
    • On the GeneratedUserTypesAssemblyFilePath property, the full path and name of the DLL that the wizard will create for us.
    • And on the GeneratedUserTypesAssemblyKeyFilePath property, the path to the strong name key (.snk) file that the wizard will use to sign the DLL.

Both these properties are present in the UDT .NET Type Generation – Design Time section of the Binding Properties.

This will create a UDT DLL at the path we defined, which we then need to use at runtime:

Once we deploy our schemas and create the receive location or send port, we then need to make sure that we configure the following property with the path to the UDT DLL:

  • userAssembliesLoadPath 

Notice: even if you deploy the UDT DLL to the GAC (which is advisable), you still need to configure the path to the UDT DLL in this property.
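
If you want to double-check that the adapter will actually find the assembly at runtime, here is a minimal PowerShell sketch (the path and DLL name are hypothetical, so use the values you configured above) that confirms the UDT DLL exists and is strong-name signed:

# Minimal sanity check - the path/DLL name below are placeholders; use the values configured
# in GeneratedUserTypesAssemblyFilePath / userAssembliesLoadPath.
$udtDllPath = 'C:\BizTalk\UDTs\OracleUdtTypes.dll'

if (Test-Path $udtDllPath) {
    # A strong-name signed assembly has a non-empty public key token.
    $name  = [System.Reflection.AssemblyName]::GetAssemblyName($udtDllPath)
    $token = ($name.GetPublicKeyToken() | ForEach-Object { $_.ToString('x2') }) -join ''
    if ($token) { "Found $($name.Name), PublicKeyToken=$token" }
    else        { Write-Warning "$($name.Name) is not strong-name signed." }
} else {
    Write-Warning "UDT DLL not found at $udtDllPath - the adapter will raise the type mapping error."
}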

After these steps, you can successfully communicate with Oracle using a procedure, function, or package that contains User-Defined Types.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

PowerShell script to identify all SQL V1 actions and triggers inside Logic Apps Consumption

A few days ago, Luis Rigueira created a Kusto query to identify all Logic App Consumption workflows that use the SQL V1 actions and triggers that will soon be deprecated (end of March 2024). The query also tries to identify the actions and triggers themselves. It works decently if a Logic App only has actions or triggers at the first level (not inside If, Switch, Scope, and so on), but if a Logic App has nested actions, which is quite common, the query can only make a best effort: it will identify all the Logic Apps, but it will not provide the names of the actions. The reason is that it is quite difficult to loop through the entire Logic App definition (JSON) with a Kusto query.

Don’t get me wrong: that query is awesome for identifying all the Logic Apps we need to address to fix those actions. But if we need to identify every action with that “problem” to better estimate the work involved, then that Kusto query is not the best option. To solve this, we decided to create a PowerShell script that not only identifies all Logic Apps Consumption using SQL V1 actions and triggers but also identifies the names of those actions and triggers for you, providing a good report that you can use to plan and estimate your work.

# Function to extract actions recursively
function Get-ActionsAndTriggers {
    param (
        $node
    )
    $actionsAndTriggers = @()
    foreach ($key in $node.psobject.Properties.Name) {
        if ($node.$key.type -eq "ApiConnection") {
            if ($node.$key.inputs.path -like "*/datasets/default*" -and $node.$key.inputs.host.connection.name -like "*sql*") {
                $actionsAndTriggers += $key
            }
        } elseif ($node.$key -is [System.Management.Automation.PSCustomObject]) {
            $actionsAndTriggers += Get-ActionsAndTriggers -node $node.$key
        }
    }
    return $actionsAndTriggers
}
 
# Retrieve all Logic Apps within the subscription
$logicApps = Get-AzResource -ResourceType Microsoft.Logic/workflows
 
# Iterate through each Logic App and extract actions and triggers
foreach ($logicApp in $logicApps) {
    # Retrieve Logic App definition
    $logicAppDefinition = Get-AzResource -ResourceId $logicApp.ResourceId -ExpandProperties
 
    # Extract actions and triggers from the Logic App definition
    $allActionsAndTriggers = Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.triggers
    $allActionsAndTriggers += Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.actions
 
    # Display the Logic App name if filtered actions and triggers were found
    if ($allActionsAndTriggers.Count -gt 0) {
        Write-Host "Logic App: $($logicApp.Name) - RG: $($logicApp.ResourceGroupName)" -ForegroundColor Red
        # Display the list of filtered actions and triggers
        Write-Host "Filtered Actions and Triggers:"
        $allActionsAndTriggers
        Write-Host ""
    }
}

The great thing about this script is that it also identifies all nested actions (actions inside other actions, like If, Scope, Switch, and so on).

If you are wondering whether we can do the same and identify all SQL V2 actions and triggers: don't worry, we have you covered; check the download section. In fact, with small changes to this script, you can use it for all types of connectors, as shown in the sketch below.
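
For example, here is a hedged sketch of the only change needed inside the Get-ActionsAndTriggers function: swap the SQL filter for the connector you are after (the Service Bus pattern below is just an illustration, so adjust it to the connector you care about):

# Inside Get-ActionsAndTriggers, replace the SQL V1 filter:
#   $node.$key.inputs.path -like "*/datasets/default*" -and $node.$key.inputs.host.connection.name -like "*sql*"
# with a filter for another connector, for example (illustrative pattern):
if ($node.$key.type -eq "ApiConnection" -and
    $node.$key.inputs.host.connection.name -like "*servicebus*") {
    $actionsAndTriggers += $key
}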

Download

THESE COMPONENTS ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download the PowerShell script to identify all SQL V1 actions and triggers inside Logic Apps Consumption from GitHub here:

If you want to identify SQL V2 actions and triggers, then download this script from GitHub:

Huge thanks to Luis Rigueira for working with me on these scripts.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for my son! 


Friday (funny) Fact: There is no size limit for the Logic App parameter name

In Azure Logic Apps, you can abstract values that might change in workflows across development, test, and production environments by defining parameters. A Logic App parameter stores values that can be reused throughout a Logic App workflow. These parameters allow for a more flexible and maintainable configuration of logic apps, making it easier to update values without changing the actual workflow’s logic.

Parameters can store various types of data, such as strings, secure strings, boolean, arrays, or any other data that might need to be used multiple times within the Logic App or may change based on the environment (development, test, production, etc.). They can also be defined at deployment time using CI/CD pipelines.

By using parameters, you can easily update these values in one place without needing to edit the logic in multiple actions or triggers throughout the app.

In practice, you define parameters in the Logic App’s definition and can then use them in expressions or directly in actions throughout the app. When the Logic App is deployed or executed, these parameters are evaluated and used accordingly. This approach helps you manage and deploy Logic Apps across different environments, making the workflows more dynamic and easier to configure.
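
As a quick illustration, here is a minimal sketch (assuming the Az PowerShell module, an authenticated session, and a hypothetical workflow name) that lists the parameters defined in a Consumption workflow straight from its definition, together with the length of each parameter name:

# List the parameters of a Consumption workflow and the length of each parameter name.
$la = Get-AzResource -ResourceType Microsoft.Logic/workflows -Name 'my-logic-app' -ExpandProperties
$la.Properties.Definition.parameters.psobject.Properties |
    ForEach-Object { [pscustomobject]@{ Parameter = $_.Name; NameLength = $_.Name.Length } }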

The funny fact about parameters is that they are probably the only Logic App “component” that doesn’t have a size limit regarding the name. For example:

  • Logic App Consumption name has a maximum limit of 80 characters.
  • Logic App Standard Workflow name has a maximum limit of 43 characters.
  • A trigger or action name has a maximum limit of 80 characters.
  • and so on.

But the Logic App parameter name is unlimited! To prove that and for fun, I have created a parameter with this name:

p_material_availability_changed_range_hours_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa

To be honest, I think this is crazy! They should fix this and set up a limit, because giving that amount of power to developers (and I'm a developer, too) is insane. We can do some nasty stuff!

Too lazy to read? We've got you covered! Check out our video version of this content!

Hope you find this helpful! If you enjoyed the content or found it useful and wish to support our efforts to create more, you can contribute towards purchasing a Star Wars Lego for my son!


How to document an integration architecture

Mar 08, 2024

One could argue that creating good documentation for an integration solution is harder than for other types of software projects. The reason is that integration solutions almost always span many systems and services, and we often talk about hundreds of integrations, touching almost every part of your architecture.

Keeping track of such a complex system is literally impossible without the right documentation.

An example of bad documentation …

Basic principles of good documentation for an integration architecture

1. Visualizations and diagrams are more important than textual documentation.

Textual documentation has its place and is important. But when it comes to trying to describe complex scenarios, visualizations in the form of architectural diagrams are easier to quickly understand and therefore arguably the most important part of your documentation.

2. Focus on the high-level diagrams.

As developers, we're good at documenting all the little technical details in our solutions. But we often document things we end up looking up in the live code and configuration anyway. Focus instead on high-level documentation: the overall picture and how the different parts of your solutions relate to each other.

3. Make it readable for everyone.

UML and complex notations are great – if you understand them fully. The C4 model formalized the idea of a much simpler notation, based on boxes and arrows – easier to understand and work with.

Use a dead simple notation that everybody can write and read.

4. Make it useful for everyone.

Besides a simple notation, the C4 model describes a few different diagrams with increasing levels of detail. Using different levels makes it possible to drill down from an overview into more detailed diagrams. Similar to zooming in and out of a digital map.

Utilizing the levels and the simple notation of the C4 model, diagrams can carry information useful for non-technical business users and describe large parts of the architecture without cluttering them with too many details.

The C4 levels in an integration context

Let’s look at the three levels that make up good integration documentation.

C4 model diagram levels with increasing detail.

Level 1 (Context) – The Overall picture

For quickly understanding an integration architecture, the most important part of the documentation is the high-level view. It shows all the systems and the data flows between them. Not the nitty-gritty details, just which systems are involved and what data is transferred between them.

In smaller architectures, it's possible to describe all systems and data flows in a single level 1 diagram. In large ones, we need to choose a context to avoid diagrams that are too large and hard to read. In this example, the context is the e-Commerce part of the architecture.

Level 1, Overall picture example – https://my.revision.app/diagram/6idZKQTeD8C9.

It's important to keep these diagrams as simple as possible. It's often tempting to include more details than necessary. Try to skip everything besides concrete systems and what type of data is transferred. Otherwise, these diagrams get too complicated, hard to work with, and hard to understand.

This part of the documentation is important. This is information that often is hard to understand by looking at the code and configuration.

This is also the part of the documentation that business users will understand. As we all know, getting the business users involved is critical for making the right decisions and in the end creating solutions that actually help the business in the right way.

Even though most people would agree that the overall picture is important to have documented, I’ve found that this is the part most developers tend to skip.

Level 2 (Container) – The Integration Processes

It's hard to define an integration. In the overall picture, product data is moved between several systems. Is the integration the part that moves the products between InRiver PIM and Dynamics 365 PO? Or is the integration the overall process that makes sure the products end up in all the destination systems?

Leaving that discussion aside, we can agree that the purpose of an integration is to support some sort of business process. In this case, the handling of product master data.

Documenting how we support a business process is the second level of diagrams.

The difference between this and the level 1 diagram is that we zoom in on a specific area and include more details – things that we didn't want to include in the overall picture to keep it easy to understand. In the example, we're showing that product master data is actually distributed using a product service. We're also including other details we didn't show in the level 1 diagram.

Level 2, Integration Process example – https://my.revision.app/diagram/bwoG8AXEhhxD. Shows how product master data flows between systems.

Again, this is information that is hard to get from reading code and configuration, and that is important for getting the business users to understand what we're actually building.

It's important to keep purely technical details out of this diagram and to focus on business-rule-like information: information that is useful for a business user.

Example of information we might include on level 2:

  • what’s triggering the transfer
  • duplication checks
  • conversions
  • transformations of data, and so on

Level 3 – The Integration Details

This is probably the diagram that needs the least explanation. This is the part I often find well-documented and maintained. Here we document all the technical components and implementation details.

This is, however, also the part that doesn't always require that much documentation. Many of the details at this level can easily be found in code and configuration, and it's easy to spend time documenting things that, in the end, developers will look up in the code and configuration instead.

Level 3, Integration Details example – https://my.revision.app/diagram/RmXsxDtuAEHy. Shows a detailed view of how product master data is moved from one system to another.

BizTalk WCF-SAP Adapter error: Loading property information list by namespace failed or property not found in the list. Verify that the schema is deployed properly.

This week, a client reported an error we were getting in our BizTalk Server production environment. We had deployed a new orchestration and a new SAP receive port to handle a new partner, something we had done several times with success, but this time, we were getting the following warning each time we sent a message from SAP to BizTalk Server:

The adapter “WCF-Custom” raised an error message. Details “System.Exception: Loading property information list by namespace failed or property not found in the list. Verify that the schema is deployed properly.

   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkAsyncResult.End()
   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkServiceInstance.EndOperation(IAsyncResult result)
   at Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkServiceInstance.Microsoft.BizTalk.Adapter.Wcf.Runtime.ITwoWayAsyncVoid.EndTwoWayMethod(IAsyncResult result)
   at AsyncInvokeEndEndTwoWayMethod(Object , Object[] , IAsyncResult )
   at System.ServiceModel.Dispatcher.AsyncMethodInvoker.InvokeEnd(Object instance, Object[]& outputs, IAsyncResult result)
   at System.ServiceModel.Dispatcher.DispatchOperationRuntime.InvokeEnd(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage7(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.MessageRpc.Process(Boolean isOperationContextSet)”.

First, it was strange that this was logged as a warning, since BizTalk Server did not receive or process the message.

Also, the message can be misleading because the “error” message says Verify that the schema is deployed properly. These types of errors typically indicate that the schema or specific schema versions are not deployed in the environment, although when that happens, the error message clearly specifies the schema name and version, which is not the case in this error message.

Cause

The official documentation states that this exception is encountered while receiving an IDOC with the EnableBizTalkCompatibilityMode binding property set to true, and that if the EnableBizTalkCompatibilityMode binding property is set to true, you must add the BizTalk property schema DLL for the SAP adapter as a resource in your BizTalk application, that is, the application in which your project is deployed.

However, our receive location didn’t have the EnableBizTalkCompatibilityMode set to true. Instead, it was set to false.

Lacking better ideas, we decided to apply the same solution and add the BizTalk property schema DLL for the SAP adapter as a resource in our BizTalk application, and it solved all of our problems.

Solution

So, to solve this issue, we need to add the BizTalk property schema DLL for the SAP adapter, called Microsoft.Adapters.SAP.BiztalkPropertySchema.dll, as a resource in our BizTalk application. This DLL can be found in the Microsoft BizTalk Adapter Pack folder, which is normally under:

  • <drive>:\Program Files\Microsoft BizTalk Adapter Pack\bin

or in BizTalk Server 2020 under:

  • <drive>:\Program Files (x86)\Microsoft BizTalk Server

You must perform the following tasks to add this assembly as a resource in your BizTalk application (or script it, as shown in the sketch after these steps):

  • Start the BizTalk Server Administration console.
  • In the console tree, expand BizTalk Group, expand Applications, and then expand the application to which you want to add the BizTalk assembly.
  • Right-click Resources, point to Add, and then click BizTalk Assemblies.
  • Click Add, navigate to the folder containing the BizTalk assembly file, select the BizTalk assembly file, and then click Open.
  • In Options, specify the options for installing the BizTalk assembly to the GAC, and then click OK.
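
If you prefer to script this step, the same can be done with BTSTask. This is a hedged sketch: the application name and path are placeholders, and you should double-check the options against your BizTalk Server version:

# Add the SAP property schema DLL as a resource and GAC it on add/install.
# Run on the BizTalk Server machine; application name and path are placeholders.
$dll = 'C:\Program Files\Microsoft BizTalk Adapter Pack\bin\Microsoft.Adapters.SAP.BiztalkPropertySchema.dll'
BTSTask AddResource /ApplicationName:"MyBizTalkApplication" /Type:System.BizTalk:BizTalkAssembly /Overwrite /Source:"$dll" /Options:"GacOnAdd,GacOnInstall"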

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 


Remove XML Empty Nodes Pipeline Component

Another day, another BizTalk Server Pipeline Component! Today, I decided to release a brand new component called the Remove XML Empty Nodes Pipeline Component.

For those who aren't familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk's out-of-the-box pipeline capabilities.

Remove XML Empty Nodes Pipeline Component

As the name suggests, the Remove XML Empty Nodes Pipeline Component is a pipeline component that can be used to remove empty nodes present in an XML message. You can use this component in any stage of a receive or send pipeline.
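
To make the behavior concrete, here is a rough PowerShell illustration of the kind of transformation the component applies. This is not the component's actual code, just the idea of recursively dropping empty elements:

# Illustration only: remove elements that have no child elements, no text, and no attributes.
[xml]$doc = Get-Content -Raw .\sample.xml
do {
    # Removing an empty leaf can leave its parent empty, so repeat until nothing is left to remove.
    $empty = $doc.SelectNodes('//*[not(*) and not(normalize-space(.)) and not(@*)]')
    foreach ($node in @($empty)) { [void]$node.ParentNode.RemoveChild($node) }
} while ($empty.Count -gt 0)
$doc.Save("$PWD\sample.cleaned.xml")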

This component has a single property that you need to set up:

  • DisableRemoveBOM (boolean): This allows you to enable or disable the process of removing empty nodes from an XML message.

How to install it

As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is by default:

  • C:\Program Files (x86)\Microsoft BizTalk Server\Pipeline Components

For this particular component, we need to have this DLL:

  • BizTalk.PipelineComponents.RemoveXmlEmptyNodes.dll

How to use it

As with all the previous components, to use this pipeline component I recommend that you create one or several generic pipelines that can be reused by all your applications and add this pipeline component to any required stage of a send or receive pipeline.

Download

THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download the Remove XML Empty Nodes Pipeline Component from GitHub here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 


Send File To a Date-Based Structure Encoder Pipeline Component

Time to get back to BizTalk Server and publish new resources on this amazing product and also return to one of my old pet projects: the BizTalk Pipeline Components Extensions Utility Pack.

Today, I decided to create a brand new component called the Send File To a Date-Based Structure Encoder Pipeline Component.

For those who aren't familiar with it, the BizTalk Pipeline Components Extensions Utility Pack project is a set of libraries with several custom pipeline components that can be used in receive and send pipelines. Those pipeline components provide extensions of BizTalk's out-of-the-box pipeline capabilities.

Send File To a Date-Based Structure Encoder Pipeline Component

The Send File To a Date-Based Structure Encoder Pipeline Component is a pipeline component that can be used in a send pipeline (as the name suggests, inside the Encode stage), and it allows you to send an outbound file to a dynamic folder path organized as a date tree:

  • \yyyy\MM\dd

In other words, you define the base path on the adapter URI, and the component then adds a dynamic date-based folder structure inside that path.
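
For example, a minimal sketch (with a hypothetical base path, and assuming the date layout described above) of how the final folder path ends up being composed:

# Hypothetical base path taken from the adapter URI.
$basePath = '\\myserver\out'
$today    = Get-Date
# Compose the date tree under the base path, e.g. \\myserver\out\2024\03\08
$fullPath = Join-Path $basePath ('{0:yyyy}\{1:MM}\{2:dd}' -f $today, $today, $today)
$fullPath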

This component doesn’t require any property configuration.

How to install it

As always, you just need to add these DLLs to the Pipeline Components folder, which in BizTalk Server 2020 is by default:

  • C:\Program Files (x86)\Microsoft BizTalk Server\Pipeline Components

For this particular component, we need to have this DLL:

  • BizTalk.PipelineComponents.SendFileToDateBasedStructure.dll

How to use it

As with all the previous components, to use this pipeline component I recommend that you create one or several generic pipelines that can be reused by all your applications and add the pipeline component to the Encode stage. This component can only be used in send pipelines.

Download

THIS COMPONENT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download the Send File To a Date-Based Structure Encoder Pipeline Component from GitHub here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 


Seamlessly Adding Tags to Azure Function Apps via Visual Studio: A Guide for Enhanced Resource Management

Have you ever wondered how to add tags to your Function App through Visual Studio?

Let’s break it down, but first, here’s a quick overview of how you would do it in the Azure Portal:

  • On your Function App overview page, under the Essentials information on the left, you’ll find “Tags” with an “Edit” button next to it.
  • Clicking on it allows you to add new tags to your function app. These tags essentially function as meta tags, consisting of key and value pairs, such as Name and Value.

But, you might be wondering, why do I need tags?

Overall, tags offer a flexible and customizable way to manage and govern resources in Azure, enabling better organization, cost management, monitoring, and governance across your environment.

  • Organization and Categorization: Tags allow you to categorize and organize resources based on different criteria, such as department, project, environment (e.g., production, development), or cost center. This makes it easier to locate and manage resources, especially in larger deployments with numerous resources.
  • Cost Management: Tags can be used for cost allocation and tracking. By assigning tags to resources, you can easily identify the costs associated with specific projects, teams, or departments. This helps in budgeting, forecasting, and optimizing resource usage to control costs effectively.
  • Monitoring and Reporting: Tags provide metadata that can be used for monitoring and reporting purposes. You can use tags to filter and aggregate data in monitoring tools, allowing you to gain insights into resource usage, performance, and operational trends across different categories.
  • Access Control and Governance: Tags can also be leveraged for access control and governance purposes. By tagging resources based on their sensitivity, compliance requirements, or ownership, you can enforce policies, permissions, and compliance standards more effectively.

Now that we have described the importance of tags and how you can add them from the Azure Portal, let's dive into how to do it with Visual Studio:

  • After you’ve published your Azure Function, or if you’re working with an existing published one, head over to the Solution Explorer and right-click on your solution.
  • From there, go to Add -> New Project. Now, search for Azure Resource Group and give it a double click.
  • You’ll be prompted to name your project. You can leave the location as is since it’s the project you’re currently working on. Click on Create once you’re done.
  • Now, in the Solution Explorer, you’ll spot a new project. Inside, you’ll find two .json files:
    • azuredeploy.json
    • azuredeploy.parameters.json
  • The file we’re interested in is azuredeploy.json. Double-click on it and replace its content with the provided JSON. Don’t forget to customize it with the tags you need and also your Function App Name. For now, let’s use these tags for our proof of concept:
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "functionAppName": {
      "type": "string",
      "metadata": {
        "description": "Name of the Azure Function App"
      },
      "defaultValue": "YOUR-FUNCTION-APP-NAME"
    }
  },
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2020-12-01",
      "name": "[parameters('functionAppName')]",
      "location": "West Europe",
      "properties": {
        "siteConfig": {
          // Define site configuration properties here
        }
      },
      "tags": {
        "Environment": "POC",
        "Project": "PdfMerger",
        "Company": "DevScope",
        "Year": "2024"
      }
    }
  ],
  "outputs": {}
}
  • Back in the Solution Explorer, right-click on the project you’ve just created and select Deploy -> New.
  • You’ll then need to choose your subscription and resource group. Finally, hit Deploy.

Once the deployment finishes smoothly without any errors, it’s time to inspect your Function App. You’ll notice that all your tags are now displayed on the Function App overview page.

Adding tags to your function app through Visual Studio provides a streamlined way to organize, manage, and govern your resources in Azure by categorizing resources based on criteria such as environment, project, company, etc.

Tags facilitate easier navigation and management, particularly in complex deployments. Moreover, tags play a crucial role in cost allocation, monitoring, reporting, and access control, offering valuable insights and enhancing governance across your environment.

While both methods, Visual Studio and the Azure Portal, offer ways to manage tags for resources like function apps, for simple solutions that don’t require having multiple environments, there are certain advantages to using Visual Studio for this task:

  • Automation and Consistency: Visual Studio allows you to automate the deployment of resources along with their tags using Infrastructure as Code (IaC) principles. This ensures consistency across deployments and reduces the chance of human error compared to manually adding tags in the Azure Portal.
  • Version Control: When managing your Azure resources through Visual Studio, you can maintain version control over your infrastructure code. This means you can track changes to your tags along with other resource configurations, making it easier to revert to previous versions if needed.
  • Integration with Development Workflow: For teams that primarily work within Visual Studio for development tasks, integrating tag management into the development workflow streamlines processes. Developers can manage both code and resource configurations in a unified environment, enhancing collaboration and efficiency.
  • Scalability: Visual Studio is well-suited for managing tags across multiple resources or environments. With the ability to define and deploy resource templates containing tags programmatically, scaling tag management becomes more manageable, especially in large-scale deployments.
  • Consolidated Management: Using Visual Studio for tag management allows you to centralize the configuration of tags alongside other resource settings. This consolidated approach simplifies overall resource management, providing a single interface for configuring and deploying resources and their associated tags.

It is important to note that the choice between Visual Studio and the Azure Portal ultimately depends on your specific requirements, preferences, and existing workflows. While Visual Studio offers certain advantages for tag management, the Azure Portal provides a user-friendly interface that may be more accessible for simple or ad-hoc tag assignments. This way, organizations should evaluate their needs and capabilities to find the most suitable approach for managing tags in their Azure environment.

Of course, in the end, the best solution is to use CI/CD pipelines to accomplish this task.
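
As a starting point for that, here is a hedged sketch (assuming the Az PowerShell module; the resource group and parameter values are placeholders) of deploying the same azuredeploy.json from a script, which is also the command you would put in a pipeline step:

# Deploy the template, and therefore the tags, from a script; names below are placeholders.
New-AzResourceGroupDeployment -ResourceGroupName 'my-resource-group' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterObject @{ functionAppName = 'YOUR-FUNCTION-APP-NAME' }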

Hope you find this helpful! If you enjoyed the content or found it useful and wish to support our efforts to create more, you can contribute towards purchasing a Star Wars Lego for Sandro’s son!

Friday Fact: XML to JSON Conversion in API Management and Logic Apps have different behaviors

I don’t know the reason why two products from the same family – Azure Integration Services – have completely different behaviors while converting XML to JSON, but that is the current reality. It is a fact! API Management and Logic Apps have different behaviors while applying this conversion, and that is one of the main reasons that I decided to create an Azure Function to convert XML into JSON, to keep the consistency between these two products.

While using API Management, we can use the xml-to-json policy to convert a request or response body from XML to JSON. However, when dealing with XML namespaces and prefixes, which is quite normal when working with XML messages, the policy has, in my opinion, a strange conversion behavior:

  • It converts prefixes that in XML are represented as prefix:MyField into prefix$MyField. In other words, it replaces the colon character (:) with the dollar character ($).

Let's take this XML sample so you can see the outcome of the xml-to-json policy:

<section xmlns="http://www.test.com/events"
         xmlns:bk="urn:loc.gov:books"
         xmlns:pi="urn:personalInformation"
         xmlns:isbn="urn:ISBN:0-999-99999-9">
    <title>Book-Signing Event</title>
    <signing>
        <bk:author pi:title="Mr" pi:name="My Name" />
        <book bk:title="How cool is XML" isbn:number="9999999999" />
        <comment xmlns="">Convert it to JSON</comment>
    </signing>
</section>

The result will be:

{
    "section": {
        "@xmlns": "http://www.test.com/events",
        "@xmlns$bk": "urn:loc.gov:books",
        "@xmlns$pi": "urn:personalInformation",
        "@xmlns$isbn": "urn:ISBN:0-999-99999-9",
        "title": "Book-Signing Event",
        "signing": {
            "bk$author": {
                "@pi$title": "Mr",
                "@pi$name": "My Name"
            },
            "book": {
                "@bk$title": "How cool is XML",
                "@isbn$number": "9999999999"
            },
            "comment": {
                "@xmlns": "",
                "#text": "Convert it to JSON"
            }
        }
    }
}

I think this behavior is strange and incorrect.

Now, let's take the same XML payload and convert it inside Logic Apps using the json() expression, which returns the JSON type value or object for a string or XML. In this case, we use, for example, the following expression:

  • json(xml(triggerBody()))

The result will be:

{
    "section": {
        "@xmlns": "http://www.test.com/events",
        "@xmlns:bk": "urn:loc.gov:books",
        "@xmlns:pi": "urn:personalInformation",
        "@xmlns:isbn": "urn:ISBN:0-999-99999-9",
        "title": "Book-Signing Event",
        "signing": {
            "bk:author": {
                "@pi:title": "Mr",
                "@pi:name": "My Name"
            },
            "book": {
                "@bk:title": "How cool is XML",
                "@isbn:number": "9999999999"
            },
            "comment": {
                "@xmlns": "",
                "#text": "Convert it to JSON"
            }
        }
    }
}

In this case, the json() expression does not replace the colon character (:) with the dollar character ($) in the prefixes. It keeps them, which I think is the correct behavior.

Too lazy to read? We've got you covered! Check out our video version of this content!

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 


Azure Functions to validate XML against DTD

After releasing a set of Azure Functions that help us minimize or completely remove the need for an Integration Account, today I'm releasing a new function – validate XML against DTD – that brings additional capabilities to Logic Apps Consumption and Standard, since this functionality is not currently supported in either tier, not even with the support of an Integration Account.

DTD? What is a DTD?

Yes, this is probably old school and not something you see often nowadays. But DTD, which stands for Document Type Definition, allows you to define the structure and the legal elements and attributes of an XML document.

This is what a simple DTD file looks like (the element names here are just illustrative):

<!ELEMENT stock (name, quantity, phone)>
<!ELEMENT name (#PCDATA)>
<!ELEMENT quantity (#PCDATA)>
<!ELEMENT phone (#PCDATA)>

And this is a sample of an XML message with a reference to a DTD:

<?xml version="1.0"?>
<!DOCTYPE stock SYSTEM "stock.dtd">
<stock>
   <name>My stock</name>
   <quantity>nine</quantity>
   <phone>(099) 999-9999</phone>
</stock>
The DOCTYPE declaration above contains a reference to a DTD file.

Although DTDs are not seen very often these days, it is still very common to encounter them in RosettaNet PIPs.

Validate XML against DTD

A Document Type Definition (DTD) is a document that describes the structure of an XML document: what elements and attributes it contains, and what values it may have. DTDs form part of the W3C's XML standard but are generally considered a separate schema technology and are not typically used in conjunction with other schema formats like XSD.

A DTD document can be embedded within an XML file or can exist on its own. When it is not embedded, normally, there are two ways to reference the DTD:

  • Using the PUBLIC keyword: This format is generally used to declare publicly available DTDs, standard character sets, and commonly used notations.

  • Or using the SYSTEM keyword: These entities are not assumed to be known to a receiving system, so they require a full declaration of system identification (path, etc.) when they are exchanged.
    • The SYSTEM identifier specifies the location of the DTD file. Since it does not start with a prefix like http:// or file://, the path is relative to the location of the XML document.

This Azure Function allows you to perform XML validations against a DTD file. The function only accepts DTDs defined using the SYSTEM keyword.

To trigger this function, you need to provide:

  • In the body, the XML payload that you want to be validated.
  • The following mandatory headers:
    • Content-Type as text/xml (or application/xml).
    • DTDFileName with the name of the DTD file present in the storage account.

The response will be one of the following (a sample call is shown after the list):

  • 200 OK – Validation successful, if it's a valid message.
  • Or 400 Bad Request, with a list of errors if something is invalid.
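
Putting the request and response contract together, this is a minimal sketch of calling the function from PowerShell. The function URL and file names are hypothetical, so adjust them to your deployment:

# Hypothetical URL and file names; adjust to your deployment.
$xml = Get-Content -Raw .\order.xml
try {
    Invoke-RestMethod -Method Post -Uri 'https://my-func-app.azurewebsites.net/api/ValidateXmlAgainstDtd' `
        -ContentType 'text/xml' -Headers @{ DTDFileName = 'order.dtd' } -Body $xml
    Write-Host 'Validation successful.'
} catch {
    # A 400 response throws in PowerShell; the response body contains the list of validation errors.
    Write-Host "Validation failed: $($_.ErrorDetails.Message)"
}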

Where can I download it?

You can download the complete Azure Functions source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 
