Logic App Consumption deployment: The secret of KeyVault parameter cannot be retrieved. Http status code: ‘Forbidden’. Error message: ‘Access denied to first party service

Recently, a client asked me to help rectify some existing Logic Apps in their environment because the resource responsible for them had left the company. The goal was not only to fix the project but also to put it in better shape and apply good best practices.

One of the tasks we decided to do was reference secrets in Key Vault for the deployment process, whether through CI/CD or directly through Visual Studio. We had administrator access to the Key Vault in the dev environment, so we were able to easily create those secrets and reference them in the Logic App parameter file, for example, an Azure Service Bus connection string. For those who are not aware, we can achieve that by using the code below:

"arm_servicebus_connectionString": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions//resourceGroups//providers/Microsoft.KeyVault/vaults/"
        },
        "secretName": "KVS-SB-ConnectionString"
      }
    }
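
For context, here is a minimal Az PowerShell sketch of how such a secret can be created (the vault name and secret value are illustrative):

# Store the Service Bus connection string as a Key Vault secret
$value = ConvertTo-SecureString "Endpoint=sb://..." -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "my-key-vault" -Name "KVS-SB-ConnectionString" -SecretValue $value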

The problem was that when we tried to deploy the solution through Visual Studio, we got the following error:

Multiple error occurred: Forbidden,Forbidden,Forbidden. Please see details.

Without any more detail. After some analysis, we realized that the number of Forbidden words in the message matched the number of Key Vault secrets we were trying to reference. When we commented them all out and left only one, we got an error message with more detail:

The secret of KeyVault parameter ‘name’ cannot be retrieved. Http status code: ‘Forbidden’.
Error message: ‘Access denied to first party service.
Caller: name=ARM;tid=;appid=…
Vault:;location=’. Please see https://aka.ms/arm-keyvault for usage details.

Initially, I thought it was a Key Vault access permission issue, even though I was a Key Vault administrator. However, sometimes we also need some RBAC permissions. I ended up granting Administrator, Reader, and Secrets User access at the Key Vault, resource group, and subscription levels.

Still, I was getting the same error!

Cause

A Logic App Consumption project in Visual Studio is, in fact, an ARM template project, and its deployment is an ARM template deployment. So, when we reference a Key Vault secret in the LogicApp.parameters.json file, we are referencing a secure parameter that will be resolved during the ARM template deployment.

The problem is that for Azure Resource Manager to access the Key Vault during deployment, you need to change the vault’s access configuration to allow Azure Resource Manager for template deployment.

You can see this in the official documentation here:

When you need to pass a secure value (like a password) as a parameter during deployment, you can retrieve the value from an Azure Key Vault. To access the Key Vault when deploying Managed Applications, you must grant access to the Appliance Resource Provider service principal. The Managed Applications service uses this identity to run operations. To successfully retrieve a value from a Key Vault during deployment, the service principal must be able to access the Key Vault.

Solution

Solving this issue is quite simple:

  1. Sign in to the Azure Portal.
  2. Open your key vault: enter key vaults in the search box or select Key vaults.
  3. On the Key Vault, select Access configuration under the Settings section.
  4. Select Azure Resource Manager for template deployment under Resource access. Then, select Apply.
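
If you prefer to script this setting, the same change can be applied with Az PowerShell. A minimal sketch, assuming an illustrative vault name:

# Allow Azure Resource Manager to retrieve secrets from this vault during template deployments
Set-AzKeyVaultAccessPolicy -VaultName "my-key-vault" -EnabledForTemplateDeployment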

Now, you will be able to successfully reference the Key Vault secure parameter and deploy the Logic App Consumption solution from Visual Studio.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

PowerShell script to identify all SQL V1 actions and triggers inside Logic Apps Consumption

A few days ago, Luis Rigueira created a Kusto query to identify all Logic Apps Consumption that use SQL V1 actions and triggers, which will soon be deprecated (end of March 2024). The query also tries to identify the names of those actions. It works decently if a Logic App only has actions or triggers at the first level (not inside If, Switch, Scope, and so on), but if a Logic App has nested actions, which is quite common, the query identifies those actions only on a best-effort basis: it will identify all affected Logic Apps, but it will not provide the names of the actions. The reason is that it is quite difficult to loop through the whole Logic App definition (JSON) with a Kusto query.

Don’t get me wrong: that query is awesome for identifying all the Logic Apps we need to address to fix those actions. But if we need to identify all the affected actions to better estimate the work involved, that Kusto query is not the best option. To solve this problem, we decided to create a PowerShell script that not only identifies all Logic Apps Consumption using SQL V1 actions and triggers but also lists the names of those actions and triggers, providing a good report that you can use to plan and estimate your work.

# Function to extract actions recursively
function Get-ActionsAndTriggers {
    param (
        $node
    )
    $actionsAndTriggers = @()
    foreach ($key in $node.psobject.Properties.Name) {
        if ($node.$key.type -eq "ApiConnection") {
            if ($node.$key.inputs.path -like "*/datasets/default*" -and $node.$key.inputs.host.connection.name -like "*sql*") {
                $actionsAndTriggers += $key
            }
        } elseif ($node.$key -is [System.Management.Automation.PSCustomObject]) {
            $actionsAndTriggers += Get-ActionsAndTriggers -node $node.$key
        }
    }
    return $actionsAndTriggers
}
 
# Retrieve all Logic Apps within the subscription
$logicApps = Get-AzResource -ResourceType Microsoft.Logic/workflows
 
# Iterate through each Logic App and extract actions and triggers
foreach ($logicApp in $logicApps) {
    # Retrieve Logic App definition
    $logicAppDefinition = Get-AzResource -ResourceId $logicApp.ResourceId -ExpandProperties
 
    # Extract actions and triggers from the Logic App definition
    $allActionsAndTriggers = Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.triggers
    $allActionsAndTriggers += Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.actions
 
    # Display the Logic App name if filtered actions and triggers were found
    if ($allActionsAndTriggers.Count -gt 0) {
        Write-Host "Logic App: $($logicApp.Name) - RG: $($logicApp.ResourceGroupName)" -ForegroundColor Red
        # Display the list of filtered actions and triggers
        Write-Host "Filtered Actions and Triggers:"
        $allActionsAndTriggers
        Write-Host ""
    }
}

The great thing about this script is that it also identifies all nested actions (actions inside other actions like If, Scope, Switch, and so on).
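
To run it, you only need an authenticated Az PowerShell session pointing at the right subscription. A minimal usage sketch (the script file name and subscription ID are illustrative):

# Sign in and select the subscription you want to scan
Connect-AzAccount
Set-AzContext -Subscription "my-subscription-id"

# Run the script; affected Logic Apps and their actions/triggers are written to the console
.\Find-SqlV1ActionsAndTriggers.ps1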

If you are wondering whether we can do the same to identify all SQL V2 actions and triggers: don’t worry, we have you covered. Check the download section. In fact, with small changes, you can use this script for all types of connectors.

Download

THESE COMPONENTS ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.

You can download the PowerShell script to identify all SQL V1 actions and triggers inside Logic Apps Consumption from GitHub here:

If you want to identify SQL V2 actions and triggers, then download this script from GitHub:

Huge thanks to Luis Rigueira for working with me on these scripts.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for my son! 


Friday (funny) Fact: There is no size limit for the Logic App parameter name

In Azure Logic Apps, you can abstract values that might change in workflows across development, test, and production environments by defining parameters. A Logic App parameter stores values that can be reused throughout a Logic App workflow. These parameters allow for a more flexible and maintainable configuration of logic apps, making it easier to update values without changing the actual workflow’s logic.

Parameters can store various types of data, such as strings, secure strings, boolean, arrays, or any other data that might need to be used multiple times within the Logic App or may change based on the environment (development, test, production, etc.). They can also be defined at deployment time using CI/CD pipelines.

By using parameters, you can easily update these values in one place without needing to edit the logic in multiple actions or triggers throughout the app.

In practice, you define parameters in the Logic App’s definition and can then use them in expressions or directly in actions throughout the app. When the Logic App is deployed or executed, these parameters are evaluated and used accordingly. This approach helps you manage and deploy Logic Apps across different environments, making the workflows more dynamic and easier to configure.
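
As a minimal sketch, this is what defining and referencing a parameter looks like inside a workflow definition (the parameter name, value, and action are illustrative):

"parameters": {
    "p_environment": {
        "type": "String",
        "defaultValue": "DEV"
    }
},
"actions": {
    "Compose_Message": {
        "type": "Compose",
        "inputs": "Running in @{parameters('p_environment')}",
        "runAfter": {}
    }
}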

The funny fact about parameters is that they are probably the only Logic App “component” that doesn’t have a size limit regarding the name. For example:

  • Logic App Consumption name has a maximum limit of 80 characters.
  • Logic App Standard Workflow name has a maximum limit of 43 characters.
  • A trigger or action name has a maximum limit of 80 characters.
  • and so on.

But the Logic App parameter name is unlimited! To prove that and for fun, I have created a parameter with this name:

p_material_availability_changed_range_hours_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa_assaasasasasasasasasassaassaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassaasassaasasaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaassasasasasasaassasa

To be honest, I think this is crazy! They should fix this and set a limit, because giving that amount of power to developers (and I’m a developer, too) is insane; we can do some nasty stuff!

Too lazy to read? We’ve got you covered! Check out our video version of this content!

Hope you find this helpful! If you enjoyed the content or found it useful and wish to support our efforts to create more, you can contribute towards purchasing a Star Wars Lego for my son!


Seamlessly Adding Tags to Azure Function Apps via Visual Studio: A Guide for Enhanced Resource Management

Have you ever wondered how to add tags to your Function App through Visual Studio?

Let’s break it down, but first, here’s a quick overview of how you would do it in the Azure Portal:

  • On your Function App overview page, under the Essentials information on the left, you’ll find “Tags” with an “Edit” button next to it.
  • Clicking on it allows you to add new tags to your function app. These tags essentially function as meta tags, consisting of key and value pairs, such as Name and Value.

But why do I need tags, you might be wondering?

Overall, tags offer a flexible and customizable way to manage and govern resources in Azure, enabling better organization, cost management, monitoring, and governance across your environment.

  • Organization and Categorization: Tags allow you to categorize and organize resources based on different criteria, such as department, project, environment (e.g., production, development), or cost center. This makes it easier to locate and manage resources, especially in larger deployments with numerous resources.
  • Cost Management: Tags can be used for cost allocation and tracking. By assigning tags to resources, you can easily identify the costs associated with specific projects, teams, or departments. This helps in budgeting, forecasting, and optimizing resource usage to control costs effectively.
  • Monitoring and Reporting: Tags provide metadata that can be used for monitoring and reporting purposes. You can use tags to filter and aggregate data in monitoring tools, allowing you to gain insights into resource usage, performance, and operational trends across different categories.
  • Access Control and Governance: Tags can also be leveraged for access control and governance purposes. By tagging resources based on their sensitivity, compliance requirements, or ownership, you can enforce policies, permissions, and compliance standards more effectively.

Now that we have described the importance of tags and how to add them from the Azure Portal, let’s dive into how to do it with Visual Studio:

  • After you’ve published your Azure Function, or if you’re working with an existing published one, head over to the Solution Explorer and right-click on your solution.
  • From there, go to Add -> New Project. Now, search for Azure Resource Group and give it a double click.
  • You’ll be prompted to name your project. You can leave the location as is since it’s the project you’re currently working on. Click on Create once you’re done.
  • Now, in the Solution Explorer, you’ll spot a new project. Inside, you’ll find two .json files:
    • azuredeploy.json
    • azuredeploy.parameters.json
  • The file we’re interested in is azuredeploy.json. Double-click on it and replace its content with the provided JSON. Don’t forget to customize it with the tags you need and also your Function App Name. For now, let’s use these tags for our proof of concept:
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "functionAppName": {
      "type": "string",
      "metadata": {
        "description": "Name of the Azure Function App"
      },
      "defaultValue": "YOUR-FUNCTION-APP-NAME"
    }
  },
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2020-12-01",
      "name": "[parameters('functionAppName')]",
      "location": "West Europe",
      "properties": {
        "siteConfig": {
          // Define site configuration properties here
        }
      },
      "tags": {
        "Environment": "POC",
        "Project": "PdfMerger",
        "Company": "DevScope",
        "Year": "2024"
      }
    }
  ],
  "outputs": {}
}
  • Back in the Solution Explorer, right-click on the project you’ve just created and select Deploy -> New.
  • You’ll then need to choose your subscription and resource group. Finally, hit Deploy.
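
If you prefer to script that last step instead of using the Deploy dialog, the Az PowerShell equivalent is roughly the following (the resource group name is illustrative):

# Deploy the ARM template (and its tags) to the target resource group
New-AzResourceGroupDeployment -ResourceGroupName "my-rg" `
    -TemplateFile .\azuredeploy.json `
    -TemplateParameterFile .\azuredeploy.parameters.json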

Once the deployment finishes smoothly without any errors, it’s time to inspect your Function App. You’ll notice that all your tags are now displayed on the Function App overview page.

Adding tags to your function app through Visual Studio provides a streamlined way to organize, manage, and govern your resources in Azure by categorizing resources based on criteria such as environment, project, company, etc.

Tags facilitate easier navigation and management, particularly in complex deployments. Moreover, tags play a crucial role in cost allocation, monitoring, reporting, and access control, offering valuable insights and enhancing governance across your environment.

While both methods, Visual Studio and the Azure Portal, offer ways to manage tags for resources like function apps, for simple solutions that don’t require having multiple environments, there are certain advantages to using Visual Studio for this task:

  • Automation and Consistency: Visual Studio allows you to automate the deployment of resources along with their tags using Infrastructure as Code (IaC) principles. This ensures consistency across deployments and reduces the chance of human error compared to manually adding tags in the Azure Portal.
  • Version Control: When managing your Azure resources through Visual Studio, you can maintain version control over your infrastructure code. This means you can track changes to your tags along with other resource configurations, making it easier to revert to previous versions if needed.
  • Integration with Development Workflow: For teams that primarily work within Visual Studio for development tasks, integrating tag management into the development workflow streamlines processes. Developers can manage both code and resource configurations in a unified environment, enhancing collaboration and efficiency.
  • Scalability: Visual Studio is well-suited for managing tags across multiple resources or environments. With the ability to define and deploy resource templates containing tags programmatically, scaling tag management becomes more manageable, especially in large-scale deployments.
  • Consolidated Management: Using Visual Studio for tag management allows you to centralize the configuration of tags alongside other resource settings. This consolidated approach simplifies overall resource management, providing a single interface for configuring and deploying resources and their associated tags.

It is important to note that the choice between Visual Studio and the Azure Portal ultimately depends on your specific requirements, preferences, and existing workflows. While Visual Studio offers certain advantages for tag management, the Azure Portal provides a user-friendly interface that may be more accessible for simple or ad-hoc tag assignments. This way, organizations should evaluate their needs and capabilities to find the most suitable approach for managing tags in their Azure environment.

Of course, in the end, the best solution is to use CI/CD pipelines to accomplish this task.

Hope you find this helpful! If you enjoyed the content or found it useful and wish to support our efforts to create more, you can contribute towards purchasing a Star Wars Lego for Sandro’s son!

Friday Fact: XML to JSON Conversion in API Management and Logic Apps have different behaviors

I don’t know why two products from the same family – Azure Integration Services – behave completely differently when converting XML to JSON, but that is the current reality. It is a fact! API Management and Logic Apps apply this conversion differently, and that is one of the main reasons I decided to create an Azure Function to convert XML into JSON: to keep consistency between these two products.

While using API Management, we can use the xml-to-json policy to convert a request or response body from XML to JSON. However, when dealing with XML namespaces and prefixes, which is quite normal when working with XML messages, the policy has, in my opinion, a strange conversion behavior:

  • It converts the prefixes that in XML are represented by prefix:MyField into prefix$MyField. In other words, it replaces the colon character (:) with the dollar character ($).
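
For reference, this is how the policy is typically applied inside the inbound or outbound policy section (a minimal sketch):

<inbound>
    <base />
    <!-- Convert the request body from XML to JSON -->
    <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
</inbound>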

Let’s take this XML sample so you can see the outcome of that xml-to-json policy:

<section xmlns="http://www.test.com/events" xmlns:bk="urn:loc.gov:books"
         xmlns:pi="urn:personalInformation" xmlns:isbn="urn:ISBN:0-999-99999-9">
    <title>Book-Signing Event</title>
    <signing>
        <bk:author pi:title="Mr" pi:name="My Name" />
        <book bk:title="How cool is XML" isbn:number="9999999999" />
        <comment xmlns="">Convert it to JSON</comment>
    </signing>
</section>

The result will be:

{
    "section": {
        "@xmlns": "http://www.test.com/events",
        "@xmlns$bk": "urn:loc.gov:books",
        "@xmlns$pi": "urn:personalInformation",
        "@xmlns$isbn": "urn:ISBN:0-999-99999-9",
        "title": "Book-Signing Event",
        "signing": {
            "bk$author": {
                "@pi$title": "Mr",
                "@pi$name": "My Name"
            },
            "book": {
                "@bk$title": "How cool is XML",
                "@isbn$number": "9999999999"
            },
            "comment": {
                "@xmlns": "",
                "#text": "Convert it to JSON"
            }
        }
    }
}

I think this behavior is strange and incorrect.

Now, let’s take the same XML payload and convert it inside Logic Apps using the json() expression, which returns the JSON value or object for a string or XML. In this case, we use, for example, the following expression:

  • json(xml(triggerBody()))

The result will be:

{
    "section": {
        "@xmlns": "http://www.test.com/events",
        "@xmlns:bk": "urn:loc.gov:books",
        "@xmlns:pi": "urn:personalInformation",
        "@xmlns:isbn": "urn:ISBN:0-999-99999-9",
        "title": "Book-Signing Event",
        "signing": {
            "bk:author": {
                "@pi:title": "Mr",
                "@pi:name": "My Name"
            },
            "book": {
                "@bk:title": "How cool is XML",
                "@isbn:number": "9999999999"
            },
            "comment": {
                "@xmlns": "",
                "#text": "Convert it to JSON"
            }
        }
    }
}

In this case, the json() expression does not replace the colon character (:) with the dollar character ($) in the prefixes. It maintains them, which I think is the correct behavior.

Too lazy to read? We’ve got you covered! Check out our video version of this content!

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for Sandro’s son! 


Azure Functions to validate XML against DTD

After the release of a set of Azure Functions that help us minimize or completely remove the need for an Integration Account, today I’m going to release a new function – validate XML against DTD – that brings additional capabilities to Logic App Consumption and Standard, since this functionality is not currently supported in either tier, not even with an Integration Account.

DTD? What is a DTD?

Yes, this is probably old school and not often used nowadays. But DTD, which stands for Document Type Definition, allows you to define the structure and the legal elements and attributes of an XML document.

This is a sample of a DTD file for a simple stock document:

<!ELEMENT stock (name, quantity, phone)>
<!ELEMENT name (#PCDATA)>
<!ELEMENT quantity (#PCDATA)>
<!ELEMENT phone (#PCDATA)>
And this is a sample of an XML message with a reference to a DTD:

<?xml version="1.0"?>
<!DOCTYPE stock SYSTEM "stock.dtd">
<stock>
   <name>My stock</name>
   <quantity>nine</quantity>
   <phone>(099) 999-9999</phone>
</stock>

The DOCTYPE declaration above contains a reference to a DTD file.

Although the use of DTDs is not very frequent these days, it is still very common to encounter them in RosettaNet PIPs.

Validate XML against DTD

A Document Type Definition (DTD) is a document that describes the structure of an XML document: what elements and attributes it contains and what values it may have. DTDs form part of the W3C’s XML Standard but are typically considered a separate schema technology and are not usually used in conjunction with other schema formats like XSD.

A DTD document can be embedded within an XML file or can exist on its own. When it is not embedded, normally, there are two ways to reference the DTD:

  • Using the PUBLIC keyword: This format is generally used to declare publicly available DTDs, standard character sets, and commonly used notations

  • Or using the SYSTEM keyword: These entities are not assumed to be known to a receiving system, so they require a full declaration of system identification (path, etc.) when they are exchanged.
    • The SYSTEM identifier specifies the location of the DTD file. Since it does not start with a prefix like http:// or file://, the path is relative to the location of the XML document.

This Azure Function allows you to perform XML validations against a DTD file. The function only accepts DTDs defined using the SYSTEM keyword.

To trigger this function, you need to:

  • Provide, in the body, the XML payload that you want to be validated.
  • Specify the following mandatory headers:
    • Content-Type as text/xml (or application/xml).
    • DTDFileName with the name of the DTD file present in the storage account.

The response will be:

  • 200 OK – Validation successful, if the message is valid.
  • Or 400 Bad Request with a list of errors if something is invalid.
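
For illustration, a call to this function with PowerShell could look like this (the function URL, route, and file names are hypothetical):

# Send the XML payload to the validation function
$xml = Get-Content -Raw -Path .\stock.xml
Invoke-RestMethod -Method Post `
    -Uri "https://my-function-app.azurewebsites.net/api/ValidateXmlAgainstDtd" `
    -ContentType "text/xml" `
    -Headers @{ DTDFileName = "stock.dtd" } `
    -Body $xml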

Where can I download it?

You can download the complete Azure Functions source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 


Azure Function to Apply XML Validation (Advanced)

After the release of our previous XML Validation Functions, it is now time to release our last Azure Function under the same context: an Azure Function to Apply XML Validation (Advanced).

As I explained in my previous posts, all the out-of-the-box Azure Integration Services capabilities to validate XML have a huge limitation: they don’t allow us to have a chain of XML Schemas! As I also mentioned, this is a common feature in many Enterprise XML Schema definitions: EDI or RosettaNet Schemas may have 2 or more schemas that define the overall structure of the messages. This blocks many enterprise scenarios that we need to address in our Azure integration solutions.

Our previous Azure Function allows us to solve many of those scenarios, and it can be used inside Logic Apps Consumption or Standard, or even inside API Management. However, it also has a limitation: it only provides first-level chain support for XML Schemas (meaning that it only takes into consideration the schemas imported directly by the main XML Schema).

However, for example, in RosettaNet Schemas, it is very common that our main schema imports a “child” XML Schema and that child schema imports or includes other XML Schemas itself.

Apply XML Validation (Advanced)

What does this Azure Function do?

This Azure Function allows you to perform XML validations against an XML Schema, including support for full chains of XML Schemas. That means it takes into consideration the full depth of imports for a specific message type: it recursively resolves all included or imported XML Schemas, supporting this way all types of XML message validation.

To trigger this function, you need to:

  • Provide, in the body, the XML payload that you want to be validated.
  • Specify the following mandatory headers:
    • Content-Type as text/xml (or application/xml).
    • SchemaFileName with the name of the XML Schema (XSD) file present in the storage account.

The response will be:

  • 200 OK – Validation successful, if the message is valid.
  • Or 400 Bad Request with a list of errors if something is invalid.

Notice that, by default, validation would return OK if you sent a message against a schema that doesn’t have the same target namespace and root node. This function also validates the message type, so if you send a message that doesn’t correspond to that schema, you will get a 400 Bad Request.

Where can I download it?

You can download the complete Azure Functions source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Thanks to my team member Luís Rigueira for helping me realize and implement this idea.


Azure Function to Apply XML Validation (Intermedium)

After the release of our Azure Function to Apply XML Validation (Basic), it is now time for another Azure Function under the same context: an Azure Function to Apply XML Validation (Intermedium).

You may be wondering: why is a new Azure Function required to achieve the same thing? And why not a single function with all the capabilities?

Both are good questions, and I will be happy to answer them. First of all, in my last blog post, I mentioned that in the next few days we would be releasing two additional versions of this function with more functionalities/capabilities. As for the second question, the main reason I decided to create 3 different versions is performance. The basic function has fewer capabilities, but it has better performance. Of course, the advanced function has all the capabilities, but it carries a small overhead in the overall performance.

The previous Azure Function is great for basic validations, similar to what we can achieve using the default out-of-the-box capabilities inside:

  • Logic Apps Standard;
  • Logic Apps Consumption using the Integration Account;
  • Or in API Management

Of course, the basic XML Validation Function is a good approach to removing the dependency on the Integration Account in Logic App Consumption.

But all of them have a huge limitation! None of those services allow a chain of XML Schemas!

What do you mean by a chain of XML Schemas?

XML Schema provides mechanisms to include or import other XML Schema documents, enabling the reuse and extension of schema definitions across multiple files. This capability is essential for managing complex schemas in a modular and maintainable manner.

  1. Include: The include element is used when you want to incorporate definitions from another schema that is in the same target namespace. By using include, you can split your schema definitions into separate, smaller files for better manageability and readability while treating them as part of a single schema during validation. The included schema essentially becomes a subset of the including schema, allowing for the extension or redefinition of elements and types within the same namespace.
  2. Import: The import element is used to incorporate definitions from another schema that is in a different target namespace or from no namespace into the current schema. This allows you to reference and use types and elements defined in an external schema within your current schema document. Importing is crucial when you need to integrate or reference types defined in a completely separate schema, possibly managed by a different organization or standard body.

Both include and import mechanisms facilitate the construction of complex XML schemas from modular components, promoting reuse and simplifying the management of schema definitions. They enable schema designers to build upon existing standards and to organize their schema definitions logically and efficiently.
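
As a small sketch of what a chain looks like, here is a main schema that imports a type from a different namespace and includes a sibling file from its own namespace (namespaces and file names are illustrative):

<!-- main.xsd: pulls in definitions from two other schema files -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/order"
           xmlns:addr="http://example.com/address">
  <!-- import: brings in a schema with a different target namespace -->
  <xs:import namespace="http://example.com/address" schemaLocation="address.xsd" />
  <!-- include: brings in a schema with the same target namespace -->
  <xs:include schemaLocation="order-types.xsd" />
</xs:schema>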

This is a common feature present in many Enterprise XML Schema definitions. EDI or RosettaNet Schemas may have 2 or more schemas that define the overall structure of the messages.

Apply XML Validation (Intermedium)

What does this Azure Function do?

This Azure Function allows you to perform XML validations against an XML Schema, including first-level chain support for XML Schemas. That means it takes into consideration all the schemas imported directly by the main XML Schema.

To trigger this function, you need to:

  • Provide, in the body, the XML payload that you want to be validated.
  • Specify the following mandatory headers:
    • Content-Type as text/xml (or application/xml).
    • SchemaFileName with the name of the XML Schema (XSD) file present in the storage account.

The response will be:

  • 200 OK – Validation successful, if the message is valid.
  • Or 400 Bad Request with a list of errors if something is invalid.

Notice that, by default, validation would return OK if you sent a message against a schema that doesn’t have the same target namespace and root node. This function also validates the message type, so if you send a message that doesn’t correspond to that schema, you will get a 400 Bad Request.

Where can I download it?

You can download the complete Azure Functions source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Thanks to my team member Luís Rigueira for helping me realize and implement this idea.


Azure Function to Apply XML Validation (Basic)

After the release of our two previous Azure Functions that help us minimize or completely remove the need for an Integration Account, it is now time to release another Azure Function that replaces another Integration Account functionality: an Azure Function to Apply XML Validation.

Of course, you can apply this functionality out of the box in:

  • Logic App Standard (without the need for an Integration Account)

However, Logic App Consumption requires an Integration Account to provide those same capabilities out of the box.

Once again, our main objective in creating this specific version of the function was to use it inside Logic Apps Consumption to avoid the need for an Integration Account. But we will address that later in another blog post.

If you are wondering what I mean by this specific version of the function: in the next few days, we will be releasing two additional versions with more functionalities. But, once again, we will address that later in another blog post.

Apply XML Validation (Basic)

XML (Extensible Markup Language) validation is the process of checking an XML document against a set of rules to ensure its structure and content adhere to a specific format or standard. This process is crucial for ensuring that the XML document is both well-formed and valid.

  1. Well-formed XML: This means that the XML document follows the basic syntax rules laid out by the XML specification. These rules include proper nesting of elements, correct use of opening and closing tags, attribute value quoting, and more. A well-formed XML document is one that can be correctly parsed and understood by an XML parser.
  2. Valid XML: Beyond being well-formed, a valid XML document also adheres to a specific schema or Document Type Definition (DTD) that defines the structure, content, and relationships within the document. Validation against a schema or DTD ensures that the XML document contains the expected elements, attributes, and data types, and that these components are organized in a defined way.

There are several schema languages used for XML validation, with the most common being:

  • DTD (Document Type Definition): An older schema language that defines the structure and allowed content within an XML document. – Not supported in the Azure Function.
  • XML Schema (also known as XSD): A more powerful and expressive schema language that allows for more detailed specifications of the content and structure, including data types and namespace support.

XML validation is performed using XML parsers or validation tools, which can programmatically check a document against its DTD or XSD to ensure compliance. This is a critical step in many data exchange, configuration management, and content authoring workflows, ensuring that the data is correctly structured and interpretable by receiving systems or applications.
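
As a rough illustration of what such a validation does internally, here is a minimal PowerShell sketch that validates an XML file against an XSD using .NET’s XmlReader (file names are illustrative; the function’s actual implementation may differ):

# Validate order.xml against order.xsd using .NET's validating XmlReader
$settings = New-Object System.Xml.XmlReaderSettings
$settings.ValidationType = [System.Xml.ValidationType]::Schema
$null = $settings.Schemas.Add($null, "order.xsd")  # $null = take the target namespace from the schema
$script:errors = @()
$settings.add_ValidationEventHandler({ param($s, $e) $script:errors += $e.Message })
$reader = [System.Xml.XmlReader]::Create("order.xml", $settings)
try { while ($reader.Read()) { } } finally { $reader.Close() }  # reading the document triggers validation
if ($script:errors.Count -eq 0) { "Validation successful." } else { $script:errors }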

What does this Azure Function do?

This Azure Function allows you to perform basic XML validations against an XML Schema.

To trigger this function, you need to:

  • Provide, in the body, the XML payload that you want to be validated.
  • Specify the following mandatory headers:
    • Content-Type as text/xml (or application/xml).
    • SchemaFileName with the name of the XML Schema (XSD) file present in the storage account.

The response will be:

  • 200 OK – Validation successful, if the message is valid.
  • Or 400 Bad Request with a list of errors if something is invalid.

Notice that, by default, validation would return OK if you sent a message against a schema that doesn’t have the same target namespace and root node. This function also validates the message type, so if you send a message that doesn’t correspond to that schema, you will get a 400 Bad Request.

Where can I download it?

You can download the complete Azure Functions source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Thanks to my team member Luís Rigueira for helping me realize and implement this idea.


Logic App Consumption Bulk Failed Runs Resubmit Tool

Last week, we posted a Logic App Best Practices, Tips, and Tricks article about the ability to resubmit multiple runs at once and how that process can be tedious and sometimes complicated. Luckily for us, new features have appeared recently, and that process has been somewhat simplified. Nevertheless, it is still a little confusing.

At the time we were investigating those capabilities, we were finalizing the development of a tool to achieve that goal. Since this tool had already been developed, we decided to make it available.

Logic App Consumption Bulk Failed Runs Resubmit Tool

This is a simple .NET Windows application that allows you to easily resubmit multiple Logic App Consumption runs at once.

To achieve that, you need to already be authenticated in the Azure Portal, and you need to provide the following parameters:

  • Logic App name;
  • Resource Group;
  • and Subscription ID;
  • Optionally, you can select a DateTime range to filter the failed runs you need to resume.
    • If you don’t select a range, all failed runs up to a maximum of 250 will be presented.
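
A tool like this presumably drives the Logic Apps REST API under the hood: resubmitting one run is the trigger-history resubmit operation. A minimal sketch for a single run with Az PowerShell (all identifiers are illustrative):

# Resubmit a single failed run of a Logic App Consumption workflow
$path = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup" +
        "/providers/Microsoft.Logic/workflows/$logicAppName" +
        "/triggers/$triggerName/histories/$runId/resubmit?api-version=2016-06-01"
Invoke-AzRestMethod -Method POST -Path $path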

Where can I download it?

You can download the complete Logic App Consumption Bulk Failed Runs Resubmit tool source code here:

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Thanks to my team member Luís Rigueira for being the mentor of this idea.
