BizTalk Server WCF-SQL: The certificate chain was issued by an authority that is not trusted.

Today, while trying to generate a new WCF-SQL Schema from Visual Studio, we got the following error:

Connecting to the LOB system has failed. 
A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 – The certificate chain was issued by an authority that is not trusted.).

Cause

That error occurred because we had previously tried to generate an SQL schema connecting to an Azure SQL Database, and for that, we needed to set the Encrypt property to true.

  • The Encrypt property specifies whether SQL Server (with a valid certificate installed) uses SSL encryption for all data sent between SQL Server and the client. The default value is false.

Why am I referring to this? The Consume Adapter Service wizard caches the settings from the previous execution, unless this is the first time you have run the tool since opening the Visual Studio project. That means:

  • The first time you run the Consume Adapter Service wizard after opening the Visual Studio project, it uses the default settings.
  • While that Visual Studio instance stays open, subsequent runs of the tool show the settings from previous executions, except for the passwords.
  • If you close Visual Studio, those settings are cleared.

In our case, we were trying to connect to a local database, which doesn’t use certificates, so the Encrypt setting needs to be turned off.

Solution

The solution to this problem is quite simple but not easy to understand, given the error information and the Visual Studio behavior of caching settings.

  • On the Consume Adapter Service wizard tool, click on Configure.
  • On the Configure Adapter window, click on the Binding Properties tab and set the Encrypt property to False.

After that, we were, once again, able to successfully connect to SQL Server.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Logic App Transformation Error: InvalidXml. The XML validation failed. Data at the root level is invalid.

In this saga of Errors and Warnings, Causes, and Solutions in Azure – if you follow me, you know that this is one of my favorite topics – I got a fancy one. If you work in the Azure Integration space, you may be aware that the Integration Services Environment (ISE) will be retired on 31 August 2024. That means, if you have ISE, you must migrate your solutions to Logic App Consumption and/or Logic App Standard.

One of the tasks we are doing these days is, in fact, migrating that ISE solution. We are also taking this opportunity to improve our current solutions. And in one of these processes, we got the following error in one of our Logic Apps while trying to apply a transformation:

InvalidXml. The XML validation failed. Data at the root level is invalid. Line 1, position 1.

Cause

One of the main issues when migrating transformations from the old Logic App solution is that all the Liquid transformation actions:

  • Transform JSON to JSON
  • Transform JSON to TEXT
  • Transform XML to TEXT
  • and Transform XML to JSON

share the same icon, which makes it virtually impossible to tell in the Logic App designer what type of transformation we are applying without going to the code view.

The only action with a different icon is the Transform XML action.

But this only happens in the new designer. If you are working inside Visual Studio, where only the old designer is available, it is even worse because all transformations use the same icon:

Normally, when this problem occurs, it means that you have chosen the incorrect Liquid Transformation type. Or the type of the map is incorrect.

In my case, I used the Transform XML action, and I should have used the Transform JSON to JSON.

Solution

Of course, the solution is quite simple. We have to select the correct action! But the main question is how to know what action to use.

If you look at the action in the designer, it doesn’t tell you anything, as you can see in the picture below:

Instead, you have to switch to code view to really understand what you are doing inside that action:

Type Xslt means that this is a Transform XML from the XML connector.

Now, the second question you may ask is: can I fix the existing action without having to delete it, add a new one, and reconfigure it again?

Yes, you can, but not in the Logic App designer. You can fix this by switching to Code View and specifying:

  • The type property as Liquid.
  • When you specify the transformation as Liquid, you also need to state which kind of Liquid transformation it is, by adding the following property:
    • kind as, for example, JsonToJson.
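In code view, the difference between the two actions comes down to those properties. As a sketch of what the corrected action could look like (the action and map names below are illustrative, not taken from the original workflow):

```json
{
  "Transform_JSON_to_JSON": {
    "type": "Liquid",
    "kind": "JsonToJson",
    "inputs": {
      "content": "@triggerBody()",
      "integrationAccount": {
        "map": { "name": "MyLiquidMap" }
      }
    },
    "runAfter": {}
  }
}
```

The incorrect action I had picked showed "type": "Xslt" instead, which is why the engine tried to validate my JSON payload as XML.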

While you are migrating those solutions, especially these transformation actions, checking the type of transformation in the code view will help you avoid future problems.


API Management Terraform deployment error: A resource with the ID already exists

Following my last blog post: API Management Terraform deployment error: The user, group or application ‘does not have secrets get permission on key vault. Last week, while developing and deploying an API Management Terraform Visual Studio solution into our development environment, we got another error:

Error: A resource with the ID “/subscriptions//resourceGroups/RG-name/providers/Microsoft.ApiManagement/service//apis//diagnostics/applicationinsights” already exists – to be managed via Terraform this resource needs to be imported into the State. Please see the resource documentation for “azurerm_api_management_api_diagnostic” for more information.

Cause

The error message you’re encountering during your Terraform deployment indicates that the resource you’re trying to create already exists in your infrastructure but isn’t recognized in Terraform’s state file. This situation typically occurs if the resource was created outside of Terraform or if the state file was modified or lost.

In my case, two of us were working on the same project without knowing it. I fixed some issues on the project and successfully deployed the solution. When I realized that my team member was also working on the project, adding extra functionality I had asked for, I reverted my changes, and we replicated the small fixes I had made on his side.

When he tried to deploy the solution, he got the error above.

Solution

The solution to this problem is simple: you’ll need to import the existing resource into your Terraform state file (terraform.tfstate) so that Terraform can manage it. In this particular case, we had to run the following command:

terraform import azurerm_api_management_api_diagnostic.api_sap "/subscriptions//resourceGroups/RG-name/providers/Microsoft.ApiManagement/service//apis//diagnostics/applicationinsights"

After we imported this resource to the Terraform state file, we were able to deploy our solution once again.
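For context, terraform import binds the existing Azure resource to the matching resource block in your configuration, so a block like the following must already exist in the project (attribute values here are placeholders, not the real project’s values):

```hcl
resource "azurerm_api_management_api_diagnostic" "api_sap" {
  identifier               = "applicationinsights"
  resource_group_name      = "RG-name"
  api_management_name      = "apim-instance-name"
  api_name                 = "api-name"
  api_management_logger_id = azurerm_api_management_logger.appinsights.id
}
```

The address in the import command (azurerm_api_management_api_diagnostic.api_sap) must match the resource type and name of that block exactly.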


API Management Terraform deployment error: The user, group or application ‘does not have secrets get permission on key vault

Last week, I got, once again, the following error while trying to deploy an API on API Management using a Terraform Visual Studio project:

Error: making Read request on Azure KeyVault Secret KVS-Secret-Name: keyvault.BaseClient#GetSecret: Failure responding to request: StatusCode=403 — Original Error: autorest/azure: Service returned an error. Status=403 Code=”Forbidden” Message=”The user, group or application ‘appid=;oid=;numgroups=69;iss=https://sts.windows.net//’ does not have secrets get permission on key vault ‘KeyVault-Name;location=location’. For help resolving this issue, please see https://go.microsoft.com/fwlink/?linkid=2125287? InnerError={“code”:”AccessDenied”}

I say once again because I usually forget to ask for permission. On this client, we don’t have access to Key Vaults by default; instead, we need to request permission for a limited amount of time.

Cause

The message makes the cause quite clear. The error you’re encountering during the Terraform deployment related to API Management indicates a permissions issue with Azure Key Vault. Specifically, the user, group, or application attempting the operation does not have the necessary permission to retrieve secrets from the Azure Key Vault.

In my case, I forgot to ask for PIM permission to access the Key Vault. But the biggest issue with this error is that even after I asked for permission and it was approved, no matter what I did, I was still getting the same error! I double-checked: I was able to access the Key Vault using the Azure Portal, but I was not able to deploy using Visual Studio.

The reason this happens is that when you do an az login, your access policies are cached on your machine at that point in time. Even if you later add more privileges in the Azure Portal, they will not be reflected in the cached session.

Solution

Of course, this problem has two solutions:

  • First, you need to make sure that you have access to the Key Vault. For Terraform deployments like this one, your user typically needs the Get permission on secrets.
  • Second, if the error persists, clear the cache by using the following set of commands:
az logout
az account clear
az login

The az account clear command is used with the Azure CLI (Command-Line Interface) to clear all subscriptions from the local cache. This effectively logs out the user from all the accounts in the current Azure CLI session. Here’s what happens when you use this command:

  1. Clearing Cached Credentials: It removes the cached credentials, which means any subsequent Azure CLI commands requiring authentication will prompt you to log in again.
  2. Session Reset: It effectively resets the Azure CLI session, ensuring that no residual or potentially incorrect or outdated authentication details are used in future commands within the same session.
  3. Security Practice: This can be a good security practice, especially when using shared or public machines, to ensure that no sensitive information or access tokens are left accessible to others after your session.
  4. Use Case: You might use this command when you need to switch accounts or ensure that no session data persists for security reasons.

Using az account clear is straightforward but should be done with the understanding that you will need to re-authenticate to continue interacting with Azure resources via the CLI.

Thanks Michael Stephenson for this tip!


Friday Fact: It is possible to create an XML first-class experience inside Logic Apps.

By default, Logic Apps provides a good first-class experience for JSON. That means that when we receive a JSON message, we can tokenize its elements by using a JSON schema inside, for example, a Parse JSON action:

This will basically create shortcuts or pointers to the fields of the message that we can easily use in other actions without worrying about the query path to each field:

These tokens or pointers are, in fact, shortcuts to the query path of those fields, providing a good first-class experience:

  • body('Parse_JSON')?['firstName']
  • body('Parse_JSON')?['lastName']

These two are simple, but they can be a little bit more complex:

  • outputs('Compose_Response_JSON')['Result']['Address']['PostalCode']

Unfortunately, many people complain that we don’t have the same capabilities for XML messages. And that’s true: by default, XML messages don’t have this first-class experience to access the fields of the message, but that doesn’t mean that we cannot have that experience!

A quick and simple workaround to implement a first-class experience for XML messages inside Logic Apps is to use the json() expression to convert the XML into its JSON equivalent. You can learn more about how to apply this conversion here: Logic App Best Practices, Tips, and Tricks: #41 How to convert XML into JSON inside Logic Apps

For example, if we have the following XML message:


<person>
	<firstName>Sandro</firstName>
	<lastName>Pereira</lastName>
</person>

We can use the following expression to convert the XML message to JSON:

  • json(xml(outputs(‘Compose_XML’)))

Of course, we then need to provide a JSON Schema that corresponds to that message, and that will allow you to have a first-class experience for that XML message.
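Putting the pieces together, the Parse JSON action in code view could look something like this (action names and the schema are illustrative, not from a real workflow):

```json
{
  "Parse_JSON": {
    "type": "ParseJson",
    "inputs": {
      "content": "@json(xml(outputs('Compose_XML')))",
      "schema": {
        "type": "object",
        "properties": {
          "person": {
            "type": "object",
            "properties": {
              "firstName": { "type": "string" },
              "lastName": { "type": "string" }
            }
          }
        }
      }
    },
    "runAfter": {}
  }
}
```

From there, the firstName and lastName tokens become available in the designer exactly as they would for a native JSON message.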

Just a simple trick that can help you in many processes.

Too lazy to read? We’ve got you covered! Check out our video version of this content!

Join us next Friday for another Friday Fact!


Logic App Consumption Visual Studio Deployment: Establishing connection with the service failed with code ‘DMTS_EncryptClusterCredentialsErrorCode’

This week while trying to deploy a Logic App Consumption project to the DEV environment using Visual Studio 2019 deployment, I kept getting the following error:

Status Message: Establishing connection with the service failed with code ‘DMTS_EncryptClusterCredentialsErrorCode’.

I knew that this was not “my code’s” fault since I had successfully deployed that solution several times, and this time, I was just reconfiguring API connection settings.

Cause

If you receive a DMTS_EncryptClusterCredentialsErrorCode error code, it means that your On-premises Data Gateway (OPDG) is not working. For that reason, it cannot deploy the API Connection that your Logic Apps are using.

Of course, this error only occurs if you are using OPDG in some of your connections. In my case, one of my API Connections uses an On-Premise Data Gateway to connect to a system, in this case, an SAP system.

Solution

Solving this issue is quite simple. You need to make sure that your On-premises Data Gateway (OPDG) is running properly.

In my case, the VM where the OPDG agent was installed was turned off. Turning the VM on and waiting a few minutes for the gateway to come online was all I needed to deploy my solution successfully once again.


BizTalk WCF-SQL Adapter Error: Retrieval of Operation Metadata has failed while building WSDL at TypedProcedure – Invalid Object name

Following yesterday’s blog post, on the same note, when we tried to generate a WCF-SQL stored procedure schema for a new BizTalk Server solution, we got the following error inside Visual Studio:

Error while retrieving or generating the WSDL. Adapter message: Retrieval of Operation Metadata has failed while building WSDL at ‘TypedProcedure/dbo/’

Once again, this is quite a generic error, and to gain more insight, you need to click on Details.

Microsoft.ServiceModel.Channels.Common.MetadataException: Retrieval of Operation Metadata has failed while building WSDL at ‘TypedProcedure/dbo/’ —> System.Data.SqlClient.SqlException: Invalid object name ‘table-name.column-name’.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)

Cause

This error normally means that something is broken inside your stored procedure. In other words, the stored procedure is not working properly and raises an error when executed.

In my case, the stored procedure was doing an inner join with a table that belonged to another database, and someone forgot to fully qualify that table with the database name, something like [database-name].[dbo].[table-name].

Solution

Solving this issue is quite simple:

  • Open SQL Server Management Studio (SSMS) and execute the stored procedure you want to use to generate the schema.
  • Check what the error is in the output window and fix it.
  • Repeat this process until you are able to successfully execute the stored procedure.
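As a sketch of that debugging loop in T-SQL (procedure, table, and column names here are hypothetical, just to illustrate the pattern):

```sql
-- Execute the stored procedure exactly as the adapter will call it,
-- so any runtime error surfaces in the SSMS output window:
EXEC [dbo].[usp_GetPendingOrders];

-- A join against a table in another database must use three-part naming;
-- otherwise SQL Server reports "Invalid object name":
SELECT o.OrderId, c.CustomerName
FROM dbo.Orders AS o
INNER JOIN [OtherDatabase].[dbo].[Customers] AS c
    ON c.CustomerId = o.CustomerId;
```

Once the procedure runs cleanly in SSMS, the WCF-SQL wizard should be able to build the WSDL for it.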

After doing that, I was able to successfully generate the SQL schema for that stored procedure.


BizTalk WCF-SQL Adapter Error: Retrieval of Operation Metadata has failed while building WSDL at TypedPolling

Today while importing/generating a WCF-SQL schema for a new BizTalk Server solution, we got the following error inside Visual Studio:

Error while retrieving or generating the WSDL. Adapter message: Retrieval of Operation Metadata has failed while building WSDL at ‘TypedPolling’

This is quite a generic error, and to get more insight, you need to click on Details.

Microsoft.ServiceModel.Channels.Common.MetadataException: Retrieval of Operation Metadata has failed while building WSDL at ‘TypedPolling’ —> System.Data.SqlClient.SqlException: Could not find stored procedure ‘dbo.’.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)

Cause

As the description states, the generator tool was not able to find the stored procedure in the SQL database.

Because this was an inbound operation (polling), the first task I performed was to validate the existence of the stored procedure and determine whether there was a spelling error in the configuration of the polling statement. No spelling error was observed, and the stored procedure existed in the database.

The second thing you should do is click Disconnect and make sure that the URI configuration is correct with regard to the SQL Server name, the instance name, and the database name.

In my case, I forgot to specify the database name, that is, the InitialCatalog property.

Because we had configured the Server and InstanceName properties, we were able to connect to the server, but when the tool tried to generate the schemas, it didn’t know which database to connect to.

Solution

Solving this issue is quite simple:

  • You need to make sure that you properly configure the URI properties:
    • Server property, with the name or network address of the host machine on which SQL Server resides.
    • InstanceName property, with the instance name of the SQL Server to connect to. If you need to connect to the default instance, leave this property empty.
    • InitialCatalog property, with the name of the database to connect to.
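Put together, the WCF-SQL connection URI that the wizard builds from these properties follows this general pattern (placeholders shown; if my understanding of the adapter’s URI scheme is right, the instance segment is simply left empty for the default instance):

```
mssql://<server>[:<port>]/<instance-name>/<database-name>

mssql://mysqlserver//AdventureWorks     (default instance)
mssql://mysqlserver/SQL2019/AdventureWorks
```

Leaving the database segment empty is exactly what produces the “Could not find stored procedure” error above, because the connection lands in the login’s default database instead.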

After doing that, I was able to successfully generate the SQL schemas.


Logic App Best Practices, Tips, and Tricks: #42 How to convert JSON into XML inside Logic Apps

Welcome again to another Logic Apps Best Practices, Tips, and Tricks. In my previous blog post, I discussed a basic trick for developers: how to convert XML into JSON inside Logic Apps. Today we cover the inverse process: How to convert JSON into XML inside Logic Apps.

If you need to convert JSON messages into XML inside your workflows, you don’t need to create an Azure Function, an API, or any other service! You can use out-of-the-box capabilities.

How to convert JSON into XML inside Logic Apps

Once again, today, I want to share something simple.

It is possible to convert a JSON message into XML inside Logic Apps using only expressions. This means that we don’t need to perform an API Management API call, an Azure Function, or any other external service to accomplish this translation, similar to what happens with the opposite process: converting XML into JSON.

Let’s examine a sample to better understand it. The idea is to convert the following JSON into its equivalent XML format:

{
  "person": {
    "name": "Sandro Pereira",
    "city": "Portugal"
  }
}

To try this, we can create a simple Logic App that accepts an HTTP JSON request and returns its equivalent in XML. To do that, we need the following:

  • Create a Logic App and add a When a HTTP request is received trigger.
    • Leave the trigger as is.
  • For better visibility and perception, let’s add an Initialize Variable action – this is optional
    • And set the following expression on the Value property:
      • xml(triggerBody())
  • Finally, add a Response action and set the body to be the variable we initialized before.
    • Note that the previous step is optional because we can use the expression directly in the response.
  • Save your Logic App.
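In code view, the minimal version of this workflow could look like the following, using the expression directly in the Response as noted above (action names are illustrative):

```json
{
  "Response": {
    "type": "Response",
    "kind": "Http",
    "inputs": {
      "statusCode": 200,
      "headers": { "Content-Type": "application/xml" },
      "body": "@xml(triggerBody())"
    },
    "runAfter": {}
  }
}
```

The xml() expression does all the work here; the optional variable in the steps above only exists to make the converted payload easier to inspect in the run history.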

Now, if you send a JSON request using, for example, Postman, you will notice that the output is the message you sent in XML format:


<person>
	<name>Sophia Owen</name>
	<city>Seattle</city>
</person>

In case we have a simple JSON message without an element that can serve as an XML root node, like, for example, this one:

{
    "id": "xxxx-xxxx-xxxx",
    "name": "LA-FridayFactTest-POC",
    "type": "Microsoft.Logic/workflows",
    "location": "westeurope",
    "run": {
        "id": "/subscriptions/xxxx-xxxx-xxxx/resourceGroups/xxxx/providers/Microsoft.Logic/workflows/LA-FridayFactTest-POC/runs/xxxxxxxxxxxxxxxxxxxx",
        "name": "xxxxxxxxxxxxxxxxxxxx",
        "type": "Microsoft.Logic/workflows/runs"
    }
}

The runtime will add a default root node called root:


<root>
	<id>xxxx-xxxx-xxxx</id>
	<name>LA-FridayFactTest-POC</name>
	<type>Microsoft.Logic/workflows</type>
	<location>westeurope</location>
	<run>
		<id>/subscriptions/xxxx-xxxx-xxxx/resourceGroups/xxxx/providers/Microsoft.Logic/workflows/LA-FridayFactTest-POC/runs/xxxxxxxxxxxxxxxxxxxx</id>
		<name>xxxxxxxxxxxxxxxxxxxx</name>
		<type>Microsoft.Logic/workflows/runs</type>
	</run>
</root>

If we want to provide a custom root name for that XML message, we can add a compose action or a variable and add the desired root name by composing a new JSON message like this:

And on the Response action, we will convert the output of the compose action into XML:
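A sketch of those two actions in code view (the action names and the root name MyRoot are just examples):

```json
{
  "Compose_With_Root": {
    "type": "Compose",
    "inputs": {
      "MyRoot": "@triggerBody()"
    },
    "runAfter": {}
  },
  "Response": {
    "type": "Response",
    "kind": "Http",
    "inputs": {
      "statusCode": 200,
      "body": "@xml(outputs('Compose_With_Root'))"
    },
    "runAfter": {
      "Compose_With_Root": ["Succeeded"]
    }
  }
}
```

Because the composed JSON now has a single top-level property, the xml() expression uses it as the root element instead of generating the default root node.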

Note: don’t try to use the json() expression alongside the xml() expression like this:

  • xml(json(triggerBody()))

Because you will most likely get an error message similar to this one:

Unable to process template language expressions in action ‘Compose’ inputs at line ‘0’ and column ‘0’: ‘The template language function ‘json’ expects its parameter to be a string or an XML. The provided value is of type ‘Object’. Please see https://aka.ms/logicexpressions#json for usage details.’.

I hope you enjoy this developer tip and stay tuned for the following Logic App Best practices, Tips, and Tricks.


Logic App Best Practices, Tips, and Tricks: #41 How to convert XML into JSON inside Logic Apps

Welcome again to another Logic Apps Best Practices, Tips, and Tricks. In my previous blog posts, I discussed one good trick for administrators. Today we focus on a best-practice tip and trick for developers: How to convert XML into JSON inside Logic Apps.

Sometimes, we tend to overcomplicate things, do not follow direct paths, or use obscure shortcuts. I like to keep things clean and simple and use strategies that can improve and speed up our work. I also think about the best way to manage those processes in the long term without overcomplicating things.

How to convert XML into JSON inside Logic Apps

Today I want to share something simple.

It is possible to convert an XML message into JSON inside Logic Apps using expressions. This means that we don’t need to perform an API Management API call, an Azure Function, or any other external service to accomplish this translation.

Let’s examine a sample to understand it better. The idea is to convert the following XML into its equivalent JSON format:

<ns0:CustomerOrder xmlns:ns0="http://Northwind.BusinessSolution.Schemas.CustomerOrder">
  <CustomerName>Sandro Pereira</CustomerName>
  <CustomerContact>961098121</CustomerContact>
  <CustomerPONumber>A1245FD5</CustomerPONumber>
  <Priority>5</Priority>
  <QuoteReferenceId>12345</QuoteReferenceId>
  <SalesOrderReferenceID>SN9019</SalesOrderReferenceID>
  <CreatedDate>2015-09-11</CreatedDate>
  <FreightAmount>0</FreightAmount>
  <MiscCharges>0</MiscCharges>
  <SubTotal>0</SubTotal>
  <TotalOrderAmount>10</TotalOrderAmount>
  <Lines>
    <SaleOrderLine>
      <LineNumber>1</LineNumber>
      <UnitPrice>1</UnitPrice>
      <Quantity>10</Quantity>
      <ItemID>10</ItemID>
    </SaleOrderLine>
    <SaleOrderLine>
      <LineNumber>1</LineNumber>
      <UnitPrice>2</UnitPrice>
      <Quantity>5</Quantity>
      <ItemID>11</ItemID>
    </SaleOrderLine>
  </Lines>
  <Status>PreOrder</Status>
  <Comments></Comments>
  <UnitPrice>0</UnitPrice>
  <ns1:ShippingAddress xmlns:ns1="http://Northwind.BusinessSolution.Schemas.SchemaCommonStructures">
    <City>Crestuma</City>
    <Country>Portugal</Country>
    <Street>Crestuma</Street>
    <Email>sandro.pereira@devscope.net</Email>
    <Email>sandro-pereira@live.com.pt</Email>
    <Fax></Fax>
    <State>Porto</State>
    <Zip>4415</Zip>
    <Phone>961098121</Phone>
  </ns1:ShippingAddress>
</ns0:CustomerOrder>
To try this, we can create a simple Logic App that accepts an HTTP XML request and returns its equivalent in JSON. To do that, we need the following:

  • Create a Logic App and add a When a HTTP request is received trigger.
    • Leave the trigger as is.
  • For better visibility and perception, let’s add an Initialize Variable action – this is optional
    • And set the following expression on the Value property:
      • json(xml(triggerBody()))
  • Finally, add a Response action and set the body to be the variable we initialized before.
    • Note that the previous step is optional because we can use the expression directly in the response.
  • Save your Logic App.
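The steps above can be sketched in workflow definition (code view) terms roughly as follows. This is a trimmed, hedged sketch, not the full definition the designer generates, and the variable name JSONOutput is illustrative:

{
  "triggers": {
    "When_a_HTTP_request_is_received": {
      "type": "Request",
      "kind": "Http"
    }
  },
  "actions": {
    "Initialize_variable": {
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "JSONOutput",
            "type": "object",
            "value": "@json(xml(triggerBody()))"
          }
        ]
      },
      "runAfter": {}
    },
    "Response": {
      "type": "Response",
      "inputs": {
        "statusCode": 200,
        "body": "@variables('JSONOutput')"
      },
      "runAfter": {
        "Initialize_variable": ["Succeeded"]
      }
    }
  }
}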

Now, if you send an XML request using, for example, Postman, you will notice that the output is the message you sent, in JSON format:

{
  "ns0:CustomerOrder": {
    "@xmlns:ns0": "http://Northwind.BusinessSolution.Schemas.CustomerOrder",
    "CustomerName": "Sandro Pereira",
    "CustomerContact": "961098121",
    "CustomerPONumber": "A1245FD5",
    "Priority": "5",
    "QuoteReferenceId": "12345",
    "SalesOrderReferenceID": "SN9019",
    "CreatedDate": "2015-09-11",
    "FreightAmount": "0",
    "MiscCharges": "0",
    "SubTotal": "0",
    "TotalOrderAmount": "10",
    "Lines": {
      "SaleOrderLine": [
        {
          "LineNumber": "1",
          "UnitPrice": "1",
          "Quantity": "10",
          "ItemID": "10"
        },
        {
          "LineNumber": "1",
          "UnitPrice": "2",
          "Quantity": "5",
          "ItemID": "11"
        }
      ]
    },
    "Status": "PreOrder",
    "Comments": "",
    "UnitPrice": "0",
    "ns1:ShippingAddress": {
      "@xmlns:ns1": "http://Northwind.BusinessSolution.Schemas.SchemaCommonStructures",
      "City": "Crestuma",
      "Country": "Portugal",
      "Street": "Crestuma",
      "Email": [
        "sandro.pereira@devscope.net",
        "sandro-pereira@live.com.pt"
      ],
      "Fax": "",
      "State": "Porto",
      "Zip": "4415",
      "Phone": "961098121"
    }
  }
}

Note that when you want to convert an XML message into JSON, you should use the xml() expression inside the json() expression. This forces the input of json() to be treated as XML.
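For intuition only, the mapping you can see in the output above (elements become keys, attributes become "@"-prefixed keys, repeated elements become arrays) can be mimicked with a small Python sketch. This is an illustration of the conversion behavior, not how the Logic Apps runtime implements json(xml(...)):

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    """Convert an ElementTree element into a dict, mimicking the
    mapping json(xml(...)) produces: attributes become "@"-prefixed
    keys, repeated children become arrays, leaves become text."""
    children = list(elem)
    if not children and not elem.attrib:
        return elem.text or ""
    result = {"@" + k: v for k, v in elem.attrib.items()}
    for child in children:
        value = xml_to_dict(child)
        if child.tag in result:  # repeated element -> array
            if not isinstance(result[child.tag], list):
                result[child.tag] = [result[child.tag]]
            result[child.tag].append(value)
        else:
            result[child.tag] = value
    return result

sample = "<Order><Line>1</Line><Line>2</Line><Status>PreOrder</Status></Order>"
root = ET.fromstring(sample)
print(json.dumps({root.tag: xml_to_dict(root)}))
# -> {"Order": {"Line": ["1", "2"], "Status": "PreOrder"}}
```

Note that, as in the Logic Apps output above, all leaf values come through as strings.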

The advantages of using json() alongside the xml() expression to convert XML documents into JSON are:

  • They run inside the Logic App runtime, which means they are executed faster;
  • Unlike actions, expressions are not billed, so you can use fewer actions inside your Logic App by placing expressions directly in the action inputs. This reduces workflow complexity and also saves on costs (a little bit).

Sometimes, we wish that all things could be this simple and straightforward, but for the ones that are, let’s enjoy them!

I hope you enjoy this developer tip and stay tuned for the following Logic App Best practices, Tips, and Tricks.

Thanks to Luis Rigueira for this idea for Best practices, tips, and tricks.

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for my son! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.
View all posts by Sandro Pereira