Join me at Azure Logic Apps Community Day 2023 | June 22, 2023 | A walk in the park with the new Logic App Data Mapper

With INTEGRATE 2023 London concluded, it is time to shift gears to the next big event in cloud integration: Azure Logic Apps Community Day 2023, also known as LogicAppsAviators Community Day, which will take place on Thursday, June 22nd at 9 AM Pacific (5 PM UTC). The event is free and will be streamed on YouTube and Twitch, so be sure to subscribe to the Azure Developers YouTube channel to stay up to date.

Azure Logic Apps Community Day 2023 will be the must-attend event for anyone who wants to learn more about Logic Apps and how it can help solve real-life integration problems. It will be a full day of learning, from the basics of getting started to deep dives into advanced automation with Logic Apps, presented by the Logic Apps product group, Microsoft MVPs, and expert community members. The day will close with a Round Table Discussion – an Ask Me Anything with the product group and community – which will be your opportunity to ask the “hard” questions.

I will have the pleasure of delivering a session about the new Data Mapper at this event, and I will also be part of the panel in the Round Table Discussion!

About my session

Session Name: A walk in the park with the new Logic App Data Mapper

Abstract: In this session, we will present the new Data Mapper experience for Logic Apps Standard and show how to apply XML-to-XML or XML-to-JSON transformations using a visual designer. We will also address how to implement well-known mapping patterns such as direct translation, data translation, content enricher, and aggregator, alongside many others.

See the full agenda here: Azure Logic Apps Community Day 2023

Hope you find this helpful! If you liked the content or found it useful and want to help me write more, you can buy (or help buy) my son a Star Wars Lego!

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been implementing integration scenarios both on-premises and in the cloud for various clients, each with different scenarios in terms of technology, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server, and technologies such as AS2, EDI, RosettaNet, SAP, and TIBCO.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books, all focused on integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP every year since 2011 for his contributions to the integration community.
View all posts by Sandro Pereira

Azure Function: JSON Schema Validation (new release)

We just released a new version of our Azure Function JSON Schema Validation, adding support for more complex schema validations. In this release, we added support for applying subschema validation conditionally.

The if, then, and else keywords allow the application of a subschema based on the outcome of another schema, much like the if/then/else constructs you’ve probably seen in traditional programming languages.

  • If if is valid, then must also be valid (and else is ignored). If if is invalid, else must also be valid (and then is ignored).
  • If then or else is not defined, if behaves as if they have a value of true.
  • If then and/or else appear in a schema without if, they are ignored.
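To make these rules concrete, here is a small hand-rolled illustration of the if/then/else behaviour. This is a toy validator I wrote for this post, not the Azure Function's code, and it supports only the few keywords needed for the demonstration:

```python
# Toy JSON Schema validator supporting only const, required,
# properties, and if/then/else -- enough to show the semantics.
def validate(schema, data):
    # "const": the data must match the given value exactly.
    if "const" in schema:
        return data == schema["const"]
    # "required": every listed property must be present.
    for prop in schema.get("required", []):
        if prop not in data:
            return False
    # "properties": each property that is present must match its subschema.
    for prop, sub in schema.get("properties", {}).items():
        if prop in data and not validate(sub, data[prop]):
            return False
    # if/then/else: pick the branch based on whether "if" validates.
    if "if" in schema:
        branch = "then" if validate(schema["if"], data) else "else"
        if branch in schema and not validate(schema[branch], data):
            return False
    return True

schema = {
    "if": {"properties": {"type": {"const": "bill"}}},
    "then": {"required": ["firstName"]},  # bill addresses need a firstName
}

print(validate(schema, {"type": "bill"}))                         # False
print(validate(schema, {"type": "bill", "firstName": "Sandro"}))  # True
print(validate(schema, {"type": "ship"}))                         # True
```

Note how a "ship" record passes even without firstName: the if subschema fails, no else is defined, so nothing extra is enforced.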

JSON Schema Validation Function

The JSON Schema Validation is a simple Azure Function that allows you to validate your JSON message against a JSON Schema, enabling you to specify constraints on the structure of instance data to ensure it meets the requirements.

The function receives a JSON payload with two properties:

  • The JSON message in the json property.
  • And the JSON Schema in the jsonSchema property.

Example:

{
    "json": {
        "address": [
            {
                "contact": {
                    "firstName": "myFirstName",
                    "lastName": "myLastName"
                },
                "type": "bill"
            }
        ]
    },
    "jsonSchema": {
        "type": "object",
        "properties": {
            "address": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "contact": {
                            "type": "object",
                            "properties": {
                                "firstName": {
                                    "type": "string"
                                },
                                "lastName": {
                                    "type": "string"
                                }
                            },
                            "required": []
                        },
                        "type": {
                            "type": "string"
                        }
                    },
                    "required": [
                        "contact",
                        "type"
                    ]
                }
            }
        },
        "if": {
            "properties": {
                "address": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "type": {
                                "const": "bill"
                            }
                        }
                    }
                }
            }
        },
        "then": {
            "properties": {
                "address": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "contact": {
                                "required": [
                                    "firstName"
                                ],
                                "properties": {
                                    "firstName": {
                                        "type": "string"
                                    }
                                }
                            }
                        }
                    }
                }
            }
        },
        "else": {
            "properties": {
                "address": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "contact": {
                                "required": [],
                                "properties": {}
                            }
                        }
                    }
                }
            }
        }
    }
}

The function’s output will be:

  • A 200 OK if the JSON message is valid.
  • Or a 400 Bad Request if there are validation errors/issues.

Where can I download it?

You can download the complete Azure Functions source code here:

Download JSON Schema Validation Azure Function


Big thanks to my team member Luís Rigueira for adding this new feature.


Microsoft Integration and Azure Stencils Pack for Visio: New version available (v8.0.0)

The last time I released a new version of my stencils was on January 26, 2022. A long time ago indeed, so it is fair to say I need to release a new major version, and that will be a long process. However, I decided to approach this task progressively and release minor updates along the way. This makes it easier for me, since I don’t need to block out long periods for the task, and at the same time, all of you can start enjoying the new icons.

What’s new in this version? (for now)

The main goal of this release was to provide the new icons present in the Azure Portal, in the Power Platform, and in newly released services. In this version, the changes and additions are:

  • New shapes: New shapes added on MIS Databases and Analytics Stencils, MIS Azure Additional or Support Stencils, Microsoft Integration Stencils, MIS Azure Stencils, and MIS Power Platform Stencils;
  • SVG Files: Add new SVG files;
  • Special Highlights: Microsoft Fabric and the new Logic App Data Mapper

Microsoft Integration, Azure, Power Platform, Office 365, and much more Stencils Pack

The Microsoft Integration, Azure, Power Platform, Office 365, and much more Stencils Pack is a Visio package that contains fully resizable Visio shapes (symbols/icons) to help you visually represent on-premises, cloud, or hybrid integration and enterprise architecture scenarios (BizTalk Server, API Management, Logic Apps, Service Bus, Event Hub…), solution diagrams, and features or systems that use Microsoft Azure and related cloud and on-premises technologies in Visio 2016/2013:

  • BizTalk Server
  • Microsoft Azure
    • Integration
      • Integration Service Environments (ISE)
      • Logic Apps and Azure App Service in general (API Apps, Web Apps, and Mobile Apps)
      • Azure API Management
      • Messaging: Event Hubs, Event Grid, Service Bus, …
    • Azure IoT and Docker
    • AI, Machine Learning, Stream Analytics, Data Factory, Data Pipelines
    • SQL Server, DocumentDB, CosmosDB, MySQL, …
    • and so on
  • Microsoft Power Platform
    • Microsoft Flow
    • PowerApps
    • Power BI
  • Office365, SharePoint,…
  • DevOps and PowerShell
  • Security and Governance
  • And much more…
  • … and now non-related Microsoft technologies like:
    • SAP Stencils
Microsoft Integration (Azure and much more) Stencils Pack

The Microsoft Integration Stencils Pack is composed of 27 files:

  • Microsoft Integration Stencils
  • MIS Additional or Support Stencils
  • MIS AI and Machine Learning Stencils
  • MIS Apps and Systems Logo Stencils  
  • MIS Azure Additional or Support Stencils
  • MIS Azure Mono Color
  • MIS Azure Old Versions
  • MIS Azure Others Stencils
  • MIS Azure Stencils
  • MIS Buildings Stencils
  • MIS Databases and Analytics Stencils
  • MIS Deprecated Stencils
  • MIS Developer Stencils
  • MIS Devices Stencils
  • MIS Files Stencils
  • MIS Generic Stencils
  • MIS Infrastructure Stencils
  • MIS Integration Fun
  • MIS Integration Patterns Stencils
  • MIS IoT Devices Stencils
  • MIS Office365
  • MIS Power BI Stencils
  • MIS PowerApps and Flows Stencils
  • MIS SAP Stencils
  • MIS Security and Governance
  • MIS Servers (HEX) Stencils
  • MIS Users and Roles Stencils

All of these you can use and resize without losing quality, in particular the new shapes.

Download

You can download the Microsoft Integration, Azure, Power Platform, Office 365, and much more Stencils Pack for Visio from GitHub here:


Natural Language Message Validation with Logic Apps and ChatGPT

In one of my previous blog posts, I spoke about how to configure ChatGPT to be used with Azure Logic Apps. At that time, my team and I created that post just for fun; we didn’t yet have a good idea of how to use it in a real-case integration scenario or to help build integration projects.

During a break at the INTEGRATE 2023 conference, Mike Stephenson showed me an idea he had while listening to one of the talks: using ChatGPT to perform transformations described in natural language, thereby replacing the maps. Needless to say, it was a fascinating and fun conversation. You can see his blog post here:

This got me thinking about where else we could apply ChatGPT to help develop, implement, or process our integration needs. Despite having a few ideas, one quickly stood out, and it somehow follows Mike’s idea: message validation! In other words, a Logic App processes some data, and we use ChatGPT to validate that data for us by describing the schema validation in natural-language text.

So I ended up creating this small sample scenario where we have the following JSON Message:

{ 
    "Company": "SP", 
    "Operation": "Insert", 
    "Request": { 
        "OrderID": 2, 
        "ClientName": "Sandro Pereira", 
        "ShippingAddress": "Pedroso, Portugal", 
        "TaxId": 111222111, 
        "ProductName": "Chocollate Bar", 
        "Quantity": 1 
    } 
} 

And we want to validate the incoming message if it is valid or not based on the following rules:

  • The OrderID must exist and be an Integer.
  • The ClientName cannot be empty or null.
  • The Quantity must exist and be an Integer.
  • The value of the Company field can only be “SP” or “LZ”.
  • The Operation field must exist.

Of course, this task can be accomplished by creating a JSON Schema with all the rules and then validating the message against that schema. However, some of these rules, like the use of regular expressions or patterns, are not natively supported by Logic Apps.
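For comparison, here is one way the five rules above could be checked in code. This is my own sketch in plain Python, not anything from the Logic App or a schema; it just makes explicit what a traditional validation of this message would enforce:

```python
# Check the five business rules from the list above and return
# a list of violations (empty list means the body is valid).
def validate_order(msg):
    errors = []
    req = msg.get("Request", {})
    # The OrderID must exist and be an Integer.
    if not isinstance(req.get("OrderID"), int):
        errors.append("OrderID must exist and be an integer")
    # The ClientName cannot be empty or null.
    if not req.get("ClientName"):
        errors.append("ClientName cannot be empty or null")
    # The Quantity must exist and be an Integer.
    if not isinstance(req.get("Quantity"), int):
        errors.append("Quantity must exist and be an integer")
    # The Company field can only be "SP" or "LZ".
    if msg.get("Company") not in ("SP", "LZ"):
        errors.append('Company must be "SP" or "LZ"')
    # The Operation field must exist.
    if "Operation" not in msg:
        errors.append("Operation must exist")
    return errors

message = {
    "Company": "SP",
    "Operation": "Insert",
    "Request": {"OrderID": 2, "ClientName": "Sandro Pereira", "Quantity": 1},
}

print(validate_order(message))            # [] -> body is valid
print(validate_order({"Company": "XX"}))  # five violations listed
```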

But the main idea of this post is to provide a simplified way to do this task, using natural language to replace the schemas.

You can watch my coworker Luís Rigueira describe the whole process in this video:

So how do we achieve this? 

First, as you already know, you should create a Logic App. It can be Consumption or Standard; we will be using Consumption. Then give it a proper name, because the rule of using proper names from day one never gets old!

And then create a Logic App that has the following structure:

  • When a HTTP request is received trigger.
  • A Compose action.
  • An HTTP action.
  • And finally, a Response action.

Leave the When a HTTP request is received trigger as is. On the Compose action, add the following instructions describing the validation to apply to the message:

Check if the Json Input is valid or invalid by these rules:

The OrderID must exist and be an Integer
The client name can not be empty or null
The Quantity must exist and it is an Integer
Company can only be "SP" or "LZ"
Operation must exist

If it is valid return a message saying "body is valid"
If it is invalid return a message saying the "body is invalid" and explain why.

Json input: @{triggerBody()}

As the JSON input, we dynamically select the Body property from the Trigger, which contains the JSON we send via Postman.

Next, in the HTTP action, we need to call the ChatGPT API to perform (or try to perform) the JSON message validation. To do that, we specify the following body:

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "@{outputs('Compose_-_Instructions_to_be_followed')}"
    }
  ]
}

If you want to know or understand a little bit about this – How to call ChatGPT from a Logic App – see my previous blog post: Using Logic Apps to interact with ChatGPT.

Finally, we need to configure the Response action to use the following expression in the response:

trim(outputs('HTTP_-_Call_Chat_Gpt_API')?['body']?['choices']?[0]?['message']?['content'])
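In Python terms, that expression does roughly the following. The response body below is a hypothetical ChatGPT-style payload, shown only to illustrate the path being read:

```python
# Hypothetical ChatGPT-style response body; the Logic Apps expression
# above reads choices[0].message.content and trims the whitespace.
body = {
    "choices": [
        {"message": {"role": "assistant", "content": " body is valid \n"}}
    ]
}

content = body["choices"][0]["message"]["content"].strip()
print(content)  # body is valid
```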

This way, we only send the message content as a response. Now if you test your logic app with Postman, this should be the result:

If the rules we added to the Compose action are met, we will get a response saying the body is valid. Otherwise, the response says the body is invalid, and ChatGPT explains why.

Of course, at this stage AI may still not be a reliable option for data transformation or data validation, but it shows potential.

Once again, thanks to my team member Luís Rigueira for helping me with these always-crazy scenarios.


Azure Function: JSON Schema Validation

JSON Schema is a declarative language that allows you to annotate and validate JSON documents. It provides a format for describing what JSON data is required for a given application and how to interact with it, and it lets you validate JSON documents to ensure they meet those requirements.

Applying JSON Schemas validation in your solutions will let you enforce consistency and data validity across similar JSON data.

If you are not familiar with JSON Schema, you will notice that a JSON Schema is itself written in a JSON-based format. It is just a declarative format for “describing the structure of other data”. This is both its strength and its weakness (which it shares with other similar schema languages): it is easy to concisely describe the surface structure of data and to automate validating data against it. However, since a JSON Schema cannot contain arbitrary code, there are certain constraints on the relationships between data elements that cannot be expressed. JSON Schema is a proposed IETF standard.

JSON Schema Validation Function

The JSON Schema Validation is a simple Azure Function that allows you to validate your JSON message against a JSON Schema, enabling you to specify constraints on the structure of instance data to ensure it meets the requirements.

The function receives a JSON payload with two properties:

  • The JSON message in the json property.
  • And the JSON Schema in the jsonSchema property.

Example:

{
    "json": {
        "name": "",
        "extension": "xml"
    },
    "jsonSchema": {
        "type": "object",
        "properties": {
            "name": {
                "type": "string",
                "pattern": "^.*[a-zA-Z0-9]+.*$"
            },
            "extension": {
                "type": "string"
            }
        }
    }
}
The function’s output will be:

  • A 200 OK if the JSON message is valid.
  • Or a 400 Bad Request if there are validation errors/issues.
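The pattern rule in the example above is ordinary regular-expression matching: the name must contain at least one alphanumeric character. Here is a quick local sketch of that same check (my illustration, not the function's actual code):

```python
import re

# The "pattern" constraint from the schema above: at least one
# alphanumeric character somewhere in the string.
NAME_PATTERN = re.compile(r"^.*[a-zA-Z0-9]+.*$")

def name_is_valid(name):
    return bool(NAME_PATTERN.match(name))

print(name_is_valid(""))        # False -> the example payload fails
print(name_is_valid("report"))  # True
```

This is why the example payload, which sends an empty name, comes back as a 400 Bad Request.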

Where can I download it?

You can download the complete Azure Functions source code here:


Big thanks to my team member Diogo Formosinho for testing and helping me develop this function!


JSON Validator Tool

Another common task for us developers inside Azure Integration Services, especially inside Logic Apps, is manually creating a JSON message, either inside a Parse JSON action, a Compose action, or directly in the connectors (or any other way). But there is a catch…

If you work in the Azure Portal and create an invalid JSON message inside an action or connector, the editor will not allow you to save the Logic App. Instead, it will give you an error saying that the definition contains invalid parameters.

However, if you are creating the same Logic App Consumption inside Visual Studio:

We can successfully validate this Logic App:

And we can actually successfully deploy this Logic App:

And that will become a problem once we run our Logic App. For this reason, it is always good to guarantee that the JSON message is well formatted before you deploy your business processes.

And yes, I know many online tools exist to perform this task, so why a Windows tool? Again, for the same reasons I described with my previous tools: security and privacy.

I’m becoming a bit obsessive about security. Nothing is free, and the problem with these online tools is that we never know what they are doing behind the scenes. Are you sure they are not keeping logs of the inputs we provide and the resulting outputs? And don’t say, “But Sandro, this is just a simple message.” Many messages carry sensitive (private) information from users or companies that you are sometimes not aware of, so it is better to play safe than sorry. It is wise to be careful now so that problems do not occur later, and to protect yourself against risk rather than be careless.

JSON Validator Tool

JSON Validator Tool is a lightweight Windows tool that allows you to validate and reformat a JSON message.

To avoid raising the same suspicions about this tool, the source code is available on GitHub!

Download


Credits

  • Diogo Formosinho | Member of my team and one of the people responsible for developing this tool.

BizTalk Mapper Extensions UtilityPack: DateTime Functoids for BizTalk Server 2020

Today, almost 3 years after the last update to the package, I added a new suite of functoids to the BizTalk Mapper Extensions UtilityPack project available for BizTalk Server 2020: DateTime Functoids.

DateTime Functoids

This library includes a suite of functoids to perform several DateTime operations that you can use inside the BizTalk mapper.

This project, for now, only contains a single custom Functoid:

  • Get Current Date Functoid: This functoid allows you to get the current date and/or time in a specific format.
    • This functoid requires one input parameter:
a date, time, or DateTime format string;
    • Examples:
      • Input yyyyMMdd >>> Output = 20230526
      • Input HHmm >>> Output = 1519
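The functoid works with .NET format strings. Purely for illustration, here is a rough Python equivalent of the two examples above, using the closest strftime codes (this is not the functoid's actual implementation):

```python
from datetime import datetime

# .NET "yyyyMMdd" roughly maps to strftime "%Y%m%d",
# and "HHmm" to "%H%M". A fixed instant keeps the demo reproducible.
now = datetime(2023, 5, 26, 15, 19)
print(now.strftime("%Y%m%d"))  # 20230526
print(now.strftime("%H%M"))    # 1519
```

In the real functoid, the current date/time would be used instead of a fixed value.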

BizTalk Mapper Extensions UtilityPack

BizTalk Mapper Extensions UtilityPack is a set of libraries with several useful functoids to include and use in a map, providing an extension of the BizTalk Mapper capabilities.

BizTalk Mapper Extensions UtilityPack for BizTalk Server 2020

Where can I download it?

You can download the complete source code here:


PowerShell script to download a specific version of WinSCP

Today I was helping a BizTalk Server customer migrate their process from the FTP adapter to the SFTP adapter. If you are familiar with the BizTalk SFTP adapter, you will be aware of the painful process of choosing the correct version of WinSCP and what we need to do to make it work correctly with BizTalk Server, which in general is quite simple:

  • Download WinSCP and the .NET library, ensuring you get the correct version!
  • Copy the .exe and .dll to the BizTalk installation folder.
  • DO NOT GAC anything. If you GAC the .NET library, it will not work, because it expects WinSCP.exe to be in the same path; that is why they both go into the BizTalk installation folder.

However, the biggest issue is: what is the correct WinSCP version for my version of BizTalk Server 2016 or 2020?

For that reason, Thomas E. Canter, in his days as a Phidiax consultant, decided to create the fantastic PowerShell script BizTalk WinSCP Installer, which over the years has evolved and been improved by several people, including Michael Stephenson, Nicolas Blatter, Niclas Öberg, and myself.

If your environment has access to the internet, I recommend you use that script to install WinSCP! However, my client didn’t have internet access from the production server, and to complicate things a little, the environments didn’t share the same build version. The test environment had BizTalk Server 2016 with Feature Pack 3 and Cumulative Update 9, but production had BizTalk Server 2016 with Feature Pack 3 and Cumulative Update 5. That meant we couldn’t copy the files from another environment to production; we had to use a different WinSCP version.

To address this scenario, I created a simple PowerShell version in which you can choose the version you want to download, and which you can run on any machine with internet access without checking the build version of your environment.

Here is a short excerpt of the PowerShell script:

$checkExeExists = Test-Path $targetNugetExe
if(-not $checkExeExists)
{
    if ($PSCmdlet.ShouldProcess("$sourceNugetExe -OutFile $targetNugetExe", "Run Invoke-WebRequest ")) {
        Invoke-WebRequest $sourceNugetExe -OutFile $targetNugetExe
        $targetNugetExeExists = Test-Path $targetNugetExe
        if (-not $targetNugetExeExists) {
            $Continue = $false
            Write-Error "`n$bangString";
            Write-Error "The download of the Nuget EXE from";
            Write-Error $sourceNugetExe;
            Write-Error "did not succeed";
            Write-Error "$bangString";
        }
        else{
            Write-Success "nuget.exe downloaded successfully."
        }
    }
}

if ($PSCmdlet.ShouldProcess("$getWinSCP", "Run Command")) {
    Invoke-Expression "& $getWinSCP";
    $WinSCPEXEExists = Test-Path $WinSCPEXEDownload
    $WinSCPDLLExists = Test-Path $WinSCPDllDownload
    if (-not $WinSCPDLLExists) {
        $Continue = $false
        Write-Error "`n$bangString";
        Write-Error "WinSCP $winSCPVersion was not properly downloaded.";
        Write-Error "Check the folder and error messages above:";
        Write-Error "$nugetDownloadFolder";
        Write-Error "And determine what files did download or did not download.";
        Write-Error "$bangString";
    }
    else{
        Write-Success "WinSCP $winSCPVersion was properly downloaded."
    }
}

THESE POWERSHELL SCRIPTS ARE PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.

Where can I download it?

You can download the complete PowerShell script here:


Azure Function to read key values from Azure App Configuration

Almost 3 years ago, I wrote a blog post about How to get Key-values from Azure App Configuration within Logic Apps, and it is still valid today, since we still don’t have an App Configuration connector available out of the box. But I realized that I had never blogged about this resource itself.

There are many ways to store application configurations in Azure. At the top of the list is Key Vault, which is ideal for storing secrets, like passwords, that only a limited number of people should be able to see and modify.

However, not all configurations should need that tremendous restricted access. Neither all configurations are secrets. Of course, you can also use Key Vault to store non-secret information. But there are other options available like:

  • Using a SQL database with an app configuration table to store key values.
  • Depending on the services you are using, you can also use the built-in application settings. For example, Azure Functions has its own:

But you can also make use of the Azure App Configuration service.

Azure App Configuration stores configuration data as key-values. Key-values are a flexible and straightforward representation of application settings that let you change configuration dynamically, without redeploying your application when changes are required. However, values are not treated as secrets the way Key Vault entries are, so it is a good solution for storing non-sensitive data that you want to keep dynamic in your application.

By combining Azure App Configuration to store and manage your configurations with Azure Key Vault to store your secrets, you get powerful configuration management nearly for free.
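One way the two services combine (not covered in the original post, but documented by Azure): App Configuration can hold Key Vault references, entries whose value is a small JSON payload pointing at a secret URI rather than the value itself. A minimal Python sketch of spotting such a reference, assuming the documented content type:

```python
import json

# Content type Azure App Configuration uses for Key Vault references
# (per Azure documentation; verify against your own store).
KV_REF_CONTENT_TYPE = "application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8"

def resolve_setting(value: str, content_type: str) -> tuple[str, bool]:
    """Return (value_or_secret_uri, is_key_vault_reference).

    A Key Vault reference stores JSON like
    {"uri": "https://myvault.vault.azure.net/secrets/SqlPassword"}
    instead of the plain configuration value.
    """
    if content_type == KV_REF_CONTENT_TYPE:
        return json.loads(value)["uri"], True
    return value, False

uri, is_ref = resolve_setting(
    '{"uri": "https://myvault.vault.azure.net/secrets/SqlPassword"}',
    KV_REF_CONTENT_TYPE,
)
```

When the flag is true, the caller would then fetch the actual secret from Key Vault using the returned URI; plain values are used as-is.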

App Configuration makes it easier to implement the following scenarios:

  • Centralize management and distribution of hierarchical configuration data for different environments and geographies
  • Dynamically change application settings without the need to redeploy or restart an application
  • Control feature availability in real-time

The only problem is that, unlike Key Vault, which has a connector available for use inside Logic Apps, App Configuration does not have one.

Get Azure App Configuration Value Function

This Function App is intended to close that gap, letting you read App Configuration values from inside Logic Apps (or any other resource).

You can download the complete code for the function from GitHub; the link is at the bottom of this post. Here is a small code snippet:

public static class GetAzureAppConfigurationValue
{
    [FunctionName("GetAzureAppConfigurationValue")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
        ILogger log)
    {
        string appKey = req.Query["appKey"];

        // Validate input (the elided section likely did something similar).
        if (string.IsNullOrEmpty(appKey))
        {
            return new BadRequestObjectResult("Please pass an 'appKey' on the query string.");
        }

        try
        {
            // The App Configuration connection string is read from the
            // Function App settings ("AppConfigConnectionString" is an
            // assumed setting name; the elided code may differ).
            string connectionString =
                Environment.GetEnvironmentVariable("AppConfigConnectionString");

            var builder = new ConfigurationBuilder();
            builder.AddAzureAppConfiguration(connectionString);
            var build = builder.Build();

            string keyValue = build[appKey];

            if (string.IsNullOrEmpty(keyValue))
            {
                return new NotFoundObjectResult($"Key '{appKey}' was not found.");
            }

            return new OkObjectResult(keyValue);
        }
        catch (Exception ex)
        {
            var result = new ObjectResult(ex.Message);
            result.StatusCode = StatusCodes.Status500InternalServerError;
            return result;
        }
    }
}

This function requires you to pass the appKey query parameter:

  • https://URL?appKey=’’
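As a usage sketch (the host name and function key below are placeholders, not from the post), the request URL can be assembled like this:

```python
from urllib.parse import urlencode

def build_request_url(base_url: str, app_key: str, function_key: str) -> str:
    """Assemble the URL to call the GetAzureAppConfigurationValue function,
    passing the appKey query parameter (plus the function's access key,
    since the trigger uses AuthorizationLevel.Function)."""
    query = urlencode({"code": function_key, "appKey": app_key})
    return f"{base_url}/api/GetAzureAppConfigurationValue?{query}"

url = build_request_url(
    "https://myfuncapp.azurewebsites.net",  # placeholder host
    "MyApp:Sql:Timeout",                    # placeholder key name
    "<function-key>",
)
```

Note that urlencode takes care of escaping characters like ":" in hierarchical key names, which is easy to get wrong when building the URL by hand in a Logic App HTTP action.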

Where can I download it?

You can download the complete Azure Function source code here:


Note to myself: How to change the language in the Azure Portal

Note to myself: How to change the language in the Azure Portal

This is just another post for the sake of my mental sanity, because I hate that Microsoft assumes, based on my location, that I want the Azure Portal to use my beloved Portuguese as its default language!

Don’t get me wrong, I’m proud to be Portuguese, and I do love my native language, but technically speaking, I hate translating technical names into Portuguese, for several reasons:

  • Sometimes direct translations do not work properly, and in some cases, the translated names/concepts are strange:
    • A queue (Service Bus) in Portuguese is “Fila”, but “Fila” in English can be a row, a line, or indeed a queue, and this is one of the simple cases.
  • But most importantly, if I try or need to search for documentation or issues regarding some services, most of the resources will be in English.
  • Also, because I have multiple clients across the world, speaking with them and sharing my screen is easier if everything is in English.

I usually have all my portals and tools in English, but every now and then, I need to create a new browser profile, and there you go: I end up with the Azure Portal in Portuguese, fighting to remember where to change the language, which, by the way, is quite simple to do :).

To change the language settings in the Azure portal:

  • Click the Settings menu in the global page header.
  • Click the Language & region tab. Of course, depending on the current language, you will get a different name. Mine is “Idioma + região” (Portuguese).
  • Use the drop-downs to choose your preferred language and regional format settings.
  • Click Apply (in my case, “Aplicar”) to update your language and regional format settings.
