A few days ago, Luis Rigueira created a Kusto query to identify all Logic Apps Consumption that use SQL V1 actions and triggers, which will soon be deprecated (end of March 2024). The query also tries to identify all the actions that use those operations. It actually works decently if a Logic App has actions or triggers at the first level (not inside If, Switch, Scope, and so on), but if we have a Logic App with nested actions, which is quite common, then the query only identifies those actions on a best-effort basis: it will identify all the Logic Apps, but it will not provide the names of the actions. The reason is that it is quite difficult to loop through the whole Logic App definition (JSON) with a Kusto query.
Don’t get me wrong: that query is awesome for identifying all the Logic Apps we need to address to fix those actions. But if we need to identify all the actions with that “problem” to better estimate the work involved, then that Kusto query is not the best option. To solve this problem, we decided to create a PowerShell script that not only identifies all Logic Apps Consumption using SQL V1 actions and triggers but also identifies the names of those actions and triggers for you, providing a good report that you can use to plan and estimate your work.
# Requires the Az PowerShell modules (Az.Accounts, Az.Resources) and an
# authenticated session (Connect-AzAccount) against the target subscription.

# Function to extract SQL V1 actions and triggers recursively
function Get-ActionsAndTriggers {
    param (
        $node
    )
    $actionsAndTriggers = @()
    foreach ($key in $node.psobject.Properties.Name) {
        if ($node.$key.type -eq "ApiConnection") {
            # SQL V1 operations use the "/datasets/default" path on a SQL API connection
            if ($node.$key.inputs.path -like "*/datasets/default*" -and $node.$key.inputs.host.connection.name -like "*sql*") {
                $actionsAndTriggers += $key
            }
        } elseif ($node.$key -is [System.Management.Automation.PSCustomObject]) {
            # Recurse into nested structures (If, Switch, Scope, Foreach, Until, and so on)
            $actionsAndTriggers += Get-ActionsAndTriggers -node $node.$key
        }
    }
    return $actionsAndTriggers
}

# Retrieve all Logic Apps within the current subscription
$logicApps = Get-AzResource -ResourceType Microsoft.Logic/workflows

# Iterate through each Logic App and extract actions and triggers
foreach ($logicApp in $logicApps) {
    # Retrieve the Logic App definition
    $logicAppDefinition = Get-AzResource -ResourceId $logicApp.ResourceId -ExpandProperties

    # Extract actions and triggers from the Logic App definition
    # (@() forces an array even when only a single match is returned)
    $allActionsAndTriggers = @(Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.triggers)
    $allActionsAndTriggers += Get-ActionsAndTriggers -node $logicAppDefinition.Properties.Definition.actions

    # Display the Logic App name if filtered actions and triggers were found
    if ($allActionsAndTriggers.Count -gt 0) {
        Write-Host "Logic App: $($logicApp.Name) - RG: $($logicApp.ResourceGroupName)" -ForegroundColor Red

        # Display the list of filtered actions and triggers
        Write-Host "Filtered Actions and Triggers:"
        $allActionsAndTriggers
        Write-Host ""
    }
}
The great thing about this script is that it also identifies all nested actions (actions inside other actions like If, Scope, Switch, and so on).
If you are wondering whether we can do the same and identify all Logic Apps using SQL V2 actions and triggers: don’t worry, we have you covered; check the download section. In fact, with small changes, you can use this script for all types of connectors, as sketched below.
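As a hedged sketch of that adaptation, only the filter inside Get-ActionsAndTriggers needs to change. Here I am assuming that the SQL V2 operations expose a “/v2/datasets/…” path in their inputs (verify this against your own workflow definitions), and that targeting other connectors is just a matter of adjusting the pattern on the connection name:

# Variation of the filter inside Get-ActionsAndTriggers
# Assumption: SQL V2 operations use a "/v2/datasets/..." path; for other connectors,
# adjust the connection name pattern (e.g. -like "*servicebus*")
if ($node.$key.inputs.path -like "*/v2/datasets/*" -and $node.$key.inputs.host.connection.name -like "*sql*") {
    $actionsAndTriggers += $key
}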
Download
THESE COMPONENTS ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND.
You can download the PowerShell script to identify all SQL V1 actions and triggers inside Logic Apps Consumption from GitHub here:
If you want to identify SQL V2 actions and triggers, then download this script from GitHub:
Huge thanks to Luis Rigueira for working with me on these scripts.
Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can help us buy a Star Wars Lego for my son!
Author: Sandro Pereira
Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing integration scenarios both on-premises and in the cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server, and technologies such as AS2, EDI, RosettaNet, SAP, and TIBCO.
He is a regular blogger, international speaker, and technical reviewer of several BizTalk books, all focused on integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.
View all posts by Sandro Pereira
Last week, we posted a Logic App Best Practices, Tips, and Tricks article about the ability to resubmit multiple runs at once and how that can be a tedious and sometimes complicated process. Luckily for us, new features have appeared recently that somewhat minimize that pain, although they can still be a little confusing.
At the time we were investigating those capabilities, we were finalizing the development of a tool to achieve that goal. Since this tool had already been developed, we decided to make it available.
This is a simple .NET Windows application that allows you to easily resubmit multiple Logic App Consumption runs at once.
To achieve that, you need to be already authenticated on your Azure Portal, and you need to provide the following parameters:
Logic App name;
Resource Group;
and Subscription ID;
Optionally, you can select a DateTime range to filter the failed runs you need to resume.
If you don’t select a range, all failed runs, up to a maximum of 250, will be presented.
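For reference, the underlying Azure management calls that such a tool relies on can also be scripted. Here is a minimal, hedged PowerShell sketch of the same idea (listing the failed runs and resubmitting each one through its trigger history), assuming the Az module, an already authenticated session, and placeholder values for the three parameters above:

# Placeholders - replace with your own values
$subscriptionId = "<subscription-id>"
$resourceGroup  = "<resource-group>"
$logicAppName   = "<logic-app-name>"

# List the failed runs of the Logic App (Consumption)
$runsUri = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Logic/workflows/$logicAppName/runs?api-version=2016-06-01&`$filter=status eq 'Failed'"
$failedRuns = (Invoke-AzRestMethod -Method GET -Path $runsUri).Content | ConvertFrom-Json

# Resubmit each failed run through its trigger history (the run name doubles as the history name)
foreach ($run in $failedRuns.value) {
    $triggerName = $run.properties.trigger.name
    $resubmitUri = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Logic/workflows/$logicAppName/triggers/$triggerName/histories/$($run.name)/resubmit?api-version=2016-06-01"
    Invoke-AzRestMethod -Method POST -Path $resubmitUri | Out-Null
}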
Where can I download it?
You can download the complete Logic App Consumption Bulk Failed Runs Resubmit tool source code here:
Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more, you can buy (or help me buy) my son a Star Wars Lego!
Thanks to my team member Luís Rigueira for being the mentor of this idea.
Unfortunately, no Logic App connector can bridge to RabbitMQ, which makes this integration challenge a little more complicated. However, we can create an Azure Function using the RabbitMQ trigger for Azure Functions to overcome this limitation.
As we saw and explained in our last blog post, Azure Functions integrates with RabbitMQ via triggers and bindings. The Azure Functions RabbitMQ extension allows you to send and receive messages using the RabbitMQ API with Functions.
The purpose of this video is to explain how to create a POC that receives a message in a RabbitMQ queue, with that event triggering an Azure Function that then routes the message to a Logic App.
This was a real problem presented by a client during one of our Logic Apps training courses: they have RabbitMQ on-premises, and they wanted to pull messages from a queue into a Logic App Consumption to integrate them with other systems.
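Conceptually, the routing step that the Azure Function performs is just an HTTP POST of the RabbitMQ message body to the Logic App’s Request trigger. As a hedged illustration (the callback URL and the message content below are placeholders), the same call could be made from PowerShell like this:

# Placeholder: copy the callback URL from the "When a HTTP request is received" trigger
$logicAppCallbackUrl = "<logic-app-request-trigger-callback-url>"

# Illustrative message body, as it could arrive from the RabbitMQ queue
$message = @{ orderId = 123; customer = "Contoso" } | ConvertTo-Json

# Route the message to the Logic App (conceptually, what the Azure Function does with each queue message)
Invoke-RestMethod -Method Post -Uri $logicAppCallbackUrl -Body $message -ContentType "application/json"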
Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
Big thanks to my team member Luís Rigueira for creating this video.
An Integration Account allows you to build Logic Apps with enterprise B2B capabilities by adding various necessary artifacts. It serves as a central repository for managing various integration assets such as schemas, maps, certificates, and trading partner agreements.
While Logic App Standard nowadays natively supports Schemas and Maps (without the need for an Integration Account) and includes a new transformation editor called Data Mapper (still in preview), Logic App Consumption still requires us to use the Integration Account and still uses the “old”, BizTalk Server-like Mapper.
Prerequisites
So, for us to create, in our development environment, Schemas and Maps for Logic App Consumption to be used inside an Integration Account, we need to install the Azure Logic Apps Enterprise Integration Tools extension for Visual Studio 2019 – unfortunately, there is no support for more recent versions of Visual Studio. To do that, we need to:
Download and Install the extension from the Visual Studio Marketplace:
Or install it directly on Visual Studio by:
Open Visual Studio 2019, and on the Extensions menu, select the option Manage Extensions.
Search for Logic App, and then, from the list, select to download and install the Azure Logic Apps Enterprise Integration Tools.
You will probably need to restart Visual Studio.
Create an Integration Account Project
Now that we have installed everything that we need to create a new Integration Account Project, we need to:
Open Visual Studio 2019 and on the What would you like to do? window select the Create a new project option.
On the Create a new project window, search for Integration Account, and from the list below, select the Integration Account template, then click Next.
On the Configure your new project window, do the following configurations and then click Create:
On the Project name property, set a proper name for your project.
On the Location property, set the path where you want to create the project.
On the Solution name property, set a proper name for your solution.
Note that a solution is a container for one or more projects in Visual Studio.
After that, a new Integration Account project is created where you can create your Schemas, Flat File Schemas, and Maps. To do that, you just need to:
Right-click on the project name and then select the option Add > New Item…
On the Add New Item window, on the left tree, select the option Logic Apps, and all the possible artifacts for you to create will be presented.
Select the type.
Give it a proper name.
And click Add.
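Once the Schemas and Maps are built in this project, they eventually need to be uploaded to the Integration Account in Azure. You can do that through the Azure Portal or, as a hedged sketch using the Az.LogicApp PowerShell cmdlets (the resource names, artifact names, and file paths below are placeholders), from the command line:

# Placeholders - adjust to your environment
$resourceGroup      = "<resource-group>"
$integrationAccount = "<integration-account-name>"

# Upload an XML schema (.xsd) to the Integration Account
New-AzIntegrationAccountSchema -ResourceGroupName $resourceGroup -Name $integrationAccount `
    -SchemaName "Order" -SchemaType "Xml" -SchemaFilePath "C:\Artifacts\Order.xsd"

# Upload an XSLT map to the Integration Account
New-AzIntegrationAccountMap -ResourceGroupName $resourceGroup -Name $integrationAccount `
    -MapName "Order_to_Invoice" -MapType "Xslt" -MapFilePath "C:\Artifacts\Order_to_Invoice.xslt"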
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
I have often written about how to handle exceptions and get the correct and detailed error message describing, in an easy and clean manner, the failures in our processes:
And the reason is that we need to accept that, at some point in time, our processes will fail: because we made updates that weren’t properly tested, because an external system is down or under maintenance, because of issues in Microsoft’s infrastructure, or for many other reasons. When that happens, one thing is clear: we want to get the exact error message describing why it failed. Of course, we can always go to the Logic App run history – except if it is a Stateless workflow on Standard – to check where and why it failed, and there we will have all the information necessary to solve the problem. However, most of the time, if not always, we want to log the error somewhere like Application Insights, a SQL database, or Event Hubs in an automated way, without human intervention. To accomplish that, we need to access this error message at runtime.
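As a side note, that same run history information is also reachable programmatically, which already helps with basic automated monitoring. A hedged sketch using the Az.LogicApp cmdlets (the resource names below are placeholders, and the exact property names on the returned objects may vary slightly between module versions):

# Placeholders - adjust to your environment
$resourceGroup = "<resource-group>"
$logicAppName  = "<logic-app-name>"

# Take one failed run of the Logic App (Consumption)
$failedRun = Get-AzLogicAppRunHistory -ResourceGroupName $resourceGroup -Name $logicAppName |
    Where-Object { $_.Status -eq "Failed" } | Select-Object -First 1

# List the actions of that run and show the ones that failed, with their error details
Get-AzLogicAppRunAction -ResourceGroupName $resourceGroup -Name $logicAppName -RunName $failedRun.Name |
    Where-Object { $_.Status -eq "Failed" } |
    Select-Object Name, Status, Code, Error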
By default, Logic Apps allow handling errors using the Configure run after settings at the action level. For more complex scenarios, it can be done by setting up a Scope action and implementing try-catch/try-catch-finally statements. However, getting a detailed error message can be a bit challenging inside a Logic App, and you will find different approaches to implement the same capabilities, each with advantages and disadvantages.
In this whitepaper, we will be addressing some of the possible ways to meet these needs, as well as some of the following questions:
How can I implement error handling?
This is an important question and topic because, without proper error handling, we will not be able to access the error message in an automated way; instead, our workflow will simply finish on that action with a failure state.
What out-of-the-box features do I have to capture and access the error details?
How can I improve these default features in a no-code, low-code way and in a code-first approach?
Where can I download it?
You can download the whitepaper here:
I’ve been wanting to explore the integration capabilities with DocuSign for some time now, but for some reason, I kept putting it off, either because I didn’t have free time available or because I was busy with other things. Until now!
And one of the reasons I want to explore these capabilities is that nowadays we have people joining DevScope from all over the place, and we follow a welcome process where they need to sign some documents, such as the code of conduct, work ethics, the welcome manual, and so on. Do these really need to be signed manually these days? Does that mean new remote employees have to print the papers, sign them, scan them, and send them back? Is there a better way?
But first, what is DocuSign? And how can we use it?
DocuSign is a cloud-based electronic signature platform that allows users to sign, send, and manage electronic documents securely. It is a software service that enables users to sign documents electronically and send them for signature to others quickly and efficiently. The platform is designed to be easy to use and accessible from anywhere, on any device.
DocuSign is used by individuals, businesses, and organizations of all sizes across many industries, including finance, real estate, healthcare, and more. The platform streamlines the document signing process and eliminates the need for physical documents and manual signatures, which can be time-consuming and costly. It also offers additional features like document storage, workflow automation, and advanced analytics. Overall, DocuSign helps to simplify the document signing process, increase efficiency, and reduce costs.
So, what we need to do first is to create a 30-day free account (No credit card required) on DocuSign and understand how DocuSign works.
And then, you just need to follow the steps and set up your name and your password.
The second thing we will do is create a document template. For that, we need to:
On the top menu, click on the Templates option and then click Create a template.
On the Create a template page, we need to:
On the Template name option, provide a name to your template, for example, Default Service Exchange Agreement.
On the Template description option, provide a short description of your document template.
On the Add documents, select Upload or drag and drop the document to that area.
Click Save and Close.
So, knowing this, let’s proceed to the Logic App. Here we will be creating a Logic App Consumption with the name LA-SendEmailToSignWithDocuSign-POC. To simplify the process, I’m going to abstract the source system or application that triggers the Logic App to send the document to be signed. That could be anything, but in our case, we will be using a Request – When a HTTP request is received trigger. To do that, we need to:
On the Search connectors and triggers box, select the Built-in and then Request connector, and then the When a HTTP request is received trigger.
On the When a HTTP request is received trigger, provide the following Body JSON Schema.
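Based on the field names used later in this post (ExternalName, ExternalEmailAddress, InternalName, InternalEmailAddress), the Body JSON Schema would look something like this (an illustrative sketch, not necessarily the exact schema used in the original setup):

{
    "type": "object",
    "properties": {
        "ExternalName": { "type": "string" },
        "ExternalEmailAddress": { "type": "string" },
        "InternalName": { "type": "string" },
        "InternalEmailAddress": { "type": "string" }
    }
}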
Click on + New step to add a new action. From the search textbox, type DocuSign and then select the DocuSign connector.
The first action we need to take is to Sign in with the account we created earlier in this tutorial.
Once you sign in, we need to click on + New step to add a new action. From the search textbox, type DocuSign and then select the DocuSign – Create envelope using Template action.
Note: In DocuSign, an “envelope” refers to a single package of one or more documents that are sent for signature to one or more recipients. It is the equivalent of a physical envelope that contains all the necessary documents for signature.
On the Create envelope using Template action, we need to:
On the Account property from the combo box, select the account to use.
On the Template property from the combo box, select the template to use, in our case, Default Service Exchange Agreement.
On the Envelope Status property from the combo box, select the Created option.
Click on + New step to add a new action. From the search textbox, type DocuSign and then select DocuSign – Add recipient to an envelope (V2) action.
On the Add recipient to an envelope (V2) action, we need to:
On the Account property from the combo box, select the account to use.
On the Envelope property, select the value to be the Envelope Id from the Create envelope using Template action.
On the Recipient type property from the combo box, select the Needs to Sign option.
Click on Add new parameter and select the Signer name, Signer email, and Signing order properties.
On the Signing order property, type 1.
On the Signer name property, select the value to be the ExternalName from the When a HTTP request is received trigger.
On the Signer email property, select the value to be the ExternalEmailAddress from the When a HTTP request is received trigger.
Add the same action once again, but this time with the following configurations:
On the Account property from the combo box, select the account to use.
On the Envelope property, select the value to be the Envelope Id from the Create envelope using Template action.
On the Recipient type property from the combo box, select the Needs to Sign option.
Click on Add new parameter and select the Signer name, Signer email, and Signing order properties.
On the Signing order property, type 2.
On the Signer name property, select the value to be the InternalName from the When a HTTP request is received trigger.
On the Signer email property, select the value to be the InternalEmailAddress from the When a HTTP request is received trigger.
To finalize, we need to click on + New step to add a new action. From the search textbox, type DocuSign and then select DocuSign – Send envelope action.
On the Send envelope action, we need to:
On the Account property from the combo box, select the account to use.
On the Envelope property, select the value to be the Envelope Id from the Create envelope using Template action.
And go ahead and Save your workflow.
Now, if we go ahead and test our Logic App with Postman using the below payload:
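An illustrative payload matching the schema above (the email addresses are placeholders):

{
    "ExternalName": "Luis Rigueira",
    "ExternalEmailAddress": "luis.rigueira@example.com",
    "InternalName": "Sandro Pereira",
    "InternalEmailAddress": "sandro.pereira@example.com"
}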
We will see that our Logic App was correctly processed.
That means that:
Luis Rigueira, who is the first recipient, gets an email from DocuSign with a document to sign.
And only when Luis signs the document do I, as the second recipient, receive the email to sign.
Note: I don’t need to implement this logic in my Logic App workflow since DocuSign will handle that internal logic.
Once both sign, we will also receive an email notification that the document is signed by both parties.
Note: once again, I do not need to implement that logic since DocuSign will handle that for me.
Let me say, I was surprised by how excellent and efficient DocuSign is! Quite simple to use, and truly a great experience.
Now, this is the part where I need your imagination, because we are probably not describing a real-case scenario here; we are creating a proof of concept. Nevertheless, I think this can indeed be a good approach for a real-case implementation! In most cases, the trigger of the Logic App could fire when a user is created in Active Directory or in CRM, from a SharePoint list, or from an onboarding application. Well, the sky is the limit for your imagination.
Credits
Once again, a big thanks to my team member Luis Rigueira for participating in this proof-of-concept!
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
Continuous Integration and Continuous Deployment (CI/CD) is a practice that has become an essential aspect of Azure development. Although it is possible to execute each of the CI/CD pipeline steps manually, the actual value can be achieved only through automation.
And to improve software delivery using CI/CD pipelines, either a DevOps or a Site Reliability Engineering (SRE) approach is highly recommended.
In this whitepaper, Pedro Almeida and I will demonstrate how you can use Azure DevOps Pipelines to implement CI/CD based on Logic Apps (consumption).
We will explain it all in detail, from creating a project in Azure DevOps and provisioning a Logic App Consumption to configuring the built Logic App for CI/CD.
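For context, deploying a Logic App (Consumption) from such a pipeline ultimately comes down to an ARM template deployment of the files produced by the Visual Studio project, which the pipeline typically runs through an ARM deployment task or an Azure PowerShell step. A minimal, hedged sketch of that deployment step (the resource group and file names are placeholders):

# Placeholders - the template and parameter files come from the Visual Studio Logic App project
$resourceGroup = "<resource-group>"

New-AzResourceGroupDeployment -ResourceGroupName $resourceGroup `
    -TemplateFile ".\LogicApp.json" `
    -TemplateParameterFile ".\LogicApp.parameters.json"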
What’s in store for you?
This whitepaper will give you a detailed understanding of the following:
An introduction to:
What is a CI/CD Pipeline?
What are CI/CD Pipelines?
What is Azure DevOps?
Create an organization or project collection in Azure DevOps
Create a project in Azure DevOps
Building a Logic App (Consumption) from scratch
Setting up the Visual Studio Logic App (Consumption) project for CI/CD
A step-by-step approach to building Azure Pipelines