Sometimes we need a kind of “fire and forget” pattern in Logic Apps. Today’s post is a short one, but a very useful one.
Usually, a Logic App will have a synchronous pattern, meaning you call it and you will have to wait for it to finish processing.
But how do we configure our LA to receive a request and continue processing without us having to wait for it?
It’s quite simple actually, although not a very pretty thing to do.
The way to achieve this is to add a Response action right after the trigger and, in its settings, set “Asynchronous Response” to true. It’s not pretty, as I’ve said, but it sets the path for the async pattern we’re looking for.
Ideally, there would be a flag you could set on the trigger itself to automate this and send back such a response, but so far, this feature is not available.
The response will be sent to the calling system, whatever it is, with the status code 202 Accepted.
You can also set custom headers and a body, although in a fire-and-forget scenario they might not help much.
As you can see, the Response will automatically set a Location header for you to “ping” to check the run status. By default, the engine will refresh it every 20 seconds.
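For reference, here is roughly what the Response action looks like in code view once the setting is enabled; the designer toggle translates to the "operationOptions": "Asynchronous" property (the action name is just illustrative):

<pre class="wp-block-preformatted">
"Response": {
    "type": "Response",
    "kind": "Http",
    "inputs": {
        "statusCode": 202
    },
    "operationOptions": "Asynchronous",
    "runAfter": {}
}
</pre>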
So that’s all there is to it, it’s a simple way to achieve an asynchronous pattern with Logic Apps, although not very pretty, but it works!
Do you find it difficult to keep up to date on all the frequent updates and announcements in the Microsoft Integration platform and Azure iPaaS?
Integration weekly updates can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust and scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.
As part of any development guideline, it’s always a good idea to have a fallback plan and handle errors.
You can be 99.999% confident that your code won’t fail, but that 0.001% chance happens. “Anything that can go wrong will go wrong” – Murphy’s Law
And so, we resort to our very dear friend, Try-Catch.
In Logic Apps, it’s not exactly out-of-the-box functionality, but it’s actually quite simple to achieve in a few steps. Also, there are multiple ways to catch your errors.
In this post, we will try two approaches: using a For-Each loop, and using a Filter Array action.
Since I started developing Logic Apps, I’ve used the For-Each loop approach, but it had some flaws. It involved using a Parse JSON action to catch only the error message, but not all actions share the same schema.
So, the idea of the Filter Array came into play. It’s actually quite easy as well, and easy to maintain. You’ll find the same issue with the schemas, but it’s a faster approach.
Let’s dig in. I started by building a simple Logic App, just creating a couple of variables and an HTTP call that I know will fail. I mocked the results to ensure the outcome is what I needed.
I’ve also built a scope for the Try block and a second scope to handle the Catch block. You’ll have to set the Catch scope’s “Run after” properties to trigger only on errors, skips or timeouts; if not, it will also run on success. The “Run after” setting always relates to the previous scope.
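In code view, the Catch scope’s “Run after” configuration looks something like this (the scope names “Try” and “Catch” are illustrative, and I’ll reuse them in the sketches below):

<pre class="wp-block-preformatted">
"Catch": {
    "type": "Scope",
    "actions": {},
    "runAfter": {
        "Try": [
            "Failed",
            "Skipped",
            "TimedOut"
        ]
    }
}
</pre>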
The For-Each loop approach
Now we start to build our Error handler. I’ve chosen the For Each loop first because it was faster to create since I’m used to it and even have some templates for it as well.
The For Each action requires an array to iterate over, which means we need to find one. The Scope isn’t a loop, so what can we relate it to?
Well, the scope might not be, but there are N actions inside it, so if you search the expression box or the documentation, you’ll find the “result” expression, which returns the result of every action contained within a given scope.
Now, remember, this expression needs to point to the action you want, but you won’t find it in the Dynamic Content; you have to write it yourself, replacing spaces with underscores, because the expression references the JSON node name, just as if you were working in code view.
Once you have this set, you just need to create a condition to check if the status of the action was “Failed“. Pretty simple.
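Putting the pieces together, a minimal code-view sketch of this approach could look like this (assuming the Try scope is named “Try” and the error details go into a string variable called “ErrorMessage”):

<pre class="wp-block-preformatted">
"For_each": {
    "type": "Foreach",
    "foreach": "@result('Try')",
    "actions": {
        "Condition": {
            "type": "If",
            "expression": "@equals(item()?['status'], 'Failed')",
            "actions": {
                "Append_to_error": {
                    "type": "AppendToStringVariable",
                    "inputs": {
                        "name": "ErrorMessage",
                        "value": "@{item()?['outputs']}"
                    },
                    "runAfter": {}
                }
            },
            "runAfter": {}
        }
    },
    "runAfter": {}
}
</pre>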
If you test the execution, you’ll see that the loop is working, iterating over the set of action results that the “result” expression returns.
I’m just returning the action outputs in the error string, which will contain the StatusCode, Headers and Body. It should help to diagnose a possible error.
Let’s try the Filter array now.
The Filter Array action approach
Similar to the For Each, we need to iterate through an action that contains child actions. We use the same “result” expression pointing to the same scope as the “From” property and choose “item()?['status']” as the node to filter on. Also, we only want the failed actions, so that node should be equal to “Failed”.
As for the error message, it’s a bit different from the For Each approach. We’re still picking up the Outputs, but we need to get the first item from the Filter Array action’s output.
The end result should be the same, as we’re picking up the same info as with the For Each loop.
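Here is a minimal code-view sketch of this approach, under the same assumptions as before (a scope named “Try” and an “ErrorMessage” variable):

<pre class="wp-block-preformatted">
"Filter_array": {
    "type": "Query",
    "inputs": {
        "from": "@result('Try')",
        "where": "@equals(item()?['status'], 'Failed')"
    },
    "runAfter": {}
},
"Set_error": {
    "type": "SetVariable",
    "inputs": {
        "name": "ErrorMessage",
        "value": "@{first(body('Filter_array'))?['outputs']}"
    },
    "runAfter": {
        "Filter_array": [
            "Succeeded"
        ]
    }
}
</pre>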
Usually, an action will return a JSON record as the result of its execution. Some fields will always be present, like “status” and “trackingId”. There’s no easy way to find this info, so you have to deconstruct one or more actions to find it. With the information you have now, you can get it from anywhere; you just have to use the “result” expression.
Here you can see some fields in the Set var action I created and how the status is recorded. For tracking purposes, the execution engine records the beginning and end timestamps, as well as other useful data.
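For reference, an entry returned by the “result” expression looks roughly like this; the values below are made up, and the exact fields vary per action type (failed actions, for example, also carry error details in their outputs):

<pre class="wp-block-preformatted">
{
    "name": "Set_variable",
    "startTime": "2021-03-01T10:15:30.1234567Z",
    "endTime": "2021-03-01T10:15:30.2345678Z",
    "trackingId": "00000000-0000-0000-0000-000000000000",
    "status": "Succeeded"
}
</pre>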
Now that you know how, it’s time to get to work and make your Logic Apps sturdier, with proper error handling.
I usually use the series A fish out of water when I want to write something that goes a little off-topic from my main blog subject: Enterprise Integration. But this time is different: this was the first thing that came to mind when I saw this error happen, since it didn’t make sense at all.
To give this blog post some context: I have been testing the new Logic Apps (Preview) for a while, and if you have already tried it, you will know that this new type of Logic App runs on top of the Azure Functions runtime. This means that you can now run Logic Apps anywhere: in the cloud, on-premises, locally on your laptop, or wherever you need to.
One of the requirements necessary to have full Logic Apps designer support in VS Code is the Microsoft Azure Storage Emulator 5.10 tool. It uses a local Microsoft SQL Server Express LocalDB instance (you can also use a SQL Server instead) to emulate the Azure storage services.
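As a side note, if you prefer to point the emulator to a SQL Server instance instead of LocalDB, you can initialize it with something along these lines (the instance name is a placeholder):

<pre class="wp-block-preformatted">
AzureStorageEmulator.exe init /server <SQL Server instance>
</pre>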
Everything had been working properly for the last four months or more; however, for no plausible reason, today when I tried to start the Microsoft Azure Storage Emulator I got this error:
<pre class="wp-block-preformatted">
Probing SQL Instance: '(localdb)\MSSQLLocalDB'.
Caught exception while probing for SQL endpoint. A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 50 – Local Database Runtime error occurred. Error occurred during LocalDB instance startup: SQL Server process failed to start.)
Number of SqlErrors Reported: 1
SqlError: System.Data.SqlClient.SqlError: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 50 – Local Database Runtime error occurred. Error occurred during LocalDB instance startup: SQL Server process failed to start.)
Could not find a LocalDB Installation.
Probing SQL Instance: 'localhost\SQLExpress'.
Caught exception while probing for SQL endpoint. A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified)
Number of SqlErrors Reported: 1
SqlError: System.Data.SqlClient.SqlError: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified)
No available SQL Instance was found.
One or more initialization actions have failed. Resolve these errors before attempting to run the storage emulator again.
</pre>
Cause
Again, I can’t find any plausible reason for this to start happening, except the fact that my machine installed some system updates.
Solution
I don’t know if all these steps are necessary or not; I was just simply frustrated about this situation, because I need to present this topic in my upcoming sessions this week, so I didn’t properly test all the available hypotheses. Nevertheless, I managed to solve this issue by simply recreating my LocalDB instances and allowing the emulator to recreate its database files.
Here are the steps necessary to perform these tasks:
Get your current LocalDB instance name
<pre class="wp-block-preformatted">
sqllocaldb i
</pre>
Delete the existing LocalDB instance
If you feel more comfortable, you can choose to stop the LocalDB instance before you delete it.
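For example, using the stop switch (replace <LocalDB instance> with the name returned by the previous command):

<pre class="wp-block-preformatted">
sqllocaldb p <LocalDB instance>
</pre>

Then delete it: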
<pre class="wp-block-preformatted">
sqllocaldb d <LocalDB instance>
</pre>
Create your new LocalDB instance with the same name or a different one.
<pre class="wp-block-preformatted">
sqllocaldb c <LocalDB instance>
</pre>
And finally, you need to delete the Azure Storage Emulator database files to allow the emulator to recreate them on the first start.
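Assuming the default setup, these are the AzureStorageEmulatorDb data and log files created under your user profile folder, so something like this should remove them (paths and names may differ in your environment):

<pre class="wp-block-preformatted">
del %USERPROFILE%\AzureStorageEmulatorDb*.mdf
del %USERPROFILE%\AzureStorageEmulatorDb*.ldf
</pre>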
Every day we learn new things, and when it comes to Logic Apps, we tend to learn even more, because it’s always shifting and new components are being added. If we’re using ARM templates, deployment brings out some challenges and, with them, new things to learn (and lots of cute little things that make you want to bang your head against a brick wall).
Usually, when we work with a CSV file, we tend to keep the sorting according to the specification. It isn’t always alphabetical, nor descending/ascending.
Sometimes, it’s just a real mess but it makes sense to the client and to the application that is consuming it.
A few days ago, whilst working on a client project and after dozens of tests, we started to see errors in our CSV file, where the headers and columns were arranged alphabetically. This was not my intent; when I built the CSV array, I wanted it to be in a certain order.
So why was this array now being sorted, who gave that command and how could I correct it?
Why and who:
As we dig into the Logic App code, we see that a Logic App is JSON at its core (my god, shocking development!). As such, it will follow JSON rules on sorting. If we set or append our variable with an array, even though that array won’t show up ordered in our code, it will once we deploy it to our Resource Group.
Let’s prove this.
First, we set up our Logic App in Visual Studio and initialize a string variable. Then we append two values to it (using “Append to string variable”): one as a string and the other as an array.
Let’s look at the back code.
Looking good so far. Our strings are set and it’s in the order we want.
Let’s deploy it to our RG and check again.
Well, there it is. With ARM deployment, if we write a JSON object, it gets sorted on deployment and will appear like this in the designer in the Portal.
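To illustrate with a made-up value, an array item authored like this:

<pre class="wp-block-preformatted">
{ "Name": "John", "Age": "30", "City": "Lisbon" }
</pre>

shows up after deployment with its properties sorted alphabetically:

<pre class="wp-block-preformatted">
{ "Age": "30", "City": "Lisbon", "Name": "John" }
</pre>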
Funny thing is, if we change our object back to the string we want, the designer will not recognize this as a change and won’t let you save.
Even in Code View the changes are not recognized.
But if we add some other text to it, the changes are now recognized and the Portal allows us to save.
But still, it won’t show you the changes and will still sort your CSV array, once again because it’s JSON.
A few weeks ago, this behavior wasn’t noticeable: I had a few Logic Apps in place with the string array in a specific order, and when deploying, it didn’t get sorted.
I searched in Azure updates to see if anything was mentioned but nothing came up.
How to bypass this issue?
If you’re working with a CSV file like I was, after you build your array, you’ll need to build a CSV table.
The action “Create CSV table” will take care of this for you but, as we know, it will not be in the format we need.
(notice I’ve switched to an array variable, because I can’t parse the string as JSON)
So, leaving the Columns in automatic mode will mess up your integration as you can see. The output will be sorted and it won’t be what you want / need.
What a mess!! This is nothing like we wanted.
We will need to manually define the column headers and the value each one is going to have.
If you don’t have many fields, it’s quick to do this, but when you have lots of fields… well, let’s just say I hope you have plenty of time and don’t lose focus.
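In code view, a “Create CSV table” action with manually defined columns looks something like this sketch (the variable and field names are illustrative):

<pre class="wp-block-preformatted">
"Create_CSV_table": {
    "type": "Table",
    "inputs": {
        "from": "@variables('CsvArray')",
        "format": "CSV",
        "columns": [
            {
                "header": "Name",
                "value": "@item()?['Name']"
            },
            {
                "header": "Age",
                "value": "@item()?['Age']"
            },
            {
                "header": "City",
                "value": "@item()?['City']"
            }
        ]
    },
    "runAfter": {}
}
</pre>

Because “columns” is an ordered JSON array rather than an object, deployment preserves the column order exactly as defined, which is what gets us around the sorting issue.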
And there we have it. Fields are now displayed correctly, the data is in the right place and we’ve managed to get around this annoying problem.