Yesterday while troubleshooting a BizTalk Server developer environment at a client, I encountered an unusual error while trying to configure the BizTalk Server Backup job:
Could not find server ‘server name’ in sys.servers.
Based on the error description, and the fact that I was trying to run a SQL Server job, I knew that the error had to be on the SQL Server side: most likely some incorrect configuration.
Cause
Why this problem started to happen is unclear to me, and at the time I didn't have all the information available to understand it. However, after investigating this type of error a little, I realized that we can execute a SQL query to check the registered server and linked servers:
SELECT * FROM sys.servers
Or
SELECT @@SERVERNAME
The problem was that once I ran these scripts, I realized the server name was not the expected one. It was incorrect! For example:
I was expecting BTS2020LAB01.
But instead, I was seeing VMIMAGE01.
And that was causing the failure of the Backup job and of other BizTalk Server jobs.
Solution
To solve this issue, we need to fix the server name, and for that, we can apply the following SQL script:
sp_dropserver 'VMIMAGE01'
GO
sp_addserver 'BTS2020LAB01',local
GO
After applying this script, make sure you restart the SQL Server service. You can then run SELECT @@SERVERNAME again to confirm that the name is now correct.
Once you have done these steps, you can successfully start your BizTalk Backup job.
Author: Sandro Pereira
Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.
He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.
View all posts by Sandro Pereira
If you come from BizTalk and have an integration background like me, you may remember that we had a Loopback Adapter in BizTalk Server.
The Loopback adapter is a two-way send adapter that, in essence, returns a copy of the outbound message back to the caller, in most cases, an orchestration. This capability can be used in several situations or scenarios:
One is to replace the need to call pipelines inside an orchestration. Calling pipelines inside orchestrations is a good option, but it complicates the logic of the orchestrations with loops and expressions. You’ll also end up repeating the “code” for each message type/orchestration.
Or to have a way to invoke a party/agreement to get more details in the message.
Or to use the Loopback adapter to subscribe to NACKs from Web Services.
And so on.
However, while trying different implementation strategies using Azure Integration Services, I realized that a Loopback API is also very handy.
Are you wondering where you can use this component or strategy?
In many different ways like:
If we need to do a Liquid transformation inside a Logic App Consumption, we need to have an Integration Account, which is expensive. Instead, we can expose an operation in API Management to perform the Liquid transformation. For that, you will need the Loopback API.
Throwing an exception inside Logic Apps. There are many ways to achieve this capability. One of the options is, again, exposing an operation in API Management to throw back the exception. For that, you will need the Loopback API.
And so on.
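To give an idea of the first scenario above, API Management can run a Liquid template directly in a policy, with no Integration Account involved. Here is a small sketch of such an inbound policy; the JSON property names (firstName, lastName, fullName) are hypothetical and only for illustration:

```xml
<!-- Hypothetical APIM inbound policy: reshape the incoming JSON body
     with a Liquid template before forwarding it to the backend. -->
<inbound>
    <base />
    <set-body template="liquid">
    {
        "fullName": "{{body.firstName}} {{body.lastName}}"
    }
    </set-body>
</inbound>
```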
Soon I’m going to publish some of these implementation strategies.
For now, I leave you the Loopback API code for you to try and provide feedback.
Loopback API Azure Function
The Loopback API is simply an Azure Function that, in its essence, returns a copy of the inbound message/request back to the caller. We can say that this Azure Function mimics the Echo API that comes by default with API Management.
You can download the complete code for the function from GitHub; the link is at the bottom of the blog post. Here is a small code snippet:
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Read the inbound request body as-is
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();

    // Echo the body back to the caller, preserving the original content type
    return new ContentResult { Content = requestBody, ContentType = req.Headers["Content-Type"] };
}
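If you want to experiment with the echo behaviour before deploying anything, here is a minimal, self-contained Python sketch that mimics the same semantics locally: the handler returns the inbound body and Content-Type unchanged. This only illustrates the concept; it is not the Azure Function's code, and the helper names (LoopbackHandler, echo_roundtrip) are my own:

```python
# Local sketch of the Loopback/echo behaviour: return a copy of the
# inbound request body with the same Content-Type header.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class LoopbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the inbound request body
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Echo it back, preserving the original content type
        self.send_response(200)
        self.send_header("Content-Type",
                         self.headers.get("Content-Type", "application/octet-stream"))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def echo_roundtrip(payload: bytes, content_type: str):
    """POST a payload to a throwaway loopback server; return (body, content type)."""
    server = HTTPServer(("127.0.0.1", 0), LoopbackHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    req = urllib.request.Request(
        f"http://127.0.0.1:{server.server_port}/",
        data=payload,
        headers={"Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        result = (resp.read(), resp.headers.get("Content-Type"))
    server.shutdown()
    server.server_close()
    return result
```

Posting any payload through echo_roundtrip returns exactly the bytes and content type that went in, which is the whole contract of the Loopback API.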
Where can I download it
You can download the complete Azure Function source code here:
Once again, thanks to my team member Luis Rigueira for testing these concepts with me!
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
I have often been writing about how to handle exceptions and get the correct and detailed error message describing, in an easy and clean manner, the failures in our processes:
And the reason is that we need to accept the fact that, at a certain point in time, our process will fail: because we made some updates that weren't tested properly, because an external system is down or under maintenance, because of issues in the Microsoft infrastructure, or for many other reasons. But when that happens, one thing is clear: we want to get the exact error message describing why it failed. Of course, we can always go to the Logic Apps run history – except if it is a Stateless workflow on Standard – to check where and why it failed, and there we will have all the information necessary to solve the problem. However, most of the time, if not always, we want to log the error somewhere like Application Insights, a SQL database, or Event Hubs in an automated way, without human intervention. To accomplish that, we need to access this error message at runtime.
By default, Logic Apps allow handling errors using the Configure run after settings at a per-action level. For more complex scenarios, it can be done by setting up Scope actions and implementing try-catch/try-catch-finally patterns. However, getting a detailed error message can be a bit challenging inside a Logic App, and you will find different approaches to implement the same capabilities, each of them with advantages and disadvantages.
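To make the Configure run after mechanism concrete: in the workflow definition, it is expressed as a runAfter property on each action. A small sketch, in which the action names (Try_scope, Log_the_error) are hypothetical:

```json
"Log_the_error": {
    "type": "Compose",
    "inputs": "@result('Try_scope')",
    "runAfter": {
        "Try_scope": [ "Failed", "TimedOut" ]
    }
}
```

Here the Log_the_error action only runs when Try_scope fails or times out, and the result() function exposes the outcome of the actions inside that scope.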
In this whitepaper, we will cover some of the possible ways to address these needs, along with the following questions:
How can I implement error handling?
This is an important question and topic because, without proper error handling, we will not be able to access the error message in an automated way; instead, our workflow will simply finish in that action with a failure state.
What out-of-the-box features do I have to capture and access the error details?
How can I improve these default features in a no-code, low-code way and in a code-first approach?
Where can I download it
You can download the whitepaper here:
In my last blog post – Logic Apps and DocuSign integration – I explained how you can create and dynamically populate a contract using an HTTP input and then use the DocuSign connector to complete the workflow! Today we are going to see another DocuSign integration capability.
By default, and as I mentioned in my last post, once the document is completed and signed by both parties, all participants will receive a Complete with DocuSign email informing them that all signers have completed the document, with a link available for them to access and view it.
However, the signed document will live on the DocuSign platform, which is probably not ideal. Typically, we would like to store it internally: if it is personal, on our local drive or Dropbox; in an organization, inside SharePoint or a CRM, for example.
So, the main questions are: is it possible to automate this part of the process? And if yes, how?
And yes, it is possible, and this is what we will address in this blog post! For simplicity, we will be using Dropbox as our archive system.
Before we start explaining all the steps you need to implement this logic, let me tell you that the Logic App workflow you would typically expect to see looks like this:
Where the:
Logic App will be triggered once the status of the envelope change to Completed.
From there, we will make an additional call to Get the envelope documents’ content.
And then create a file on Dropbox.
That will create a For each action, since the structure of the response is an array, despite it only containing one row.
I don’t like this approach for several reasons:
We are making an unnecessary external call to DocuSign – the Get envelope documents content call.
It is less performant. The processing time of the Logic App can go from 3 to 5 seconds:
We can do way better and do the same in less than 2 seconds! So here we are going to explain the "intermediate" approach. For simplicity, I won't implement the "advanced" approach, which only contains two actions!
So, to archive a signed document, what we need to do is:
Create a new Logic App, let’s call it: LA-ProcessDocuSignSignedDocuments-POC.
On the Search connectors and triggers box, type DocuSign, select the DocuSign connector, and then select the When an envelope status changes (Connect) (V3) trigger.
On the When an envelope status changes (Connect) (V3) trigger, we need to:
On the Account property from the combo box, select the account to use.
On the Connect name property, type a name that correctly describes your connector, for example, Process Document Signed.
On the Envelope Status property from the combo box, select the envelope-completed option.
Note: if you use V2 of this action, the Envelope event property value will be Completed and not envelope-completed.
Click on + New step to add a new action. From the search textbox, type Data Operations and then select Data Operations – Compose action.
On the Compose action, we need to add the following input value:
Here, we are extracting only the necessary pieces of information that the trigger will provide to us and creating a simple JSON message with all the information we will need to create the file in our Dropbox according to the rules we specify. For example:
The expression triggerBody()?['data']?['envelopeSummary']?['recipients']?['signers'][0]?['name'] will contain the name of the external entity/person that signed the document.
The expression triggerBody()?['data']?['envelopeSummary']?['sender']?['userName'] will contain the name of the internal entity/person that signed the document – in this sample, my name.
The expression triggerBody()?['data']?['envelopeSummary']?['envelopeDocuments'][0]?['name'] will contain the original name of the file we added on DocuSign – in our case, SERVICE EXCHANGE AGREEMENT.pdf.
The tricky part was the file data to be added to Dropbox. If we send only the bytes available in the PDFBytes property, a file will be created on our Dropbox, but it will be corrupted and we will not be able to open it. So we need to create a structure with both the content type and the content information.
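Putting the expressions quoted above together, the Compose input could look something like this. This is only a sketch: the property names on the left are my own, and you should confirm the exact trigger payload in your environment:

```json
{
    "SignerName": "@{triggerBody()?['data']?['envelopeSummary']?['recipients']?['signers'][0]?['name']}",
    "SenderName": "@{triggerBody()?['data']?['envelopeSummary']?['sender']?['userName']}",
    "FileName": "@{triggerBody()?['data']?['envelopeSummary']?['envelopeDocuments'][0]?['name']}",
    "FileData": {
        "$content-type": "application/pdf",
        "$content": "@{triggerBody()?['data']?['envelopeSummary']?['envelopeDocuments'][0]?['PDFBytes']}"
    }
}
```

The $content-type/$content pair is the structure Logic Apps uses to represent binary content, which is what keeps the PDF from arriving corrupted in Dropbox.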
Now, continuing our Logic App workflow, let us:
Click on + New step to add a new action. From the search textbox, type Data Operations and then select Data Operations – Parse JSON action.
On the Parse JSON action, we need to:
On the Content property, set the value to the Output property of the previous Compose action, in our case, the Map TriggerData to DropBoxData.
Note: This will tokenize the properties of the JSON message we created earlier, which will allow us to easily use them in the following action.
Note: these two actions, the Compose and the Parse JSON, are optional. We could skip them and configure all the properties of the next action with expressions – this would be the "advanced" approach!
Click on + New step to add a new action. From the search textbox, type DropBox and then select Dropbox – Create file action.
On the Create file action, we need to:
On the Folder Path property, define the path inside Dropbox where the file will be stored.
On the File Name property, we want the file to have the following naming convention:
___
So for that, we need to define this property with the following expression:
On the File Content property, set the value to the FileData property of the previous Parse JSON action, in our case, the Parse DropBoxData JSON.
Finally, save your Logic App.
Clicking Save will also create, in the DocuSign platform, a Connect configuration with the name you used on the trigger, in our example, Process Document Signed. You can see this by:
On the top menu, select Settings. From there, in the left tree, select the Connect option under Integrations.
Now, for the "intermediate" and "advanced" approaches to work, we need to do a small trick on the connection inside DocuSign. To do that, we need to:
Select the Process Document Signed connection.
On the Edit Custom Configuration page, scroll down to the option Trigger Events under Event Settings, and expand the Envelope and Recipients panel.
And then expand the Include Data panel and make sure you select the following options:
Custom Fields
Documents
Extensions (optional)
Recipients
Tabs (optional)
Click Save Configuration in the top right corner of the Edit Custom Configuration page.
Now, to test it, we need to start by executing the Logic App we created yesterday in order to generate a new document (aka envelope) to be signed.
Once both parties sign the document, our Logic App will be triggered:
And a file will be added in our Dropbox:
And, to be sure, if you click on it, you will see the document signed by Luis and me.
Now, have you noticed, in the picture above, the execution time of this approach? All runs took less than 2 seconds!
Credits
Once again, a big thanks to my team member Luis Rigueira for participating in this proof-of-concept!