From the Google Drive web UI, we have the option to download Google Docs files in several formats. From the Logic App perspective, we can easily list the files in a folder, but converting those .gdoc files into OneDrive Word files (.docx extension) using Azure Logic Apps is not a straightforward task, since we don't have a suitable connector to achieve it.
For this reason, we decided to create an Azure Function that acts as our connector to achieve this transformation.
Google Documents (.gdoc) into Word Documents (.docx) Converter Azure Function
This is a simple function that converts a .gdoc file stored in Google Drive into a base64-encoded .docx file.
The Azure Function will include the following NuGet packages:
Google.Apis.Auth (Version 1.58.0 or later): This package provides authentication and authorization functionality for accessing Google APIs.
Google.Apis.Docs.v1 (Version 1.58.0 or later): This package provides the Google Docs API client library, allowing you to interact with Google Docs.
Google.Apis.Drive.v3 (Version 1.58.0 or later): This package provides the Google Drive API client library, enabling you to interact with Google Drive.
And basically, what this function will do is:
The Azure Function is triggered from the Logic App (or any other HTTP client) by an HTTP POST request ([HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)]).
The function expects the fileId to be provided as a query parameter in the request URL and the X-Secret-Google-Credentials header to contain the Google service account credentials in JSON format.
If either the fileId or X-Secret-Google-Credentials is missing or empty, the function returns a BadRequest response indicating the missing information.
If the required parameters are provided, the function loads the service account credentials from the provided JSON (GoogleCredential.FromJson(googleCredentialsJson)).
The function creates Drive and Docs services using the service account credentials.
It then makes a request to export the specified Google Docs file (service.Files.Export(googleDocsFileId, "application/vnd.openxmlformats-officedocument.wordprocessingml.document")) and retrieves the resulting document as a stream.
The stream is used to create an HttpResponseMessage with the exported document as the content.
The response headers are set to indicate that the response content should be treated as a downloadable attachment with the file name “converted.docx” and the MIME type “application/vnd.openxmlformats-officedocument.wordprocessingml.document”.
Overall, the Azure Function converts a Google Docs file to a DOCX file format using the Google Docs API and returns the converted file in base64 in the HTTP response.
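To make the flow above more concrete, here is a minimal sketch of what such a function could look like, assuming the in-process Azure Functions model. The function name, the read-only Drive scope, and the omission of the Docs service and of detailed error handling are illustrative assumptions on my part; the complete source code is available through the download link below.

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using System.Web;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Drive.v3;
using Google.Apis.Services;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class GDocToDocxConverter
{
    private const string DocxMimeType =
        "application/vnd.openxmlformats-officedocument.wordprocessingml.document";

    [FunctionName("ConvertGDocToDocx")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage req,
        ILogger log)
    {
        // The Google Docs file id comes in as a query parameter.
        string fileId = HttpUtility.ParseQueryString(req.RequestUri.Query)["fileId"];

        // The service account key (JSON) comes in the X-Secret-Google-Credentials header.
        string credentialsJson = null;
        if (req.Headers.TryGetValues("X-Secret-Google-Credentials", out var headerValues))
            credentialsJson = string.Join(string.Empty, headerValues);

        if (string.IsNullOrEmpty(fileId) || string.IsNullOrEmpty(credentialsJson))
            return new HttpResponseMessage(HttpStatusCode.BadRequest)
            {
                Content = new StringContent(
                    "Please provide the fileId query parameter and the X-Secret-Google-Credentials header.")
            };

        // Authenticate with the service account and create the Drive service.
        GoogleCredential credential = GoogleCredential.FromJson(credentialsJson)
            .CreateScoped(DriveService.Scope.DriveReadonly);

        using var driveService = new DriveService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = "GDocToDocxConverter"
        });

        // Export the Google Docs file as DOCX into a memory stream.
        var exportRequest = driveService.Files.Export(fileId, DocxMimeType);
        var stream = new MemoryStream();
        await exportRequest.DownloadAsync(stream);
        stream.Position = 0;

        // Return the exported document as a downloadable attachment named converted.docx.
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StreamContent(stream)
        };
        response.Content.Headers.ContentType = new MediaTypeHeaderValue(DocxMimeType);
        response.Content.Headers.ContentDisposition =
            new ContentDispositionHeaderValue("attachment") { FileName = "converted.docx" };
        return response;
    }
}
```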
To use this function, you need to:
Pass the fileId as a query parameter to the Azure Function.
fileId=
Then set the X-Secret-Google-Credentials header with the JSON containing the Google credentials. You can find this information when you create a private key for your Google service account.
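For reference, the service account key you download when creating a private key in the Google Cloud console follows this general JSON structure (all values below are placeholders):

```json
{
  "type": "service_account",
  "project_id": "<your-project-id>",
  "private_key_id": "<private-key-id>",
  "private_key": "-----BEGIN PRIVATE KEY-----\n<key-material>\n-----END PRIVATE KEY-----\n",
  "client_email": "<service-account-name>@<your-project-id>.iam.gserviceaccount.com",
  "client_id": "<client-id>",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "<client-certificate-url>"
}
```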
You can download the complete Azure Functions source code here:
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
Big thanks to my team member Luís Rigueira for realizing this idea.
Author: Sandro Pereira
Sandro Pereira lives in Portugal and works as a consultant at DevScope. Over the past years, he has been implementing integration scenarios both on-premises and in the cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server, and different technologies like AS2, EDI, RosettaNet, SAP, and TIBCO.
He is a regular blogger, international speaker, and technical reviewer of several BizTalk books, all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.
Last night, about 15 minutes before delivering my Logic Apps Data Mapper session at Azure Logic Apps Community Day 2023, I decided to do a last test run on my demos – it is always good to do a last-minute validation. To be fair, I had been developing and testing my demos for about a week, so I knew that unless something extremely unpredictable happened, my solution and my local environment were good to go: solutions and samples were working, and everything was properly configured. Hey, what could go wrong if I had performed the same test an hour earlier with success?
… really! really! F****!!! I almost s*** myself!
When you least expect it, something weird happens! One of my demos – the critical one – requires me to emulate the Logic App execution locally on my machine, so when I tried to run my Logic App Standard locally by:
Select the Run menu option and click Run Without Debugging
I got the following error:
Failed to verify “AzureWebJobsStorage” connection specified in “local.settings.json”. Is the local emulator installed and running?
With 15 minutes to go before a live demo, your brain freezes immediately, and the adrenaline kicks in! You do not read the full error message, or you do not read it properly :):)… no matter how experienced you are! That first minute is terrifying!
Then experience jumps in… Sandro, you still have 15 minutes. Relax!
Cause
The reason for this error is simple: to run Logic App Standard locally, you must have a storage emulator. In the early days, you could use the following:
Microsoft Azure Storage Emulator 5.10 tool. This tool is necessary to have full Logic Apps designer support in VS Code.
Nowadays, you can use the Azurite emulator for local Azure Storage development. This is a lightweight server clone of Azure Storage that simulates most of the commands supported by it with minimal dependencies.
Not only do you need to have it installed, but it also needs to be running!
Unfortunately, starting the debugger does not start all the dependencies, and the emulator is one of them. You need to start it manually first. However, Azurite cannot be run from the command line if you only installed the Visual Studio Code extension; instead, use the Visual Studio Code command palette.
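For context, the "AzureWebJobsStorage" setting that the error message refers to usually points a locally-run Logic App Standard project at that emulator. A minimal local.settings.json sketch (other entries omitted, values typical rather than taken from my project) looks something like this:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "APP_KIND": "workflowApp"
  }
}
```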
Solution
Luckily for us, the extension supports the following Visual Studio Code commands.
Azurite: Clean – Reset all Azurite services persistence data
Azurite: Clean Blob Service – Clean blob service
Azurite: Clean Queue Service – Clean queue service
Azurite: Clean Table Service – Clean table service
Azurite: Close – Close all Azurite services
Azurite: Close Blob Service – Close blob service
Azurite: Close Queue Service – Close queue service
Azurite: Close Table Service – Close table service
Azurite: Start – Start all Azurite services
Azurite: Start Blob Service – Start blob service
Azurite: Start Queue Service – Start queue service
Azurite: Start Table Service – Start table service
To open the command palette, press F1 in Visual Studio Code. In our case, we then need to execute the following command:
Azurite: Start
And there you go! Problem solved and demos delivered!
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
You may already know that I usually use the series A fish out of water when I want to write something that goes a little bit off-topic from my main blog topic: Enterprise Integration. This time, even though an ETL process can also be considered Enterprise Integration, I don't consider myself a SQL Server “expert” – I often delegate these tasks to my data team. However, this week one of my clients called me regarding an issue we were facing with our integration platform, which is mainly composed of SQL Server, BizTalk Server, and Azure.
While diagnosing the problem – almost feeling like Dr. House – I realized we were not getting any new data because the ETL jobs were failing with the error: There is already an active instance of this package.
That happened because we were controlling the execution of the package, not allowing multiple executions of the same package to run at the same time, by doing the following validation:
-- Block the execution if there is already a running instance of this package.
-- In SSISDB.catalog.executions, status = 2 means the execution is running.
-- (The folder and package names are left as placeholders here.)
IF (SELECT COUNT(*)
    FROM SSISDB.catalog.executions
    WHERE status = 2
      AND folder_name = ''
      AND package_name = '.dtsx') > 0
BEGIN
    THROW 50000, 'There is already an active instance of this package.', 1;
END
The problem was that, at that time, I didn't know how to monitor which SSIS packages were currently running or how to stop them. For some reason – I'm guessing network issues – those packages were kind of zombies, and I didn't have my team available at that time. Because that was a production environment, I had to learn, which is also good! And this may be a helpful tip for other situations where we need to check which SSIS packages are running in the Catalog and subsequently stop them.
To accomplish that, we need to:
On current versions of Windows, on the Start page, type SSMS and then select Microsoft SQL Server Management Studio.
When using older versions of Windows, on the Start menu, point to All Programs, point to Microsoft SQL Server, and then select SQL Server Management Studio.
On the Object Explorer panel, expand Integration Services Catalogs, right-click on SSISDB, and select the Active Operations option from the options menu.
A new Active Operations window will open, presenting all the running packages.
From this window, you can select the package you want and click the Stop button to force that SSIS package execution to stop.
It’s also possible to do the same process via T-SQL by running these two queries:
Query to retrieve all currently running packages in the SSIS Catalog:
SELECT * FROM SSISDB.catalog.executions WHERE end_time IS NULL
Query to stop the execution of a specific SSIS package:
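A common way to do it – assuming you take the execution_id returned by the previous query – is to call the SSISDB catalog.stop_operation stored procedure:

-- Stop a running execution; replace <execution_id> with the value returned by the previous query.
EXEC SSISDB.catalog.stop_operation @operation_id = <execution_id>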
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
We just released another version of our Azure Function JSON Schema Validation, adding support for another feature – in this case, a very basic one: required fields.
To specify the mandatory properties or elements, we need to use the required keyword, which takes a list of strings that must be present as key names among the key:value pairs of the JSON document. Each of these strings must be unique.
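For example, a small schema like this one (an illustrative sketch, not taken from the function's documentation) makes name and age mandatory while leaving email optional:

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" },
    "email": { "type": "string" }
  },
  "required": [ "name", "age" ]
}
```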
JSON Schema Validation Function
The JSON Schema Validation is a simple Azure Function that allows you to validate your JSON message against a JSON Schema, enabling you to specify constraints on the structure of instance data to ensure it meets the requirements.
The function receives a JSON payload with two properties: the JSON message to validate and the JSON Schema to validate it against.
It returns a success response when the message is valid against the schema, or a 400 Bad Request if there are validation errors/issues.
Where can I download it?
You can download the complete Azure Functions source code here:
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
Big thanks to my team members Luís Rigueira and Diogo Formosinho for testing and adding this new feature.
I have been testing the Data Mapper for maybe almost 4 months, since the first private previews. Still, I have usually tried the Data Mapper capabilities on their own, not the interaction between a Logic Apps Standard workflow and the Data Mapper. Now that I'm preparing and finalizing my session for the Azure Logic Apps Community Day 2023, I'm finding these little headaches in trying to put these pieces to work together. You also need to be aware that this behavior and experience may change in the future since the Data Mapper is still in preview.
So, while I was trying to call a transformation created by the new Data Mapper, in this case a JSON to JSON transformation, running locally on my machine, I was always getting this really annoying and nonsensical error, since it doesn't provide any real or valuable help or insight into the issue we are facing:
undefined. undefined
Sometimes I think the Microsoft developer team likes that I write all these Errors and Warnings, Causes and Solutions blog posts, or they are just teasing us.
I was surprised to see this error since I just finished developing my map, and I had successfully tested it on the Data Mapper editor.
And you can see that in action in this video from Kent Weare:
Solution
For the map to run successfully at runtime, within the local.settings.json file of your Logic Apps Standard project, ensure you have the following property configuration (a combined sample is shown after the list):
FUNCTIONS_WORKER_RUNTIME property set to dotnet-isolated.
And add the AzureWebJobsFeatureFlags property with the value: EnableMultiLanguageWorker
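Putting both settings together, the local.settings.json ends up looking something like this – a minimal sketch with other entries omitted, and the AzureWebJobsStorage value shown only as a typical local default:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AzureWebJobsFeatureFlags": "EnableMultiLanguageWorker"
  }
}
```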
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
It is always funny returning to an old BizTalk Server solution I did in the past! Today, while modifying an existing solution for the first time in 3 years – you have to love BizTalk Server for that consistency and reliability that is difficult to find in any other platform or service! – I got a weird issue while trying to rebuild the solution:
The operation could not be completed. The parameter is incorrect.
with no more details! which is always lovely!
Cause
It has been a constant these days, but it is honestly true: I don't know exactly the reason why. The solutions describing this type of issue are not consistent and range from restarting Visual Studio and deleting files to restarting the machine!
But my feeling is that it is more related to Visual Studio execution permissions and user permissions.
Solution
To solve this issue, you should run Visual Studio as an administrator. To do that, you need to:
Select the Start button, and then in the Search box, type Visual Studio.
Next, right-click Visual Studio, and then select More > Run as administrator.
Open your BizTalk Visual Studio Solution and try to build it. It worked for me!
Build started…
Build succeeded
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
I have been developing Logic App Standard workflows for a long time and, as far as I remember, never had an issue running them locally until a few months ago. I never paid too much attention to it because I was not running them locally, and it didn't block my ability to develop my solutions or workflows.
However, this week, while developing my demos for the Azure Logic Apps Community Day 2023 event, I needed to test them locally, and every time I tried to run them locally by either:
Start Debugging
Or Run Without Debugging
I was getting the following error:
Failed to find “func host start” task.
Cause
To be honest, I don't know, because I had all the prerequisites installed. But I guess – and this is just me guessing – that some Azure Functions extension update broke some configuration between these two extensions.
Solution
I know that you probably will not like it… but after spending a few hours, I gave up and went for a drastic approach – this one – that solved the problem.
To solve this issue, you need to:
Uninstall all Azure Functions extension dependencies.
In my case, the Azure Logic Apps – Data Mapper and Azure Logic Apps (Standard) extensions
Uninstall the Azure Functions extension.
Restart Visual Studio Code
Install all the extensions again, in my case:
Azure Functions
Azure Logic Apps (Standard)
Azure Logic Apps – Data Mapper
Just to be in a safe state, restart Visual Studio Code again.
After that I was able to run my workflow locally without any issue.
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
An Integration Account allows you to build Logic Apps with enterprise B2B capabilities by adding various necessary artifacts. It serves as a central repository for managing various integration assets such as schemas, maps, certificates, and trading partner agreements.
While nowadays Logic App Standard natively supports schemas and maps (without the need for an Integration Account), and there is a new transformation editor called Data Mapper (still in preview), Logic App Consumption still requires us to use the Integration Account and still uses the “old”, BizTalk Server-related Mapper.
Prerequisites
So, for us to create, in our development environment, schemas and maps for Logic App Consumption to be used inside an Integration Account, we need to install the Azure Logic Apps Enterprise Integration Tools extension for Visual Studio 2019 – unfortunately, there is no support for more recent versions of Visual Studio. To do that, we need to:
Download and Install the extension from the Visual Studio Marketplace:
Or install it directly on Visual Studio by:
Open Visual Studio 2019, and on the Extensions menu, select the option Manage Extensions.
Search for Logic App, and then, from the list, select to download and install the Azure Logic Apps Enterprise Integration Tools.
You will probably need to restart Visual Studio.
Create an Integration Account Project
Now that we have installed everything that we need to create a new Integration Account Project, we need to:
Open Visual Studio 2019 and on the What would you like to do? window select the Create a new project option.
On the Create a new project window, search for Integration Account, and from the list below, select the Integration Account template, then click Next.
On the Configure your new project window, do the following configurations and then click Create:
On the Project name property, set a proper name for your project.
On the Location property, set the path where you want to create the project.
On the Solution name property, set a proper name for your solution.
Note that a solution is a container for one or more projects in Visual Studio.
After that, a new Integration Account project is created where you can create your Schemas, Flat File Schemas, and Maps. To do that, you just need to:
Right-click on the project name and then select the option Add > New Item…
On the Add New Item window, on the left tree, select the option Logic Apps, and all the possible artifacts for you to create will be present.
Select the type.
Give it a proper name.
And click Add.
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
With INTEGRATE 2023 London concluded, it is time to shift gears to the next big event on cloud integration: Azure Logic Apps Community Day 2023, sometimes called LogicAppsAviators Community Day, which will take place on Thursday, June 22nd at 9 AM (Pacific) or 5 PM (UTC). The event is free and will be streamed on YouTube/Twitch, so be sure to subscribe to the Azure Developers YouTube channel to stay up to date.
Azure Logic Apps Community Day 2023 will be the must-attend event for anyone who wants to learn more about Logic Apps and how it can help to solve real-life integration problems. It will be a full day of learning, from the basics of getting started to deep dives into advanced automation with Logic Apps, presented by the Logic Apps product group, Microsoft MVPs, and expert community members. At the end, there will be a Round Table Discussion – Ask Me Anything with the Product Group and Community – which will be your opportunity to ask the “hard” questions.
I will have the pleasure of delivering a session about the new Data Mapper at this event and also be part of the panel on the Round Table Discussion!
About my session
Session Name: A walk in the park with the new Logic App Data Mapper
Abstract: In this session, we will present the new Data Mapper experience for Logic Apps Standard and how we can apply XML to XML or XML to JSON transformations using a visual designer. We will also address how to implement well-known mapping patterns like direct translation, data translation, content enricher, or aggregator, alongside many others.
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
We just released a new version of our Azure Function JSON Schema Validation, adding support for more complex schema validations. In this case, we added support for applying subschema validation conditionally.
The if, then and else keywords allow the application of a subschema based on the outcome of another schema, much like the if/then/else constructs you’ve probably seen in traditional programming languages.
If if is valid, then must also be valid (and else is ignored). If if is invalid, else must also be valid (and then is ignored).
If then or else is not defined, if behaves as if they have a value of true.
If then and/or else appear in a schema without if, then and else are ignored.
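As an illustration (a sketch of the keywords themselves, not taken from the function's documentation), the following schema requires a postal_code only when the country is "Canada", and otherwise constrains it to a US ZIP code format:

```json
{
  "type": "object",
  "properties": {
    "country": { "type": "string" },
    "postal_code": { "type": "string" }
  },
  "if": {
    "properties": { "country": { "const": "Canada" } },
    "required": [ "country" ]
  },
  "then": {
    "required": [ "postal_code" ]
  },
  "else": {
    "properties": { "postal_code": { "pattern": "^[0-9]{5}(-[0-9]{4})?$" } }
  }
}
```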
JSON Schema Validation Function
The JSON Schema Validation is a simple Azure Function that allows you to validate your JSON message against a JSON Schema, enabling you to specify constraints on the structure of instance data to ensure it meets the requirements.
The function receives a JSON payload with two properties: the JSON message to validate and the JSON Schema to validate it against.
Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego!
Big thanks to my team member Luís Rigueira for adding this new feature.