Global Automation Bootcamp | February 4, 2022 | How to monitor your integration solutions with Automation Account

As part of the Intergalactic Automation Summit 2022, an online event organized by the Power Community taking place between the 4th and 6th of February 2022, there are three bootcamps:

  • 4th Feb - Power Automate Bootcamp
  • 5th Feb - Azure Integration Bootcamp
  • 6th Feb - Power Platform ALM DevOps

All of these events are free! And you can register here.

I chose to submit a session to the Global Automation Bootcamp, and I'm honored to have been accepted as a guest speaker with a session about How to monitor your integration solutions with Automation Account. My session will take place at 5:00 PM GMT/UTC.

How to monitor your integration solutions with Automation Account

In this session, we will address how you can monitor your integration solutions using an Azure Automation Account running PowerShell Runbooks, together with Logic Apps to notify you of inconsistencies in your solutions. For those reasons, I would like to invite you to join me at the Global Automation Bootcamp virtual event on Friday, February 4, 2022.

Session name: How to monitor your integration solutions with Automation Account

Abstract: In this session, we will address how you can monitor your integration solutions using an Azure Automation Account running PowerShell Runbooks, together with Logic Apps to notify you of inconsistencies in your solutions.
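
To make the abstract a bit more tangible, the core of such a runbook can be quite small. The sketch below is only an illustration of the idea: the resource group, Logic App name, and notifier URL are made-up placeholders, and it assumes an Automation Account with the Az modules and a managed identity.

# Runbook sketch: look for failed Logic App runs in the last hour and call a notifier Logic App.
Connect-AzAccount -Identity | Out-Null

$notifierUrl = 'https://prod-00.westeurope.logic.azure.com/workflows/...'   # hypothetical HTTP trigger URL
$failedRuns  = Get-AzLogicAppRunHistory -ResourceGroupName 'RG-Integration' -Name 'LA-ProcessOrders' |
    Where-Object { $_.Status -eq 'Failed' -and $_.StartTime -gt (Get-Date).AddHours(-1) }

if ($failedRuns) {
    # The Logic App behind $notifierUrl is the one that sends the alert (email, Teams, and so on).
    $body = @{ workflow = 'LA-ProcessOrders'; failedRuns = @($failedRuns).Count } | ConvertTo-Json
    Invoke-RestMethod -Method Post -Uri $notifierUrl -Body $body -ContentType 'application/json'
}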

Join us and reserve your spot at the Global Automation Bootcamp virtual event on Friday, February 4, 2022. It is free!

BizTalk Monitor Suspend Instance Terminator Service

Monitoring a BizTalk Server environment can sometimes be a complex task due to the infrastructure and complexity layers behind BizTalk Server. On top of that, the administration team needs to monitor all the applications deployed to the environment.

Ideally, the administration team should use all the monitoring tools at their disposal, including those shipped with the product, such as the BizTalk Server Administration Console, Event Viewer, HAT, or BAM. The main problems with these tools are that:

  • They require manual intervention.
  • Almost all of them require remote access to the environment.

Having an administrator manually check each server or application for events that may have occurred is neither an efficient nor an effective way to allocate the team's time or to monitor the environment.

Of course, they can also use other monitoring tools from Microsoft, such as Microsoft System Center Operations Manager (SCOM), or third-party monitoring solutions such as BizTalk360. These tools should be able to read events from all layers of the infrastructure and help the administration team take preventive measures, notifying them when a particular incident is about to happen, for example, when the free space of a hard drive drops below 10%. Furthermore, they should allow operations to be automated when a specific event occurs, for example, restarting a service when the memory it uses exceeds 200 MB, thereby preventing incidents or failures without requiring human intervention.
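
To give a flavor of what that kind of automated operation looks like, here is a tiny PowerShell sketch of the memory example; the service name is hypothetical, and a real monitoring product would obviously do this in a far more robust way.

# Restart a Windows service when its process uses more than 200 MB of memory ('MyIntegrationHost' is a made-up name).
$service = Get-CimInstance Win32_Service -Filter "Name='MyIntegrationHost'"
$process = Get-Process -Id $service.ProcessId -ErrorAction SilentlyContinue
if ($process -and $process.WorkingSet64 -gt 200MB) {
    Restart-Service -Name $service.Name
}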

But the question is: what if you don't have these tools?

You can achieve these tasks in several ways. Many people create custom web portals to emulate some of the most basic tasks of the admin console. One of my favorite options is using a mix of PowerShell, scheduled tasks, and/or Azure services like Logic Apps and Functions. But today I will show you a different, alternative way:

  • Create a Windows Service to monitor suspended Instances and automatically terminate them

Note: of course, this solution can be expanded to other scenarios or extended with new functionalities.

BizTalk Monitor Suspend Instance Terminator Service

This is a Windows Service that continually monitors BizTalk Server for specific suspended messages (at an interval of x seconds/minutes/hours defined in code) and terminates them automatically.

This tool allows you to configure:

  • The type of suspended messages you want to terminate
  • Whether to terminate without saving the messages or to save them to a specific folder before terminating them.

These configurations are made in the app.config file of the service:

<ServiceFilter>
	<add key="ServiceClass" value="64"/>
	<add key="ServiceStatus" value="32"/>
	<add key="ErrorId" value="0xC0C01B4E"/>
	<add key="Action" value="Terminate"/>
	<add key="SaveLocation" value="C:\ArchiveError1"/>
</ServiceFilter>
<ServiceFilter>
	<add key="ServiceClass" value="4"/>
	<add key="ServiceStatus" value="4"/>
	<add key="ErrorId" value="0xc0c01680"/>
	<add key="Action" value="SaveAndTerminate"/>
	<add key="SaveLocation" value="C:\ArchiveError2"/>
</ServiceFilter>

You can also define in the app.config file:

  • The database name, which by default is BizTalkMgmtDb
  • The database server hostname, which by default is localhost
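
To make the behavior more concrete, this is roughly what each polling cycle does, sketched here in PowerShell against the BizTalk WMI provider (the actual tool is a Windows Service, not a script). The filter values mirror the second ServiceFilter entry above, and saving the message bodies before terminating is left out of this sketch.

# Query suspended instances matching the configured filter and terminate them.
$filter = "ServiceClass=4 and ServiceStatus=4 and ErrorId='0xc0c01680'"
Get-WmiObject -Namespace 'root\MicrosoftBizTalkServer' -Class 'MSBTS_ServiceInstance' -Filter $filter |
    ForEach-Object { $_.Terminate() | Out-Null }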

The solution available on GitHub already provides a straightforward setup file.

Download

THIS TOOL IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.

You can download the BizTalk Monitor Suspend Instance Terminator Service from GitHub here:

DevOps – Multi environment variables/ groups

This is a topic I have been asked about a few times, which made me wonder how hard it actually is. Working with this nearly every day makes us assume some things are very easy, but not everyone has this insight.

So, how exactly do we set variables for different environments, and how does it work when we want to replace tokens?

Variables for different environments

Having multiple environments creates the need to assign different values to your variables, because, for example, that Test web service won't work in PROD, and you definitely don't want to use that PROD file share and delete files in your DEV/Test environment.

Using Pipeline Variables helps you to set different values to different Stages.

This is extremely helpful because, even though you have to duplicate/triplicate variables, you won't need to worry about the incorrect value going to the wrong stage. Also, if the Scope is set to Release, the variable will apply to all stages.

So, it’s a win-win situation.

But! It's only valid for this specific Release Pipeline. If you have another Release and some variables are common, you have to re-do everything… all, over, again.

Send in the Variable Groups!

Variable Groups

The Variable Groups are containers for variables that can be used in multiple Releases and Pipelines. Think of them as a common class in your project that you can reference anywhere.

You can define the Groups and their variables in the Library. Inside a group, you can set all the variables you need, add more at any time, and assign the values right away.

Keep in mind that a group is meant to be fairly static; it's not supposed to change often.

If you change a variable value or add a new one, it will not be picked up by already-created releases. If anything changes here, you will need to create new releases (not new pipelines) and redeploy them. When you create a release, it takes a snapshot of the values and uses them as they are, hence the need to create a new one to get the new values.

After linking the group to the Release, you will see that you can also set a Scope. This works exactly like the pipeline variables: they will only be used in that specific Stage and nowhere else.

Also, when expanded, you can see the values that are set for that group.

Now, how does the Token Replacement task work with this?

Replace Tokens

This task, our savior (yes, I like it very very much), comes to our rescue once again.

I’ve explained before how to use it and how it works.

But for this post, I'll explain it again. The task searches the folders/files you've defined and tries to match the token pattern you set in its definition against the contents of the file(s). When a token is found, it uses a string.Replace to inject the value into the file.

It will scour the Variables for a match and take the value to insert in the file.
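
If it helps to picture it, this is the concept in a few lines of PowerShell. It is not the extension's actual code; the variable names and file path are only examples.

# Conceptual sketch of token replacement: swap __TokenName__ for the matching variable value.
$variables = @{ 'arm_serviceBus_connectionString' = 'Endpoint=sb://my-namespace/...'; 'Environment' = 'TEST' }
$content   = Get-Content -Raw -Path '.\LogicApp.parameters.json'
foreach ($name in $variables.Keys) {
    $content = $content.Replace("__${name}__", $variables[$name])
}
Set-Content -Path '.\LogicApp.parameters.json' -Value $content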

But how does this link with the Variable Groups?

Well, at runtime, DevOps does a magical thing and sees the groups you’ve defined for a Stage as variables. So technically, it’s as if you’ve defined all the variables in one place and not in groups.

Pretty sweet, right?

So, Replace Tokens will use all those variables and try to replace them in your files. You don't have to reference the group or anything; it just sees the whole picture.

Hope this helps you with your automations and deployments.

Happy coding!

Logic Apps: CI/CD Part 3- Building your Azure Pipeline

In the previous posts of this series, we talked about how to build and prepare your Logic App for CI/CD. In this last post, I'll show you how to build your Azure Pipeline and make it ready for any environment you need.

If you’ve missed the other posts, here are the links for them:

Logic Apps: CI/CD Part 1- Building your Logic App

Logic Apps: CI/CD Part 2- Preparing for CI/CD

The Pipeline

Assuming you already have your repo configured, building the pipeline is fairly simple and quick. I'm not a big fan of using YAML; I find it easier to use the classic editor, as having the GUI is more appealing to me.

Having your repo in place and all the code ready, you need to create the Pipeline.

As such, you need to choose the classic editor (or venture into YAML) and select your repo and branch.

The available templates are helpful, but if you're just trying to deploy Logic Apps, I'd suggest you start with an empty job; otherwise, you might end up with tasks that are not necessary and have to delete them.

The first thing we're going to do is configure the pipeline for continuous integration. It doesn't take much to achieve this; you just need to activate the needed triggers. By default, it will filter to your main branch, but you can change this and trigger on specific projects and branches. This comes in handy when you have multiple projects and only want to include some in the build.

After enabling the triggers, you'll need to add the required tasks to get your pipeline going. You might be keeping a few secrets in Key Vault; if that's the case, do remember to add the Azure Key Vault task. It will pull either all the secrets or only the filtered ones you've selected, keeping them available for the rest of the pipeline execution. They will be used by the Replace Tokens task, which I'll discuss a bit down the road.
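
In essence, the task does something equivalent to the sketch below: it reads the secrets and exposes them as secret pipeline variables. The vault and secret names here are examples, not the task's internals.

# Pull a secret and expose it as a secret variable for later tasks (names are examples).
$sbConnection = Get-AzKeyVaultSecret -VaultName 'kv-integration-ci' -Name 'servicebus-connectionstring' -AsPlainText
Write-Host "##vso[task.setvariable variable=servicebus-connectionstring;issecret=true]$sbConnection"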

As you can see, it doesn’t take many tasks to have a functional pipeline, ready to deploy your Logic App to the CI environment.

The required tasks are:

  • Visual Studio build – to build your solution, obviously
  • Copy files – which will copy the desired files over to a folder in the Drop
  • Publish build artifacts – makes the drop available to use in the pipeline and the release
  • Replace Tokens – a very handy tool that allows you to replace your tokens with the variables or group variables values
  • ARM template deployment – to deploy the Logic App template to your resource group

The Copy files task is very simple and easy to use. You take the input folder and copy the files you want/need to the target folder. Easy-peasy-lemon-squeezy.

I'd advise you to set the Target Folder to a named one; when you're building the Release, it will be easier to find what you need if you divide your assets by name.

After copying the files, we will replace the tokens. How does this work?

Simply put, the task collects all the variables in memory and searches for the token pattern in all the target files. Given that we wrote our parameters with the __ … __ token, other tokens used in the files should not be affected. This is by far, in my opinion, the most helpful task in multi-environment deployment. It removes the need to have multiple files per environment and tons of variables.

With the files copied and the tokens replaced, our Logic App is ready for deployment to the CI environment. Now, this is not mandatory: you might not want to deploy your LA from the pipeline and prefer to use the Release instead. That is fine; you just need to move the ARM deployment tasks to the Release, and it will not affect the outcome nor the pipeline.

As you can see, after selecting the Azure details (Subscription, RG, Location, etc) it becomes easy to select your LA to deploy. Since we used the LogicApps folder, we just need to reference the JSON files and the task will pick them up from the drop folder and deploy them.
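
For reference, the ARM template deployment task is doing essentially what this PowerShell sketch does; the resource group name and file paths are illustrative.

# Deploy the Logic App ARM template together with its (token-replaced) parameters file.
New-AzResourceGroupDeployment `
    -ResourceGroupName 'RG-Integration-CI' `
    -TemplateFile '.\drop\LogicApps\LogicApp.json' `
    -TemplateParameterFile '.\drop\LogicApps\LogicApp.parameters.json'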

Final notes

You're now ready to go on your adventures and build your Logic Apps, get them ready for Continuous Integration, and deploy them. I didn't cover the Release Pipeline because it's also very simple. You only need to create your variables, replace your tokens, and deploy the ARM templates.

You can fiddle around with gates, automated deployments, pre-deployment approvals and all, but that is a more advanced feature.

If you have multiple releases that you want to deploy jointly, you can even build Orchestrations (I can hear all the BizTalk bells ringing in our heads). This is not as simple as isolated deployments, because it does involve some orchestration of the parts (well, duhh).

I hope this small series of posts helped you to solve issues and to improve your deployments.

And, as always, Happy coding!

Logic Apps: CI/CD Part 2- Preparing for CI/CD

In the last post we talked about building a Logic App from scratch and gave a few hints on what we would change to prepare for CI/CD.

In this post, we will show you how to prepare your Logic App and template files and how to set and rename your parameters, and we will hint at how this correlates with the Azure Pipeline.

So let's recap. We saw that the requirements are having VS installed, the Azure SDK, the Logic Apps Tools for Visual Studio extension, and an active Azure subscription. We built a new Azure Resource Group project with the Logic App template and added a few actions to our LA; nothing too fancy, just enough to show what's needed.

Now, let’s look at how we will change the code to get it ready.

Changing the JSON code to prepare it for CI/CD is simple but requires attention, because if it's not done properly, you won't be able to deploy your template, and it might take you a while to find where the problem is. Even though VS gives you a few hints, because IntelliSense helps, it might still not explain why it's failing.

The first thing I like to do is rename the connection parameters; having "servicebus_1_connectionString" is just horrible and does not help you understand what kind of connection you have. In this case, because we only have one connection, I'll rename it to "arm_serviceBus_connectionString", because we're using an ARM (Azure Resource Manager) template and because that is the type of parameter. I will also add a template variable named "SingleQuote", which will be, as you might have guessed, a single quote mark.

So, we end up with the renamed parameter and the new "SingleQuote" variable in the template.

Notice that I’ve also added the initial state control, as you may remember from a previous post about this. You can check it here: https://blog.sandro-pereira.com/2020/12/29/controlling-the-initial-state-of-a-logic-app/

If you have other connectors, I suggest you continue changing names to match the same naming convention. It will help you and others to know what that is supposed to be.

After the Logic App file is taken care of, you will also need to apply these changes in the Parameters file.

By default, it will be almost empty, containing just the logicAppName parameter with a null value. This will make your deployment fail because the template isn't valid.

In fact, you won’t even be able to deploy it, because VS is smart enough to prompt you for the missing values, taking the default ones from the LogicApp.

At this point, we're no longer dealing with the definition; we're dealing with the values we want the Logic App parameters to have. So, "type" and "defaultValue" no longer apply; you should use "value" directly or, if you're dealing with Key Vault secrets, you can just reference the KV and the secret name.

In this example, I’m setting the SB connection string both ways, to show how it can be done.

If you've done everything right, your Logic App should deploy without any fuss.

Now comes the fun part: dealing with the Parameters Template file. It is incredibly difficult to do this, and it's going to take several hours. So grab that coffee and get comfortable.

You will need to change your values to a token and an identifier, to later use in the Pipeline and releases.

Wow, that took us… 30 seconds, maybe. I’m exhausted and I need a break. You can even get that KV value with the token, you just need to change the identifier to the KV secret name.

We’re sweating over here with all this work.

In the next blog post, we will build the Pipeline and give the hints for the Release as well.

Happy coding!

Delete discarded messages from local folders using PowerShell

This is such a common task on BizTalk Server that I've lost count of how many times I've done it. In several scenarios, such as:

  • Testing
  • Certain parts of the application not yet being ready for production
  • Or simply discarding unwanted messages

we want or need to create a send port that subscribes to specific messages so they get discarded to a folder. Otherwise, they will get stuck in the Administration Console, and we don't want that.

After a while, the problem is that the folder accumulates a considerable number of messages, and writing a large number of files to disk gets progressively slower as the number of files in the target directory grows. This is because the operating system must keep track of all the files in a directory. Even bulk deleting all of these files will take longer. Moving or deleting files from the target directory on a regular basis ensures that performance is not adversely affected.

A large number of small files has more impact than a small number of large files, and most of the time, BizTalk Server consumes/produces small messages. However, at some point, you may completely fill the hard drive, which is even more critical.

With this script, you can easily configure the folders and the type of files you want to monitor and delete.

 
Get-ChildItem -Path C:\Temp -Include *.* -File -Recurse | ForEach-Object { $_.Delete() }
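
If you need something a bit more configurable than the one-liner, the sketch below shows the same idea with several folders/patterns and an optional age filter. The paths, patterns, and the 7-day threshold are just examples; adjust them to your environment before scheduling it.

# Delete discarded message files from a configurable list of folders, keeping the most recent ones.
$targets = @(
    @{ Path = 'C:\Temp\DiscardedMessages'; Include = '*.xml' },
    @{ Path = 'D:\BizTalk\ErrorArchive';   Include = '*.*'   }
)
foreach ($target in $targets) {
    Get-ChildItem -Path $target.Path -Include $target.Include -File -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
        ForEach-Object { $_.Delete() }
}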

This will help BizTalk administrators take full control of their environments.

Download

THIS POWERSHELL SCRIPT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.

You can download the Delete discarded messages from local folders PowerShell script from GitHub here:

Find Orphaned Azure API Connectors in all Resource Groups with PowerShell

Recently I wrote my version of a script that Mike Stephenson initially created: Find Orphaned Azure API Connectors with PowerShell. This PowerShell script will look at all of the API Connections in a specific resource group and then inspect every Logic App in your resource group to check if the API Connections are being used or not. The goal of this script, of course, is to identify orphaned API Connections in a single Resource Group quickly and effectively.

I modified the original script to produce a better output, or at least a different output that works better for my needs: it automatically adds a Deprecated tag to all the API Connectors with the value True or False, and it adds extra capabilities for generating the output report in CSV format.

The only limitation of that script is that it only checks a specific Resource Group. So, if you have 3 or 4 Resource Groups, you need to configure and run it 3 or 4 times.

To streamline this process and not waste so much time, I decided to create a new version of this script. This new script will look at all the API Connections in all resource groups of a single Azure Subscription and then inspect every Logic App in each Resource Group (RG) to check whether the API Connections of that RG are being used or not.

What's new in this PowerShell script:

  • It checks, across all Resource Groups available in a single Subscription, whether API Connections are being used or not.
  • The Subscription details output is improved and colored for better readability.
  • The list of available API Connectors, grouped by Resource Group, is improved and colored for better readability.
  • The list of Logic App and API Connector associations, grouped by Resource Group and Logic App, is improved and colored for better readability.
  • The list of orphaned API Connectors, ordered by Resource Group, is improved and colored for better readability.
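
The heart of the new script can be sketched with the Az cmdlets as shown below; the tag value and CSV path are illustrative, and the real script adds the colored, grouped output described above.

# Collect every API Connection referenced by any Logic App in the subscription...
$used = @()
foreach ($rg in Get-AzResourceGroup) {
    $logicApps = Get-AzResource -ResourceGroupName $rg.ResourceGroupName -ResourceType 'Microsoft.Logic/workflows' -ExpandProperties
    foreach ($la in $logicApps) {
        $connections = $la.Properties.parameters.'$connections'.value
        if ($connections) {
            $used += $connections.PSObject.Properties | ForEach-Object { $_.Value.connectionId }
        }
    }
}

# ...then flag the API Connections that nobody references, tag them, and export a CSV report.
# (The full script also tags connections that are in use with Deprecated = 'False'.)
$orphaned = Get-AzResource -ResourceType 'Microsoft.Web/connections' | Where-Object { $used -notcontains $_.ResourceId }
$orphaned | ForEach-Object { Update-AzTag -ResourceId $_.ResourceId -Tag @{ Deprecated = 'True' } -Operation Merge }
$orphaned | Select-Object Name, ResourceGroupName, ResourceId | Export-Csv -Path '.\OrphanedApiConnections.csv' -NoTypeInformation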

Download

THIS POWERSHELL SCRIPT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.

You can download Find Orphaned API Connectors in all Resource Groups from GitHub here:

BizTalk Server: Automation Deployment with Azure DevOps – Deploying the Project

Following Sandro's last post on BizTalk Server: Automation Deployment with Azure DevOps – Create a build agent, we're going to show how to create the deployment steps by building the Pipeline and Release Pipeline, using a few DevOps tasks.

The standard BizTalk Deployment task does a decent job of deploying the application, but it doesn't handle replacing tokens or registering DLLs in the GAC.

To deploy to multiple machines or to change your Bindings according to your environment, you have to make your bindings file dynamic. This means replacing your connections with variables.

Let's start with the basics:

Creating the project and installing it in DEV

As always, it's better to first create the DevOps repository and clone it to your machine.

Having this created, you need to get your project working and have a Deployment Project as well. This will contain pointers to the needed DLLs and binding files of your BTS project. It will also contain the Application name to be deployed and some other configurations.

You will see that you can set the BizTalk Assemblies path as well as other Assemblies, Pre- and Post-processing scripts, and the Deployment Sequence. This is one of the most important steps because, as you know, the order in which you deploy your BT Assemblies matters.

When referencing your BT projects, do make sure that the Application Project is using the same framework version as your other projects. If it’s not the same version, it will not be able to copy the DLLs to the referenced Path and will not build successfully.

Building this project will generate a ZIP file that contains all that is needed. You can try to publish it directly, after configuring the application.

The bindings file that is created with the project is just an empty template, so you'll want to deploy your application in your Dev environment and create those bindings. It makes a difference whether you export your application bindings while it is started or stopped, so keep that in mind.

For this example, I’m going to export the bindings with the Application fully stopped.
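
As a side note, if you prefer to script this step instead of using the Admin Console, the export can also be done from the command line with BTSTask; the application name and paths below are examples, and the BTSTask location depends on your BizTalk installation folder.

# Export the application bindings while the application is stopped (names and paths are examples).
& 'C:\Program Files (x86)\Microsoft BizTalk Server\BTSTask.exe' ExportBindings `
    /ApplicationName:"MyBizTalkApplication" `
    /Destination:"C:\Deployment\MyBizTalkApplication.BindingInfo.xml"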

A standard bindings export will carry the ports and URIs/connections straight from the Admin Console. Through a little magic, we will make these values dynamic, and it's super easy.

Making your Bindings dynamic for deployment

Now that you've exported the bindings, you want to make them ready for DevOps and able to accept multiple configurations.

In my example, you can see that the ReceiveLocation and ReceivePort names are static. If we tokenize them, you can call them whatever you want, thereby reducing the risk of colliding with other existing ports in your end systems.

So, keeping in mind the desired token, I'm going to replace these values, ReceiveLocation address included, with a variable and token identifier, so that the bindings file carries tokens instead of fixed names and addresses.

And that's it. Of course, this is a very small and simple example, but even with a goliath project, the pattern stays the same: find what you want to make dynamic, tokenize it, save, and push your changes to your repo.

Building your Pipeline and Release Pipeline

Now you have your source code in your Repository, your bindings ready for dynamic changes and you want to deploy it.

You will need to set up your build Pipeline before you can get your Release ready, so get to work.

The Pipeline itself doesn't need to be too complicated: you just need to build your solution, with or without the OutPath argument (I found that setting this made my life easier in some projects), and publish the drop.

With your drop created, your Release pipeline needs the following tasks:

  • Extract Files – to unzip your file
  • Replace Tokens (a great extension by Guillaume Rouchon, more info here)
  • Archive Files – to zip it back
  • BizTalk Server Application Deployment – I recommend this, but you can do it with PowerShell

Extracting your file contents is straightforward: you just need to select your zip in your drop contents and a destination folder. Keep in mind that you will need to know where it lands, in order to zip it back.

Replacing the tokens works just as before: you select the *.XML mask or point directly to your bindings file and set the token pattern it should be looking for. Remember that the variables you define are case-sensitive. You can also use a Variable Group; it is a great way of managing the environment-specific variables or common variables that you might have.

Once this is done, you can proceed to recreate the zip file and its contents. The destination folder you selected when unzipping will now be the root folder you point to.

Remember to untick the "Prepend root folder name to archive paths" option. If you keep it selected, your file will end up with a structure like "Zip/bindings" instead of just "bindings", and the deployment will fail because it's not the expected folder structure. Also, tick the "Replace existing archive" option; otherwise, you will create a copy and deploy the original version instead.

And for the final step, the Deployment Task. I chose to use the standard task instead of PowerShell, because I didn’t want to handle scripts at this point.

Select the Zip package and set the operation to Create. From what I’ve found out, this will Upsert your application, while Update will not create the app if it doesn’t exist.

And this is what you need. If you’ve set everything properly, your Release Pipeline will deploy your Application to your Server and get it up and running, with the parameters you’ve set in your bindings file.

It took a while to understand how this process works, but in the end, it turned out to be very simple: all it took was applying the same concept we already used with the ARM deployment of Azure resources.

Happy coding!
