Today, we are going over another real scenario, this time from one of our PowerBI Robots clients. For those unfamiliar with it, PowerBI Robots is part of DevScope’s suite of products for Microsoft Power BI. It automatically takes high-resolution screenshots of your reports and dashboards and sends them anywhere to an unlimited number of recipients (any users and any devices), regardless of whether they are in your organization or even have a Power BI account.
Challenge
The COVID-19 pandemic made remote work widespread, and one of our PowerBI Robots clients asked us for a way to start receiving high-resolution screenshots of their reports and dashboards. On top of the devices at the client’s facilities (mainly TVs), these screenshots should also be available on a Microsoft Teams channel where they could be seen by all users with access to it. PowerBI Robots allows users to “share” high-resolution screenshots of Power BI reports and dashboards in many ways, but it didn’t have this capability out-of-the-box, so we proactively introduced it using Azure Integration Services.
This proof-of-concept will explain how you can extend the product’s features by making use of PowerBI Robots’ out-of-the-box ability to send a JSON message to an HTTP endpoint and then using Azure Integration Services such as Azure Blob Storage, Azure File Storage, and Logic Apps, or even Power Platform features like Power Automate, to share these report or dashboard images on platforms like Teams, SharePoint, or virtually anywhere.
Create Blob Storage
In theory, we could send an image in base64 directly to Teams, but the problem is that messages on Teams have a size limit of approximately 28KB. This encompasses all HTML elements such as text, images, links, tables, mentions, and so on. If the message exceeds 28KB, the action will fail with an error stating: “Request Entity too large”.
To bypass this limitation, we have to use an additional Azure component to store the Power BI report images provided by PowerBI Robots. To do that, we can choose from resources such as:
Azure Blob Storage: Azure Blob storage is a feature of Microsoft Azure. It allows users to store large amounts of unstructured data on Microsoft’s data storage platform. In this case, Blob stands for Binary Large Object, which includes objects such as images and multimedia files.
Azure File Storage: Azure Files is an Azure service you can use to create file shares in the cloud. It is based on the Server Message Block (SMB) protocol and enables you to access files remotely or on-premises via API through encrypted communications.
Or even a SharePoint library, where you can store images and many other types of files.
For this POC, we chose Blob Storage for its simplicity and low cost.
To start, let’s explain the structure of Azure Blob storage. It has three types of resources:
The storage account
A container in the storage account
A blob
If you don’t have a Storage Account yet, the first step is to create one, and for that, you need to:
From the Azure portal menu or the Home page, select Create a resource.
On the Create a resource page, type Storage account in the search box, select Storage account from the list, and click Create.
On the Create a storage account Basics page, you should provide the essential information for your storage account. After you complete the Basics tab, you can choose to further customize your new storage account by setting options on the other tabs, or you can select Review + create to accept the default options and proceed to validate and create the account:
Project details
Subscription: Select the subscription under which the new storage account will be created.
Resource Group: Select an existing Resource Group or create a new one in which your storage account will be created.
Instance details
Storage account name: Choose a unique name for your storage account.
Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
Region: Choose a region near you or near other services your functions access.
Note: Not all regions are supported for all types of storage accounts or redundancy configurations.
Performance: Select Standard or Premium.
Select Standard for general-purpose v2 storage accounts (the default). Microsoft recommends this account type for most scenarios.
Select Premium for scenarios that require low latency.
Redundancy: Select your desired redundancy configuration.
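If you prefer scripting to clicking through the portal, the same storage account can be created with the Az PowerShell module. A minimal sketch, where the resource group name (poc-rg) and region are illustrative assumptions:

```powershell
# A minimal sketch: create the storage account from PowerShell instead of the portal.
Connect-AzAccount

$params = @{
    ResourceGroupName = 'poc-rg'               # assumed resource group
    Name              = 'dvspocproductsstracc' # 3-24 chars, numbers and lowercase letters only
    Location          = 'westeurope'           # choose a region near you
    SkuName           = 'Standard_LRS'         # redundancy configuration
    Kind              = 'StorageV2'            # general-purpose v2 (Standard performance)
}
New-AzStorageAccount @params
```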
Now that we have the storage account created, we need to create our blob container. To do that:
In the left menu for the storage account, scroll to the Data storage section, then select Containers.
On the Containers page, click the + Container button.
From the New Container window:
Enter a name for your new container. You can use numbers, lowercase letters, and dash (-) characters.
Set the public access level to Blob (anonymous read access for blobs only).
With this level, blobs within the container can be read by anonymous request, but container data is not available, and anonymous clients cannot enumerate the blobs within the container.
Click Create to create the container.
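Alternatively, the container can also be created from PowerShell. A minimal sketch, reusing the storage account created above (the resource group name is an assumption):

```powershell
# Get the account key and build a storage context.
$key = (Get-AzStorageAccountKey -ResourceGroupName 'poc-rg' -Name 'dvspocproductsstracc')[0].Value
$ctx = New-AzStorageContext -StorageAccountName 'dvspocproductsstracc' -StorageAccountKey $key

# 'Blob' sets anonymous read access for blobs only (containers cannot be enumerated).
New-AzStorageContainer -Name 'robots-reports' -Permission Blob -Context $ctx
```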
Create a Logic App
PowerBI Robots is capable of sending a JSON request with all the information regarding a configured playlist:
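Purely as an illustration of the shape such a request might take — the field names below are hypothetical, not PowerBI Robots’ actual contract, so always use the real payload from the product when generating the trigger schema later on — a test call to the Logic App endpoint could look like this:

```powershell
# Hypothetical test request to the Logic App HTTP endpoint.
# All field names are illustrative assumptions.
$payload = @{
    PlaylistName = 'Sales Dashboards'                  # assumed field
    Reports      = @(
        @{
            ReportName = 'Monthly Sales'               # assumed field
            Content    = '<base64-encoded screenshot>' # assumed field
        }
    )
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri 'https://<your-logic-app-trigger-url>' `
    -ContentType 'application/json' `
    -Body $payload
```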
To receive and process requests from PowerBI Robots, we decided to create a Logic App, a cloud-based platform for creating and running automated workflows that integrate your apps, data, services, and systems. To simplify the solution, we will also use the Azure Portal to create the Logic App.
From the Azure portal menu or the Home page, select Create a resource.
On the Create a resource page, select Integration > Logic App.
On the Create Logic App Basics page, use the following Logic App settings:
Subscription: Select the subscription under which this new Logic App is created.
Resource Group: Select an existing Resource Group or create a new one in which your Logic app will be created.
Type: The logic app resource type and billing model for your resource. In this case, we will be using Consumption.
Consumption: This logic app resource type runs in global, multi-tenant Azure Logic Apps and uses the Consumption billing model.
Standard: This logic app resource type runs in single-tenant Azure Logic Apps and uses the Standard billing model.
Logic App name: Your Logic App resource name. The name must be unique across regions.
Region: The Azure datacenter region where your app’s information is stored. Choose a region near you or near other services your Logic App accesses.
Enable log analytics: Change this option only when you want to enable diagnostic logging. The default value is No.
When you’re ready, select Review + Create. Then, on the validation page, confirm the details you provided, and select Create.
After Azure successfully deploys your app, select Go to resource. Or, find and choose your Logic App resource by typing the name in the Azure search box.
Under Templates, select Blank Logic App. After selecting the template, the designer now shows an empty workflow surface.
In the workflow designer, under the search box, select Built-In. Then, from the Triggers list, select the Request trigger, When a HTTP request is received.
For us to tokenize the values of the message we receive from PowerBI Robots, we can, on the Request trigger, click Use sample payload to generate schema.
Copy the JSON message provided earlier into the Enter or paste a sample JSON payload window, and then click Done.
Under the Request trigger, select New step. In the search box, enter Variables, select Variables from the result panel, choose the Initialize variable action, and provide the following information:
Name: varDateTime
Type: String
Value: Select Expression and add the following expression: formatDateTime(utcNow(), 'yyyy-MM-dd HH:mm')
Note: this variable will be used later in the business process to present the date in a clear format in the message sent to the Teams channel.
Select New step again. In the search box, enter Variables, select Variables from the result panel, choose the Initialize variable action, and provide the following information:
Name: varHTMLBody
Type: String
Value: (Empty)
Note: this variable will be used later in the business process to dynamically generate the message to be sent to the Teams channel in an HTML format.
Select New step. In the search box, enter Blob, select Azure Blob Storage from the result panel, and choose the Create blob (V2) action.
If you don’t have a connection yet, you first need to create one by setting the following configurations and then clicking Create:
Connection name: A display name for the connection.
Authentication type: The connector supports a variety of authentication types. In this POC, we will be using Access Key.
Azure Storage Account name: The name of the storage account we created above. We will be using dvspocproductsstracc.
Azure Storage Account Access Key: Specify a valid primary/secondary storage account access key. You can get these values on the Access keys option under the Security + networking section on your storage account.
Then provide the following information:
Storage account name: Select the storage account from the dropdown list. The default should be Use connection settings (dvspocproductsstracc).
Folder path: Navigate to the folder /robots-reports.
Blob name: Dynamically set the name of the file to be created. To avoid overlaps, we decided to use the unique workflow ID of the message as part of the name, together with the report name we receive in the source message:
Blob content: The Base64 content we receive in the source message.
Note: because these fields can occur multiple times inside the source message, setting the name or the content on the Create blob action will automatically wrap it in a For each loop in our business flow. This is correct and what we want.
Select New step. In the search box, enter Variables, select Variables from the result panel, choose the Set variable action, and set the varHTMLBody variable to the HTML message to be posted to Teams, embedding the image stored in the blob container.
And finally, select New step. In the search box, enter Teams, select Microsoft Teams from the result panel, choose the Post message in a chat or channel action, and provide the following information:
Post as: Select User
Post in: Select Channel
Team: Select the Team, in our case PowerBI Robots Webhooks
Channel: Select the Team channel, in our case General
Message: Place the message we created above by using the varHTMLBody variable.
Note: if you haven’t created a Teams connector yet, you need to sign in using the account that will be posting these notifications.
As a result, once we receive a new request from PowerBI Robots, a fancy message will appear on Teams with a thumbnail of the report:
You can click on it and see it in full size:
More about PowerBI Robots
PowerBI Robots automatically takes screenshots of your Microsoft Power BI dashboards and reports and sends them anywhere, to an unlimited number of recipients. Simply tell PowerBI when and where you want your BI data, and it will take care of delivering it on time.
Do you find it difficult to keep up to date on all the frequent updates and announcements in the Microsoft Integration platform and Azure iPaaS?
Integration weekly updates can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.
It’s been some time since I created the BizTalk Server SSO Application Configuration tool. The tool is available for several versions of BizTalk Server. It provides the ability to add and manage applications, add and manage key-value pairs in the SSO database, and import and export configuration applications to be deployed to different environments.
However, although I love this tool, it has a significant limitation: it is a Windows application. So, most of the time, we need remote access to the BizTalk Server machines to access the tool and read or change these values inside the SSO Applications.
To bypass this limitation, we created a Web version of this tool. It has almost the same features as the traditional Windows tool:
You can securely export and import Application configurations and it is compatible with MSFT SSO snap-in;
You can duplicate Applications (copy and paste);
You can rename Applications;
You can easily add new key-values;
You can edit key-values;
Other versions
This tool is also available as a Windows application for the following BizTalk Server versions:
While organizing the vast resources on my hard drive, I recently found, polished, and improved two SQL Server queries that allow us to check which users and groups have access to BAM resources.
These are simple SQL queries, but they are essential for keeping your environment under control and compliant with security and privacy standards.
Generally, BizTalk Server is compliant with privacy standards like GDPR or FIPS. BizTalk Server is a messaging broker that doesn’t capture or store any data on its system other than for the time needed to complete business processes and connect and route messages to their target systems. However, because you can process messages and/or communicate with systems that contain sensitive (personal) data, you must follow some good practices in BizTalk Server applications to comply with privacy standards.
Business Activity Monitoring (BAM) is a collection of tools that allow you to manage aggregations, alerts, and profiles to monitor relevant business metrics (called Key Performance Indicators, or KPIs). It gives you end-to-end visibility into your business processes, providing accurate information about the status and results of various operations, processes, and transactions so you can address problem areas and resolve issues within your business. But it is also a component that can capture data from the messages passing through the systems, and some of that data can be sensitive – it shouldn’t happen, but it can.
So, it is always good in terms of security, control, documentation, and in some cases, privacy to know which users can access BAM data.
BizTalk Server: SQL Query to list all Users with access to BAMPrimaryImport database
This is a simple SQL Server query that provides a list of all users that have access to the BAMPrimaryImport database.
THIS SQL SCRIPT IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
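As a flavor of the kind of query involved, here is a minimal sketch that lists database principals and their role membership in BAMPrimaryImport — the server name is a placeholder, and the SqlServer PowerShell module is assumed:

```powershell
# A minimal sketch: list users/groups with access to BAMPrimaryImport
# and the database roles they belong to.
Invoke-Sqlcmd -ServerInstance 'BTS-SQL01' -Database 'BAMPrimaryImport' -Query @'
SELECT  dp.name      AS [UserOrGroup],
        dp.type_desc AS [PrincipalType],
        rp.name      AS [DatabaseRole]
FROM    sys.database_principals dp
LEFT JOIN sys.database_role_members drm ON drm.member_principal_id = dp.principal_id
LEFT JOIN sys.database_principals rp    ON rp.principal_id = drm.role_principal_id
WHERE   dp.type IN ('S', 'U', 'G')   -- SQL users, Windows users, Windows groups
ORDER BY dp.name;
'@
```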
Monitoring a BizTalk Server environment can sometimes be a complex task due to the infrastructure layers and complexity behind BizTalk Server. Apart from that, the administration teams need to monitor all the applications deployed to the environment.
Ideally, the administration team should use all monitoring tools at their disposal, whether included with the product, such as the BizTalk Server Administration console, Event Viewer, HAT, or BAM. But the main problem with these tools is that:
They require manual intervention.
Almost all of them require remote access to the environment.
When an administrator must manually check each server or application for events that may have occurred, that is neither an efficient nor an effective way to allocate the team’s time or to monitor the environment.
Of course, they can also use other monitoring tools from Microsoft, such as Microsoft System Center Operations Manager (SCOM), or third-party monitoring solutions such as BizTalk360. These tools should be able to read events from all layers of the infrastructure and help the administration team take preventive measures, notifying them when a particular incident is about to happen, for example, when the free space of a hard drive is below 10%. Furthermore, they should allow the automation of operations when a specific event occurs, for example, restarting a service when the amount of memory it uses exceeds 200MB, thereby preventing incidents or failures without requiring human intervention.
But the question is: what if you don’t have these tools?
You can achieve these tasks in several ways. Many people create custom web portals to emulate some of the most basic tasks of the admin console. One of my favorite options is using a mix of PowerShell, scheduled tasks, and/or Azure services like Logic Apps and Functions. But today I will show you a different, alternative way:
Create a Windows Service to monitor suspended Instances and automatically terminate them
Note: of course, this solution can be expanded to other scenarios or extended with new functionalities.
BizTalk Monitor Suspend Instance Terminator Service
This is a Windows Service that continually monitors BizTalk Server for specific suspended messages (at an interval of x seconds/minutes/hours defined in code) and terminates them automatically.
This tool allows you to configure:
The type of suspended messages you want to terminate
Whether to terminate without saving the messages or to save them to a specific folder before terminating them.
These configurations are made in the app.config of the service:
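The service itself is a compiled Windows Service, but the core idea is easy to prototype. A minimal PowerShell sketch of the same loop, using BizTalk’s WMI provider (the filter and interval here are illustrative; the real service makes them configurable):

```powershell
# A minimal sketch of the service's core loop: find suspended instances and terminate them.
# ServiceStatus 4 = suspended (resumable), 32 = suspended (not resumable).
while ($true) {
    $suspended = Get-WmiObject -Namespace 'root\MicrosoftBizTalkServer' `
                               -Class MSBTS_ServiceInstance `
                               -Filter 'ServiceStatus = 4 OR ServiceStatus = 32'

    foreach ($instance in $suspended) {
        # The real service can optionally save the messages to a folder first.
        $instance.Terminate() | Out-Null
    }

    Start-Sleep -Seconds 60   # interval; defined in code in the real service
}
```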
I recently brought some old BizTalk Server resources back to life, like the BizTalk Server WCF-Loopback Adapter or File-Z Adapter. And I have been working on several more resources. So today, it is a pleasure to bring back to life again an old tool created by my friend Thiago Almeida (LinkedIn, Twitter) back in the days when he was a BizTalk Server developer:
Get Tracked Message tool
This tool allows you to programmatically extract a message body from the BizTalk tracking database in three possible ways, as Thiago Almeida mentioned in his original blog post:
Operations DLL: this method uses the Microsoft.BizTalk.Operations assembly. This is pretty straightforward: you add a reference to Microsoft.BizTalk.Operations.dll and use the GetTrackedMessage method of the BizTalkOperations class. You can also get to the message context using this method. This method is only available for BizTalk Server 2006 and later.
SQL: this method uses the bts_GetTrackedMessageParts stored procedure inside the tracking database, which expects the message GUID and returns the compressed message data. We can then use reflection to invoke the Decompress method of the Microsoft.BizTalk.Message.Interop.CompressionStreams class inside Microsoft.BizTalk.Pipeline.dll to decompress the data returned from SQL.
And WMI: this method uses the WMI MSBTS_TrackedMessageInstance.SaveToFile method to save the instance to disk. This was the popular method in BizTalk Server 2004, since there was no operations DLL back then.
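As a quick illustration of the WMI approach, a minimal PowerShell sketch (the message instance GUID and output folder are placeholders):

```powershell
# A minimal sketch of the WMI method: save a tracked message body to disk.
$messageId = '{00000000-0000-0000-0000-000000000000}'   # placeholder GUID from the Message Flow

$tracked = Get-WmiObject -Namespace 'root\MicrosoftBizTalkServer' `
    -Query "SELECT * FROM MSBTS_TrackedMessageInstance WHERE MessageInstanceID = '$messageId'"

# SaveToFile writes the tracked message (body and context) to the given folder.
$tracked.SaveToFile('C:\Temp')
```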
As some of you know, the body and context of messages in BizTalk are compressed, and rightfully so. However, the actual compression and decompression code is hidden inside an assembly called BTSDBAccessor.dll. This DLL, the BizTalk Database Accessor, is unmanaged and does a lot of work for BizTalk, including accessing the BizTalk databases to send and receive messages.
The application has only one form and expects the following parameters:
The message GUID of the message you want to extract.
You can get this value, for example, from the Message Flow.
The extraction type (Use Operations DLL, Use SQL, Use WMI)
Tracking DB server (the name of the SQL Server hosting the BizTalk tracking database)
Tracking DB name (the BizTalk Tracking database name)
Credits
Thiago Almeida | Linkedin | The original creator of this tool.
Diogo Formosinho | Linkedin | A member of my team who helped me migrate this tool and gave it a more modern look.
Download
THIS TOOL IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.
You can download the BizTalk Server GetTrackedMessage tool from GitHub here:
After I delivered 21 lectures last year, it is time to start the 2022 season in a “new place” (it is virtual): Chicago! The M365 Chicago Virtual Conference, brought to you by VEEAM, is a free online one-day event (in Microsoft Teams) on Friday, January 14, 2022. The event will run from 8:30 am to 4:30 pm Central Time.
Microsoft 365 specialists, Cloud IT administrators, Power Platform administrators, end-users, architects, developers, and other professionals that work with Microsoft Collaboration or Cloud Technologies will meet to share the latest information for working with anything and everything related to Microsoft 365 and Power Platform.
“M365 Below in Chicago!” is a community-led event dedicated to educating and engaging members of the technical community. The event draws upon the expertise of IT Professionals, Microsoft MVPs, Developers, Solution Architects, and other experts who come together to share their real-world experiences, lessons learned, best practices, and general knowledge with other like-minded individuals.
You will find sessions on different subjects like:
Teams
Power Platform
Employee Experience (Microsoft Viva)
SharePoint, OneDrive, Office, and Yammer
User Adoption & Productivity
M365 Security and Compliance
See the full event schedule here: Full schedule.
I chose to submit a session to this event, and I’m honored to have been accepted as a guest speaker with a session about Power Automation: A new set of Best practices, tips and tricks. My session will take place at 03:00 pm UTC in the Water Tower Power Platform room.
Power Automation: A new set of Best practices, tips and tricks
As I mentioned before, my session will be all about best practices and small tips and tricks that we can apply to our Power Automate flows. For that reason, I would like to invite you to join me at the M365 Below in Chicago! virtual event on Friday, January 14, 2022.
Session name: Power Automation: A new set of Best practices, tips and tricks
Abstract: A brand new set of tips, tricks, and best practices that you should know to be more productive and build more reliable and effective Power Automate flows. This is not an introduction session anymore. Instead, this session will go through a list of 10 new best practices, tips, and tricks addressing advanced topics like deployment, dynamic connector configurations, etc.
Join us and reserve your presence at the M365 Below in Chicago! virtual event on Friday, January 14, 2022. It is free!
2021 wasn’t the year we all desired it to be; the COVID-19 pandemic is still present, but overall it was a good year for my family and me. We managed to stay safe and well, which is the most important thing. Because we use our homes more and more, renovations continued during 2021 as we keep building our dream house:
It was the year we all got vaccinated, which allowed us more freedom, so we were able to safely visit Portugal (our country) and take a well-deserved vacation away from the hustle of the city and the most touristy places:
But without exaggeration! Homeworking, or working in all possible crazy situations and from literally everywhere, was still present:
And I decided not to do any in-person events. Instead, I did 21 online events and kept friends close by doing a lot of video calls!
My blog still has solid numbers and keeps growing every year…
My blog’s numbers remained solid in terms of visitors and new content, and 2021 was again a very productive year:
Published 77 new posts on my blog;
One more publisher: Pedro Almeida
More than 375,328 visits to my blog.
Coming from 210 countries (and more than 15,137 cities) across the world in the past year.
And the countries that most visited my blog are once again the United States, followed by India, the United Kingdom, and this time Australia.
Not bad compared to previous years:
2020: 392,535 visits, 214 countries, 92 new posts
2019: 431,000 visits, 207 countries, 43 new posts
2018: 246,381 visits, 194 countries, 70 new posts
2017: 210,000 visits, 167 countries, 63 new posts (migrated to a new blog)
2016: 318,576 visits, 190 countries, 50 new posts
2015: 350,000 visits, 184 countries, 79 new posts
2014: 310,000 visits, 183 countries, 52 new posts
2013: 200,000 visits, 176 countries, 79 new posts
2012: 170,000 visits, 171 countries, 102 new posts
2011: 91,000 visits, 61 new posts
I want to say thanks to all my readers. I appreciate all the visits to my blog, and thanks for your support. And a big thanks to my team at DevScope:
Attractions in 2021
These are the top 10 posts that got the most views in 2021:
2021 was also the year I finished my second book: Migrating to BizTalk Server 2020. This time with the help of good friends: Tom Canter and Lex Hegt.
The book is almost available! Unfortunately, we had some setbacks with book printing and shipping (Brexit), but I’m confident it will become available by the end of this month (January 2022).
Open Source Contributions…
I continued to improve my existing GitHub contributions and added new ones. Here are some samples:
BizTalk Server File-RADITZ Adapter: The File-RADITZ adapter is kind of the arch-enemy of the File-Z Adapter: this adapter doesn’t pick up or process empty (zero-byte) files.
BizTalk Server WCF-Loopback Adapter: The Loopback adapter is simply a two-way send adapter that, in its essence, returns a copy of the outbound message.
BizTalk Server SSO Application Configuration CLI: Unfortunately, there is no command-line tool that allows you to script the deployment of SSO Application Configurations or perform CI/CD through DevOps. This tool is designed to address that gap, allowing you to securely import Application configurations using this CLI application.
PowerShell Runbook: Find Broken Azure API Connections: This PowerShell Runbook will look at all of the API Connections in all resource groups present in a specific Azure Subscription and provide a list of all broken API Connections.
Function App: Find Broken Azure API Connections: This PowerShell Function App will look at all of the API Connections in all resource groups present in a specific Azure Subscription and provide a list of all broken API Connections.
API Connections Status Report: This PowerShell script will look at all of the API Connections in all resource groups present in a specific Azure Subscription and provide their current status.
Find Orphaned Azure API Connectors: This PowerShell script will look at all of the API Connections in all resource groups present in a specific Azure Subscription and then inspect every Logic App in your resource group to check if the API Connections are being used or not. The goal of this script, of course, is to identify orphaned API Connections in a single Resource Group quickly and effectively.
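As a flavor of what these scripts do, here is a minimal sketch of the broken-connection check — API connections are Microsoft.Web/connections resources, and the check below simply flags any whose reported status is not Connected (error handling omitted):

```powershell
# A minimal sketch: list API Connections whose status is not 'Connected'.
Connect-AzAccount

foreach ($connection in Get-AzResource -ResourceType 'Microsoft.Web/connections') {
    # Re-read the individual resource to get its runtime 'statuses' property.
    $details = Get-AzResource -ResourceId $connection.ResourceId
    $status  = $details.Properties.statuses[0].status

    if ($status -ne 'Connected') {
        [pscustomobject]@{
            Name          = $connection.Name
            ResourceGroup = $connection.ResourceGroupName
            Status        = $status
        }
    }
}
```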
And if you think I contributed only to my blog, you are very wrong. I made several other publications outside my blog:
Published 9 guest blog posts on Serverless360 about Azure Integration Services, on topics like Logic Apps, API Management, Azure App Configuration, and Power Automate.
However, that is not all! I was still able to deliver 21 virtual sessions at several conferences and user groups worldwide on topics like BizTalk Server, Logic Apps, and Power Automate:
Azure User Group Portugal | January 13, 2021 | Logic Apps: Development experiences
101 Talk Arena | January 14, 2021 | 101 Talk Arena with Sandro Pereira: What about integration now?
Microsoft Integrate Conference DACH 2021 | January 21, 2021 | Logic Apps: Anywhere, Everywhere
Azure Lowlands | January 29, 2021 | How to create robust monitor solutions with PowerShell, Azure Functions and Logic Apps
Global Automation Bootcamp 2021 | February 6, 2021 | Power Automation: Best practices, tips and tricks
Virtual Scottish Summit 2021 | February 19, 2021 | Power Automation: Best practices, tips and tricks
Global Power Platform Bootcamp 2021 – Münsterland | February 19, 2021 | Power Automate: Best practices, Tips and Tricks
Webinar: PowerTalk by Atea | February 22, 2021 | Power Automate: Best practices, Tips and Tricks
Global Integration Bootcamp 2021 – Virtual | February 25, 2021 | Logic App (Preview): The new kid on the block
Power Platform Virtual Conference | March 12, 2021 | Power Automate: Best practices, Tips and Tricks
Global Azure Lüdinghausen 2021 | April 16, 2021 | Logic App (Preview): The new kid on the block
Global Azure Portugal 2021 | April 16, 2021 | Logic App (Preview): The new kid on the block
DeveloperWeek Europe 2021 | April 26, 2021 | The most important Best practices you need to know to develop Azure Logic Apps
Microsoft 365 Virtual Marathon | April 26, 2021 | Logic Apps: Best practices, Tips and Tricks
Power Platform 24 | May 5, 2021 | How to create robust monitor solutions with PowerShell and Power Automate
Bizz Summit ES | June 4, 2021 | Power Automation: Best practices, tips and tricks
Dutch Microsoft Cloud Call | June 5, 2021 | Logic Apps: Best practices, Tips and Tricks
This is a topic I’ve been asked about a few times, making me wonder how hard it actually is. Working with this nearly every day makes us assume some things are very easy, but not everyone has this insight.
So, how exactly do we set variables for different environments, and how does it work when we want to replace tokens?
Variables for different environments
Having multiple environments creates the need to have different values assigned to your variables, because, for example, that Test Webservice won’t work in PROD and you definitely don’t want to use that PROD file share and delete files in your DEV/Test environment.
Using Pipeline Variables helps you set different values for different Stages.
This is extremely helpful because, even though you have to duplicate or triplicate variables, you won’t need to worry about the incorrect value going to the wrong stage. Also, if you set the Scope to Release, the variable will affect all stages.
So, it’s a win-win situation.
But! It’s only valid for this Release Pipeline in specific. If you have another Release and some variables are common, you have to re-do everything… all, over, again.
Send in the Variable Groups!
Variable Groups
The Variable Groups are containers for variables that can be used in multiple Releases and Pipelines. Think of it as a common class in your project that you can reference anywhere.
You can define the Groups and their variables in the Library. Inside the group, you can set all the variables you need, add more at any time, and assign the values right away.
Keep in mind that a variable group is meant to be static; it’s not supposed to change often.
If you change a variable value or add a new one, it will not be considered in the already created releases. If anything changes in here, you will need to create new releases (not the pipelines) and redeploy them. When you create the release, it takes a snapshot of the values and uses them as they are. Thus the need to create a new one to get those new values.
After linking the group to the Release, you will see that you can also set a Scope. This works exactly like the pipeline variables: they will only be used in that specific Stage and nowhere else.
Also, when expanded, you can see the values that are set for that group.
Now, how does the Token Replacement task work with this?
Replace Tokens
This task, our savior (yes, I like it very very much), comes to our rescue once again.
I’ve explained before how to use it and how it works.
But for this post, I’ll explain again. The task searches in the folders/files you’ve defined and tries to match the token you’re setting in the definition with the one in the file(s). When a token is found, it uses a string.Replace function to inject the values in the files.
It will scour the Variables for a match and take the value to insert in the file.
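Conceptually, you can reproduce what the task does with a few lines of PowerShell. A sketch assuming the common #{VariableName}# token pattern and that, as on a DevOps agent, the pipeline variables are exposed as environment variables:

```powershell
# A minimal sketch of what a token-replace step does.
$path    = '.\appsettings.json'   # illustrative target file
$content = Get-Content -Path $path -Raw

$content = [regex]::Replace($content, '#\{(\w+)\}#', {
    param($match)
    $name  = $match.Groups[1].Value
    $value = [Environment]::GetEnvironmentVariable($name)
    if ($null -ne $value) { $value } else { $match.Value }   # leave unmatched tokens as-is
})

Set-Content -Path $path -Value $content
```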
But how does this link with the Variable Groups?
Well, at runtime, DevOps does a magical thing and sees the groups you’ve defined for a Stage as variables. So technically, it’s as if you’ve defined all the variables in one place and not in groups.
Pretty sweet, right?
So, the Replace Tokens will use all those variables and will try to replace them in your files. You don’t have to define the group or anything, it will just see the whole picture.
Hope this helps you with your automations and deployments.
We wish you a Merry Christmas, we wish you a Merry Christmas, we wish you a Merry Christmas, and a Happy New Year!
We are entering the third pandemic year. We are in a better position, as vaccination is helping minimize the impact of this pandemic, but there is still a long way to go. This makes this holiday season even more important, because we never know what tomorrow will bring, so this Christmas I hope you all can spend time with your close family and friends in the safest way you can. And if you are like me, with three young kids, then it’s all about creating happy memories that will last a lifetime, and we need that more than ever!
And once again, my sincere wishes for a Merry Christmas and a Happy New Year to all my readers, friends, customers, partners, coworkers, my amazing Integration Team at DevScope (Pedro Almeida and Diogo Formosinho – you guys rock!), to all Microsoft Integration and Azure Community (BizTalk Server, Logic Apps, API Management, Service Bus, and so on), MSFT Product Groups, all the Portuguese Communities, my MVP “family” and of course to my beautiful family.
Thanks in advance for all the support and encouragement given throughout another year – 11 years as an MVP. I couldn’t make it without you guys! It was, once again, an incredible year for me, on both a personal and professional level, and I hope the next will be a year filled with new challenges. I promise that I will continue to share knowledge, hoping to help someone.
May this festive season sparkle and shine, may all of your wishes and dreams come true, and may the new year be made of great happiness. Merry Christmas to you and your entire family, and a happy new year!