DevOps – Multi environment variables/ groups


This is a topic I’ve been asked about a few times, which made me wonder how hard it actually is. Working with this nearly every day makes us assume some things are very easy, but not everyone has that insight.

So, how exactly do we set variables for different environments, and how does it work when we want to replace tokens?

Variables for different environments

Having multiple environments creates the need to have different values assigned to your variables, because, for example, that Test Webservice won’t work in PROD and you definitely don’t want to use that PROD file share and delete files in your DEV/Test environment.

Using Pipeline Variables lets you set different values for different Stages.

This is extremely helpful because, even though you have to duplicate or triplicate variables, you won’t need to worry about the wrong value going to the wrong stage. And if you set the Scope to Release instead, the value will apply to all stages.
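As a purely illustrative example (the variable name and URLs are made up), the same variable can hold a different value per scope:

```
Name        Scope    Value
ServiceUrl  DEV      https://dev.api.example.local/service
ServiceUrl  TEST     https://test.api.example.local/service
ServiceUrl  PROD     https://api.example.com/service
```

A fourth entry scoped to Release would apply to every stage that doesn’t have a more specific value.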

So, it’s a win-win situation.

But! It’s only valid for this specific Release Pipeline. If you have another Release and some variables are common, you have to re-do everything… all, over, again.

Send in the Variable Groups!

Variable Groups

The Variable Groups are containers for variables that can be used in multiple Releases and Pipelines. Think of it as a common class in your project that you can reference anywhere.

You can define the Groups and their variables in the Library. Inside the group, you can set all the variables you need, add more at any time, and assign the values right away.

Keep in mind that a group is meant to be fairly static; it’s not supposed to change often.

If you change a variable value or add a new one, it will not be picked up by releases that were already created. If anything changes here, you will need to create new releases (not new pipelines) and redeploy them. When you create a release, it takes a snapshot of the values and uses them as they are, hence the need to create a new one to get the new values.

After linking the group to the Release, you will see that you can also set a Scope. This works exactly like pipeline variables: the group’s values will only be used in that specific Stage and nowhere else.

Also, when expanded, you can see the values that are set for that group.

Now, how does the Token Replacement task work with this?

Replace Tokens

This task, our savior (yes, I like it very very much), comes to our rescue once again.

I’ve explained before how to use it and how it works.

But for this post, I’ll explain it again. The task searches the folders/files you’ve defined and tries to match the token pattern you’ve set in the task definition with the tokens in the file(s). When a token is found, it uses a string replace to inject the value into the file.

It will scour the Variables for a match and take the value to insert in the file.
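As a minimal sketch (the file and variable names are illustrative; the token prefix/suffix is configurable, and later posts in this blog use the __name__ style), a settings file in your artifacts could look like this before the task runs:

```json
{
  "ServiceUrl": "__ServiceUrl__",
  "FileSharePath": "__FileSharePath__"
}
```

If the stage has a variable named ServiceUrl (directly or through a linked group), the token is replaced with that stage’s value during deployment.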

But how does this link with the Variable Groups?

Well, at runtime, DevOps does a magical thing and sees the groups you’ve defined for a Stage as variables. So technically, it’s as if you’ve defined all the variables in one place and not in groups.

Pretty sweet, right?

So, the Replace Tokens will use all those variables and will try to replace them in your files. You don’t have to define the group or anything, it will just see the whole picture.

Hope this helps you with your automations and deployments.

Happy coding!


Logic Apps: Sending Files Dynamically – A little Christmas gift


A few months ago I was involved in an integration project that required sending files to a local file system that resided in an Azure VM.

This was a bit of an issue: we had to use the ISE connector, which was still in Preview at the time, so we had to do a proof of concept and make sure it worked properly.

It all went well, the integration went live and the client was happy.

A couple of weeks ago, Michael Stephenson asked for my help with an issue he was having: how to send files to another system.

Now, in an operating system, you’d just move the file over by code or through the UI. But in the Logic Apps integration, what came out was a JSON message.

Of course you can copy files over to another Azure storage, but in this particular case, we were pulling files with a webservice and dropping them into a local VM. The OOTB connectors help, but they don’t create the file object on their own.

In my example, I’m using an Azure File Storage connector to get the file content. It mimics a Webservice where you’d get the same content. In this case, we’re also getting the content-type in the response.

If we set the File content to a variable, it will just send, well, the content itself. In my example, it’s an XML log file, so it’s actually translating the Base64 into an XML string, which is interesting. It might not be properly formatted, but it’s a string at the end of the day, instead of its Base64 representation.

So, we pick up the file, assign the string to a String variable and use it to create the file in the destination system.

This works, but it creates a plain string file, which is not what we wanted. We also need to keep the MIME type of the file.

I’ve searched for help on how to achieve this, but there’s no information about it. This is quite shocking, because it almost sounds like Microsoft doesn’t want you to know how to do this, or they didn’t take the time to document it.

The good thing is that it’s quite simple to do. The JSON object we need to create is extremely simple and very easy to understand.

Let’s take another example. In this case, we have a SOAP web service that returns an array of fields, among them the Base64 content for the file and its MIME type. This is the perfect example of our issue. As I’ve said before, if I just set the FileData field to a variable, it will end up sending the string to a file, without caring about what type of file it is.

So, grab those fields with a Parse JSON action; we’re about to jingle some files into your system.
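As a rough sketch of that Parse JSON step (FileData matches the field mentioned above; MimeType is just a stand-in name for whatever your service calls the MIME type field), the schema only needs the two properties we care about:

```json
{
  "type": "object",
  "properties": {
    "FileData": { "type": "string" },
    "MimeType": { "type": "string" }
  }
}
```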

Create a Compose action and…

TA-Daaa! You just need to add $content and $content-type fields to a JSON object and it will create your file.
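Here’s a minimal sketch of the Compose input, assuming the Parse JSON action above is named Parse_JSON and exposes the FileData/MimeType fields (your action and field names may differ):

```json
{
  "$content-type": "@{body('Parse_JSON')?['MimeType']}",
  "$content": "@{body('Parse_JSON')?['FileData']}"
}
```

The output of the Compose is a proper file object, so passing it as the file content to the create-file action keeps the original MIME type instead of writing a plain string.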

It’s this easy!

It’s interesting that the GetFile connector provides the answer we needed but, as I’ve said, there are no actual guidelines for this. There might be a few other metadata fields you can set for a file; I haven’t explored this yet, but it makes sense to be able to do it.

I hope this helps you and takes away one headache before Christmas.


Logic Apps: CI/CD Part 3- Building your Azure Pipeline


In the previous posts of this series, we’ve talked about how to build and prepare your Logic App for CI/CD. In this last post, I’ll show you how to build your Azure Pipeline, making it ready for any environment you need.

If you’ve missed the other posts, here are the links for them:

Logic Apps: CI/CD Part 1- Building your Logic App

Logic Apps: CI/CD Part 2- Preparing for CI/CD

The Pipeline

Assuming you already have your repo configured, building the pipeline is fairly simple and quick. I’m not a big fan of using YAML; I find it easier to use the classic editor, since having a GUI is more appealing to me.

Having your repo in place and all the code ready, you need to create the Pipeline.

As such, you need to choose the classic editor (or venture into YAML) and select your repo and branch.

The available templates are helpful, but if you’re just trying to deploy Logic Apps, I’d suggest starting with an empty job; otherwise you might end up with tasks that are not necessary and you’ll have to delete them.

The first thing we’re going to do is configure the pipeline for continuous integration. It doesn’t take much to achieve this; you just need to enable the triggers. By default, it will filter to your main branch, but you can change this and trigger only for specific projects and branches. This comes in handy when you have multiple projects and you only want to include some of them in the build.
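As an illustration (the branch and path values are hypothetical), the trigger configuration in the classic editor could look something like this:

```
Branch filters:  Include  main
Path filters:    Include  /src/MyLogicAppsProject
```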

After enabling the triggers, you’ll need to add the required tasks to get your pipeline going. You might be getting a few secrets from Key Vault; if that’s the case, remember to add the Azure Key Vault task. This will pull either all the secrets or only the ones you’ve filtered, keeping them available for the pipeline execution. These will be used by the Replace Tokens task, which I’ll discuss a bit down the road.

As you can see, it doesn’t take many tasks to have a functional pipeline, ready to deploy your Logic App to the CI environment.

The required tasks are:

  • Visual Studio build – to build your solution, obviously
  • Copy files – which will copy the desired files over to a folder in the Drop
  • Publish build artifacts – makes the drop available to use in the pipeline and the release
  • Replace Tokens – a very handy task that replaces your tokens with the values of your variables or variable groups
  • ARM template deployment – deploys your Logic App ARM template to the target resource group

The Copy files task is very simple and easy to use. You point it at the source folder and copy the files you want/need to the target folder. Easy-peasy-lemon-squeezy.

I’d advise you to give the Target Folder a meaningful name; when you’re building the Release, it will be easier to find what you need if you divide your assets by name.
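A sketch of how those fields could be filled in (the folder names are illustrative; later on, the post assumes a LogicApps folder inside the drop):

```
Source Folder:  $(Build.SourcesDirectory)/MyLogicAppsProject
Contents:       **/*.json
Target Folder:  $(Build.ArtifactStagingDirectory)/LogicApps
```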

After copying the files, we will replace the tokens. How does this work?

Simply put, the task collects all the variables in memory and searches for the token pattern in all the target files. Given that we wrote our parameters with the __ … __ token, if we use other tokens in the files, it should not affect them. This is by far, in my opinion, the most helpful task for multi-environment deployment. It removes the need to keep multiple files per environment and tons of duplicated variables.
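For example, here is a minimal sketch of a tokenized ARM parameters file (the parameter name is made up; Part 2 of this series covers how the parameters were prepared):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "serviceUrl": {
      "value": "__serviceUrl__"
    }
  }
}
```

If a pipeline or group variable named serviceUrl exists for the stage, Replace Tokens swaps the token for its value before the ARM deployment runs.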

With the files copied and the tokens replaced, our Logic App is ready for deployment to the CI environment. Now, this is not mandatory: you might not want to deploy your LA from the pipeline and prefer to use the Release instead. That’s fine; you just need to move the ARM deployment tasks to the Release, and it will not affect the outcome or the pipeline.

As you can see, after selecting the Azure details (Subscription, RG, Location, etc) it becomes easy to select your LA to deploy. Since we used the LogicApps folder, we just need to reference the JSON files and the task will pick them up from the drop folder and deploy them.

Final notes

You’re now ready to go on your adventures and build your Logic Apps, get them ready for Continuous Integration and deploy them. I didn’t cover the Release Pipeline because it’s also very simple. You only need to create your variables, replace your tokens and deploy the ARM templates.

You can fiddle around with gates, automated deployments, pre-deployment approvals and all, but those are more advanced features.

If you have multiple releases that you want to deploy jointly, you can even build Orchestrations (I can hear all the BizTalk bells ringing in our heads). This is not as simple as isolated deployments, because it does involve some orchestration of the parts (well, duh).

I hope this small series of posts helped you to solve issues and to improve your deployments.

And, as always, Happy coding!



Logic Apps: CSV Alphabetic sorting


Every day we learn new things, and when it comes to Logic Apps we tend to learn even more, because it’s always shifting and new components are added. If we’re using ARM templates, the deployment brings its own challenges and, with them, new things to learn (and lots of cute little things that make you want to bang your head against a brick wall).

Usually when we work with a CSV file we tend to keep the sorting according to the specification. It isn’t always alphabetical, ascending or descending.

Sometimes, it’s just a real mess but it makes sense to the client and to the application that is consuming it.

A few days ago, whilst working on a client project and after dozens of tests, we started to see errors in our CSV file, where the headers and columns were arranged in alphabetical order. This was not my intent; when I built the CSV array, I wanted it in a specific order.

So why was this array now being sorted, who gave that command and how could I correct it?

Why and who:

As we dig into the Logic App code, we see that the Logic App is JSON at its core (my god, shocking development!). As such, it will follow JSON rules on sorting. If we set or append our variable with an array, even though that array isn’t sorted in our code, it will be once we deploy it to our Resource Group.

Let’s prove this.

First, we create our LA in Visual Studio and initialize a string variable. Then we append two values to it (“Append to string variable”): one as a plain string and the other as an array.

[Designer screenshot: two “Append to string variable” actions – one appending a plain string and one appending an array – both writing the “zulu”, “alpha”, “charlie”, “romeu” and “juliet” entries to the csvString variable, in that order.]

Let’s look at the code behind.

"Append_to 
string_variable -_pure_string" : 
"type": 
"AppendToStringVariabIe" , 
"inputs": 
"name": "csvString" 
"value : 
"runAfter": 
"Initialize variable": 
"Succeeded" 
"Append_to 
string_variable - 
array" : 
"type": 
"AppendToStringVariabIe" , 
"inputs": 
'name": "csvString" 
"value": 
"Zulu": " 
"alpha"• " 
"chat-lie": , 
"juliet": " 
"runAfter": 
"Succeeded" 
-_pure_string" :

Looking good so far. Our strings are set and they’re in the order we want.

Let’s deploy it to our RG and check again.

[Portal screenshot: after deployment, the “pure string” append still shows the values in the original order (zulu, alpha, charlie, romeu, juliet), while the “array” append now shows them sorted alphabetically (alpha, charlie, juliet, romeu, zulu).]

Well, there it is. With an ARM deployment, if we write a JSON object, it gets sorted on deployment and will appear like this in the designer in the Portal.
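In other words, the value we authored and the value we get back after deployment differ only in the key order (values elided here).

What we wrote in Visual Studio:

```json
{ "zulu": "…", "alpha": "…", "charlie": "…", "romeu": "…", "juliet": "…" }
```

What the Portal shows after the ARM deployment:

```json
{ "alpha": "…", "charlie": "…", "juliet": "…", "romeu": "…", "zulu": "…" }
```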

The funny thing is that if we change our object back to the order we want, the designer will not recognize it as a change and won’t let you save.

[Designer screenshot: reordering the value back by hand is not detected as a change, so the Save button stays disabled.]

Even in Code View the changes are not recognized.

[Code view screenshot: the same reordering in Code View is also not detected as a change.]

But if we add some other text to it, the change is now recognized and the Portal allows us to save.

[Code view screenshot: after adding extra text to the value, the Save button becomes available.]

But still, it won’t preserve the order you wrote and will still sort your CSV array, once again because it’s JSON.

A few weeks ago, this behavior wasn’t noticeable: I had a few Logic Apps in place with the string array in a specific order, and when deploying, it didn’t get sorted.

I searched in Azure updates to see if anything was mentioned but nothing came up.

How to bypass this issue?

If you’re working with a CSV file like I was, after you build your array, you’ll need to build a CSV table.

The action “Create CSV table” will take care of this for you but, as we know, the output will not be in the format we need.

[Designer screenshot: an “Append to array variable” action adding the “Hzulu”, “Hromeu”, “Halpha”, “Hjuliet” and “Hcharlie” entries, followed by Parse JSON and Create CSV table actions, with the table’s From set to the Parse JSON Body and Columns left on Automatic.]

(notice I’ve switched to an array variable, because the string can’t be parsed as JSON)

So, leaving the Columns in automatic mode will mess up your integration, as you can see below. The output will be sorted and it won’t be what you want/need.

[Run output: with Columns on Automatic, the CSV header comes out sorted – Halpha, Hcharlie, Hjuliet, Hromeu, Hzulu – instead of the intended order.]

What a mess!! This is nothing like we wanted.

We will need to manually define the column headers and the value each one is going to have.
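In code view, a sketch of what the manually defined columns look like (the action and field names follow the example above; item() refers to the current element of the array being converted):

```json
"Create_CSV_table": {
    "type": "Table",
    "inputs": {
        "format": "CSV",
        "from": "@body('Parse_JSON')",
        "columns": [
            { "header": "Hzulu", "value": "@item()?['Hzulu']" },
            { "header": "Hromeu", "value": "@item()?['Hromeu']" },
            { "header": "Halpha", "value": "@item()?['Halpha']" },
            { "header": "Hjuliet", "value": "@item()?['Hjuliet']" },
            { "header": "Hcharlie", "value": "@item()?['Hcharlie']" }
        ]
    },
    "runAfter": {
        "Parse_JSON": [ "Succeeded" ]
    }
}
```

Because the columns element is a JSON array rather than an object, its order survives the ARM deployment, which is why defining the columns by hand fixes the sorting.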

[Designer screenshot: the Create CSV table action with each column header and value mapped manually.]

If you don’t have many fields, it’s quick to do this, but when you have lots of fields… well, let’s just say I hope you have plenty of time and don’t lose focus.

[Run output: with the columns defined manually, the header now comes out in the intended order – Hzulu, Hromeu, Halpha, Hjuliet, Hcharlie.]

And there we have it. Fields are now displayed correctly, the data is in the right place and we’ve managed to get around this annoying problem.

Happy coding!


Introducing a new brand and Azure Functions: Moving from Azure Portal to VS


Before we start with the actual post, I’d like to introduce a new brand we will be developing. This is the first official post with the “It’s not Rocket Science!” brand.


With this concept, we intend to explain how some procedures are not as complicated as you may think and that they’re not Rocket science. I’m open to suggestions on posts to demystify Azure Logic Apps and other Azure services.

So, welcome to a new concept and I hope you find it useful.

Azure Functions: Moving from Azure Portal to VS

Creating Functions in the Portal presents some challenges, like the lack of IntelliSense, which, as we know, helps a lot. Not having CI/CD is also a concern, and then there’s one of the worst-case scenarios:

Someone deleting your function by mistake!

Panic! All hell breaks loose. I tried to apply the same method as I did with the Logic Apps accidental-delete recovery, but the “Change history” tab isn’t available for Functions. You end up losing your code, your executions and maybe some sleep over it.

The obvious next step is to get your backup from your repository and re-create the function. IF you have it in a repo.

We can’t prevent someone from deleting our resources; everyone makes mistakes. But you can avoid letting your code sit in the Azure Portal without version control, CI/CD and a repository.

So, the best way to do this is to migrate it to a VS solution. In this post, I will use VS2019, but VS Code and other IDEs are also options. You can do this right from the start by choosing a development environment other than the Portal.

With an existing Function, you’ll need a few things before you can migrate it. Besides VS with an active subscription, you will also need the Azure SDK feature. This can be installed using the installer you’ve downloaded from Microsoft or in VS itself, in the “Tools” -> “Get Tools and Features” menu.

After a fresh install of Windows, I hadn’t installed the Azure SDK, so I had to download and install it. According to the setup, it takes about 6.5 GB.

If you had to install the SDK, keep in mind that updates are also important.

Let’s return to the code.

The good thing about VS is that you don’t need to reference the required DLLs yourself; they will probably already be there.

So you need to create a new solution with an Azure Functions project.

Small note: you can also create a project using VB instead of C#, but… VB…

You can either choose a template with a trigger already created or you can choose an empty project, but when you try to Add a function, it will ask you again what template you want to use.

You can also add an empty class, but you will be missing some references.

As you can see, there are quite a few differences between the Class and the Function template.

I recommend using the same template as the one you used to build your function; you will get the references that your code needs to work.

And now comes the easiest part. You can just copy-paste your code into VS, leaving the usings out. Most likely, if you haven’t used anything out of the ordinary, they will already be there.

Now you have your code in VS, ready to run. Is it working properly? Well, you can just debug it locally: when you press F5, the local Azure Functions emulator will start and you will be able to test your function as if it were in the cloud.

You can just use Postman or another web request tool to test your project.
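As a rough sketch (the function name and port below are the local host’s typical defaults, not something from this post), the local runtime prints the URL you can paste straight into Postman:

```
Functions:

        MyHttpFunction: [GET,POST] http://localhost:7071/api/MyHttpFunction
```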

Once you’ve tested everything and are ready to deploy to your subscription, you’ll need to configure the publishing profile.

You have two major ways of doing this. You can choose the blue pill and configure your connection by hand, or take the red pill and download the publish profile from the existing Function App.

My advice: get the publish profile. It saves you time and it’s Plug’n’Play.

And that’s it. Your newly migrated function is now ready for your CI/CD and you can manage and version it with VS and DevOps.

Now, this does have a catch (or not, depending on how you look at it). Because we deploy using a Zip file, the code is no longer available in the Portal; you must now always use VS to view it.

I like this because, from a security perspective, no one will be accessing the source code through the Portal, and it forces new Devs (and old ones too) to follow the best-practices policy and keep everything in a proper repo under version control.
