Have you ever wondered how to add tags to your Function App through Visual Studio?
Let’s break it down, but first, here’s a quick overview of how you would do it in the Azure Portal:
On your Function App overview page, under the Essentials information on the left, you’ll find “Tags” with an “Edit” button next to it.
Clicking on it allows you to add new tags to your function app. These tags are metadata consisting of key/value pairs (the Name and Value fields in the portal).
You might be wondering: but why do I need tags?
Overall, tags offer a flexible and customizable way to manage and govern resources in Azure, enabling better organization, cost management, monitoring, and governance across your environment.
Organization and Categorization: Tags allow you to categorize and organize resources based on different criteria, such as department, project, environment (e.g., production, development), or cost center. This makes it easier to locate and manage resources, especially in larger deployments with numerous resources.
Cost Management: Tags can be used for cost allocation and tracking. By assigning tags to resources, you can easily identify the costs associated with specific projects, teams, or departments. This helps in budgeting, forecasting, and optimizing resource usage to control costs effectively.
Monitoring and Reporting: Tags provide metadata that can be used for monitoring and reporting purposes. You can use tags to filter and aggregate data in monitoring tools, allowing you to gain insights into resource usage, performance, and operational trends across different categories.
Access Control and Governance: Tags can also be leveraged for access control and governance purposes. By tagging resources based on their sensitivity, compliance requirements, or ownership, you can enforce policies, permissions, and compliance standards more effectively.
Now that we've described the importance of tags and how to add them from the Azure Portal, let's dive into how to do it with Visual Studio:
After you’ve published your Azure Function, or if you’re working with an existing published one, head over to the Solution Explorer and right-click on your solution.
From there, go to Add -> New Project. Now, search for Azure Resource Group and give it a double click.
You’ll be prompted to name your project. You can leave the location as is since it’s the project you’re currently working on. Click on Create once you’re done.
Now, in the Solution Explorer, you’ll spot a new project. Inside, you’ll find two .json files:
azuredeploy.json
azuredeploy.parameters.json
The file we’re interested in is azuredeploy.json. Double-click on it and replace its content with the provided JSON. Don’t forget to customize it with the tags you need and also your Function App Name. For now, let’s use these tags for our proof of concept:
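As an illustration, a minimal azuredeploy.json for this purpose might look like the sketch below. The Function App name and the tag values are placeholders that you should replace with your own:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2022-03-01",
      "name": "my-function-app",
      "location": "[resourceGroup().location]",
      "kind": "functionapp",
      "tags": {
        "Environment": "Development",
        "Project": "TagsPoC",
        "Company": "Contoso"
      }
    }
  ]
}
```

In the default incremental deployment mode, redeploying the site resource with a tags section updates the app's tags in place rather than recreating the app.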
Back in the Solution Explorer, right-click on the project you’ve just created and select Deploy -> New.
You’ll then need to choose your subscription and resource group. Finally, hit Deploy.
Once the deployment finishes smoothly without any errors, it’s time to inspect your Function App. You’ll notice that all your tags are now displayed on the Function App overview page.
Adding tags to your function app through Visual Studio provides a streamlined way to organize, manage, and govern your resources in Azure by categorizing resources based on criteria such as environment, project, company, etc.
Tags facilitate easier navigation and management, particularly in complex deployments. Moreover, tags play a crucial role in cost allocation, monitoring, reporting, and access control, offering valuable insights and enhancing governance across your environment.
While both methods, Visual Studio and the Azure Portal, offer ways to manage tags for resources like function apps, for simple solutions that don't require multiple environments, there are certain advantages to using Visual Studio for this task:
Automation and Consistency: Visual Studio allows you to automate the deployment of resources along with their tags using Infrastructure as Code (IaC) principles. This ensures consistency across deployments and reduces the chance of human error compared to manually adding tags in the Azure Portal.
Version Control: When managing your Azure resources through Visual Studio, you can maintain version control over your infrastructure code. This means you can track changes to your tags along with other resource configurations, making it easier to revert to previous versions if needed.
Integration with Development Workflow: For teams that primarily work within Visual Studio for development tasks, integrating tag management into the development workflow streamlines processes. Developers can manage both code and resource configurations in a unified environment, enhancing collaboration and efficiency.
Scalability: Visual Studio is well-suited for managing tags across multiple resources or environments. With the ability to define and deploy resource templates containing tags programmatically, scaling tag management becomes more manageable, especially in large-scale deployments.
Consolidated Management: Using Visual Studio for tag management allows you to centralize the configuration of tags alongside other resource settings. This consolidated approach simplifies overall resource management, providing a single interface for configuring and deploying resources and their associated tags.
It is important to note that the choice between Visual Studio and the Azure Portal ultimately depends on your specific requirements, preferences, and existing workflows. While Visual Studio offers certain advantages for tag management, the Azure Portal provides a user-friendly interface that may be more accessible for simple or ad-hoc tag assignments. Organizations should therefore evaluate their needs and capabilities to find the most suitable approach for managing tags in their Azure environment.
Of course, in the end, the best solution is to use CI/CD pipelines to accomplish this task.
Hope you find this helpful! If you enjoyed the content or found it useful and wish to support our efforts to create more, you can contribute towards purchasing a Star Wars Lego for Sandro’s son!
Continuous Integration and Continuous Deployment (CI/CD) is a practice that has become an essential aspect of Azure development. Although it is possible to execute each of the CI/CD pipeline steps manually, the actual value can be achieved only through automation.
And to improve software delivery using CI/CD pipelines, either a DevOps or a Site Reliability Engineering (SRE) approach is highly recommended.
In this whitepaper, I will address and explain how you can implement CI/CD oriented to Azure Function Apps using Azure DevOps Pipelines.
I will explain in detail all the basic things you have to know, from the creation of an Azure Function in Visual Studio 2022 to everything you need to create and configure inside DevOps to achieve the implementation of the CI/CD process using Azure Functions.
What’s in store for you?
This whitepaper will give you a detailed understanding of the following:
An introduction to:
What are Continuous Integration (CI) and Continuous Deployment (CD)?
What are CI/CD Pipelines?
What is Azure DevOps?
Create an organization or project collection in Azure DevOps
Create a project in Azure DevOps
Building your Azure Function from scratch
Publish your code from Visual Studio
A step-by-step approach to building Azure Pipelines
A step-by-step approach to building Azure Release Pipelines
Today I'm going to go over how we solved a real scenario for one of our PowerBI Portal clients. For those who aren't familiar with it, PowerBI Portal is a web tool that allows organizations to host an unlimited number of Power BI reports and dashboards on an online portal and give access to any number of users, regardless of whether they're in the organization or even have a Power BI account. PowerBI Portal is mainly used by organizations looking to share data with their clients, partners, and suppliers, but there have been numerous entrepreneurial cases of people using it as a platform, selling their clients access to the portal or charging for the time spent on it.
Other interesting points about PowerBI Portal are the tool’s double layer of row-level security (user and role), which allows data managers to specify who has access to what, and the ability to only consume Power BI Embedded capacity when there’s activity on the platform, which can severely reduce an organization’s consumption bill.
Finally, it’s worth mentioning how flexible the PowerBI Portal API is, allowing for custom solutions such as the one we’ll cover in this blog post.
Challenge
Our PowerBI Portal client wanted a daily report of the top 10 vendors that accessed their organization's portal, along with the most viewed dashboards/reports, to better understand how the tool was being used and by whom. The PowerBI Portal API is very powerful and straightforward to use, but it didn't have this capability out of the box, so we proactively extended the product's capabilities using Azure Integration Services.
This proof of concept will explain how you can extend the product by using the existing APIs, building a fancy Power BI access report on top of the PowerBI Portal audit data.
Create Function App
If you don't yet have a Function App with the .NET runtime stack, the first step is to create one. For that, you need to:
From the Azure portal menu or the Home page, select Create a resource.
On the Create a resource page, select Compute > Function App.
On the Create Function App Basics page, use the following function app settings:
Subscription: Select the subscription under which this new function app is created.
Resource Group: Select an existing Resource Group or create a new one in which your function app will be created.
Function App name: Name that identifies your new function app.
Publish: Select Code.
Runtime stack: Select the .NET option.
Version: Choose the version of your installed runtime, in this case, 6
Region: Choose a region near you or near other services your functions access.
Select Next: Hosting. On the Hosting page, enter the following settings:
Storage Account: Create a storage account used by your function app or select an existing one
Operating system: I chose Windows since I'm more comfortable with it.
Plan: Hosting plan that defines how resources are allocated to your function app. In this case, you need to select the Consumption plan.
You can customize the other options as needed or leave the default values. For this demo, select Review + create to review the app configuration selections.
On the Review + create page, review your settings, and then select Create to provision and deploy the function app.
Create HTTP trigger function
The next step is to create two HTTP trigger functions:
FA_Audit_Top10Reports
FA_Audit_Top10Users
For that we need to:
From the left menu of the Function App window, select Functions, then select Create from the top menu.
From the Create Function window, leave the Development environment property as Develop in portal and select the HTTP trigger template.
Under Template details, give the New Function a proper name, choose Function from the Authorization level drop-down list, and then select Create.
On the FA_Audit_Top10Reports window, select Code + Test, then in the run.csx file add the following code:
This function will return a list of the top 10 reports in an HTML table format:
...
// Read the request body and parse it as the JSON array returned by the audit API
string requestBody = new StreamReader(req.Body).ReadToEnd();
JArray data = (JArray)JsonConvert.DeserializeObject(requestBody);
var apiReport = new JArray();

// Group the audit entries by report name and keep the 10 most viewed
var groups = data
    .GroupBy(s => (string)s["name"])   // cast to string so entries are grouped by the name value
    .Select(s => new
    {
        Dashboard = s.Key,
        Count = s.Count()
    })
    .OrderByDescending(s => s.Count)
    .Take(10);
...
Note: this is a small part of the code. Click on the button below to download a simplified version of the source code from the overall solution.
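The downloadable code presumably also renders the grouped results as an HTML table; as a rough, self-contained sketch of that step (the sample data and the rendering code are my own illustration, not from the original source):

```csharp
using System;
using System.Linq;
using System.Text;
using Newtonsoft.Json.Linq;

// Sample audit entries standing in for the PowerBI Portal audit API response
JArray data = JArray.Parse(@"[
    { 'name': 'Sales Dashboard' },
    { 'name': 'Sales Dashboard' },
    { 'name': 'Finance Report' }
]");

// Group by report name and keep the most viewed first
var groups = data
    .GroupBy(s => (string)s["name"])
    .Select(g => new { Dashboard = g.Key, Count = g.Count() })
    .OrderByDescending(g => g.Count)
    .Take(10);

// Render the groups as rows of an HTML table
var html = new StringBuilder("<table><tr><th>Dashboard</th><th>Views</th></tr>");
foreach (var g in groups)
    html.Append($"<tr><td>{g.Dashboard}</td><td>{g.Count}</td></tr>");
html.Append("</table>");

Console.WriteLine(html.ToString());
```

The resulting string can be returned directly from the function and dropped into an email body by the Logic App.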
On the FA_Audit_Top10Users window, select Code + Test, then in the run.csx file add the following code:
This function will return a list of the top 10 users in an HTML table format:
...
// Read the request body and parse it as the JSON array returned by the audit API
string requestBody = new StreamReader(req.Body).ReadToEnd();
JArray data = (JArray)JsonConvert.DeserializeObject(requestBody);
var apiReport = new JArray();

// Group the audit entries by user email and keep the 10 most active users
var groups = data
    .GroupBy(s => (string)s["userEmail"])   // cast to string so entries are grouped by the email value
    .Select(s => new
    {
        User = s.Key,
        Count = s.Count()
    })
    .OrderByDescending(s => s.Count)
    .Take(10);
...
Note: this is a small part of the code. Click on the button below to download a simplified version of the source code from the overall solution.
Finally, we need to create a scheduling Logic App that triggers the two functions daily and emails the results. To simplify the solution, we will also use the Azure Portal to create the Logic App.
From the Azure portal menu or the Home page, select Create a resource.
On the Create a resource page, select Integration > Logic App.
On the Create Logic App Basics page, use the following Logic app settings:
Subscription: Select the subscription under which this new Logic app is created.
Resource Group: Select an existing Resource Group or create a new one in which your Logic app will be created.
Type: The logic app resource type and billing model to use for your resource; in this case, we will be using Consumption:
Consumption: This logic app resource type runs in global, multi-tenant Azure Logic Apps and uses the Consumption billing model.
Standard: This logic app resource type runs in single-tenant Azure Logic Apps and uses the Standard billing model.
Logic App name: Your logic app resource name, which must be unique across regions.
Region: The Azure datacenter region where to store your app's information. Choose a region near you or near other services your Logic App accesses.
Enable log analytics: Change this option only when you want to enable diagnostic logging. The default value is No.
When you’re ready, select Review + Create. On the validation page, confirm the details that you provided, and select Create.
After Azure successfully deploys your app, select Go to resource. Or, find and select your logic app resource by typing the name in the Azure search box.
Under Templates, select Blank Logic App. After you select the template, the designer now shows an empty workflow surface.
In the workflow designer, under the search box, select Built-In. From the Triggers list, select the Schedule trigger, Recurrence.
In the trigger details, provide the following information:
Interval: 1
Frequency: Day
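In the underlying workflow definition, this trigger corresponds to a JSON fragment like the following (a sketch of what the designer generates):

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Day",
            "interval": 1
        }
    }
}
```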
Under the Recurrence trigger, select New step.
Select New step. In the search box, enter HTTP, select the HTTP action from the result panel, and provide the following information:
Headers: you need to create an X-API-KEY header with your access token
Queries: you need to specify two query parameters:
pageNumber: 1
pageSize: 100
Select New step. In the search box, enter Azure Functions, and from the result panel select Azure Functions. Choose the Function App that contains the functions we created above, then select the FA_Audit_Top10Users function and provide the following information:
Request Body: Result body of the HTTP action – @{body(‘HTTP’)}
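Behind the designer, the HTTP call and the function invocation correspond roughly to the workflow definition fragment below. The URI, API key, and function resource ID are placeholders, not values from the original solution:

```json
"actions": {
    "HTTP": {
        "type": "Http",
        "inputs": {
            "method": "GET",
            "uri": "https://<your-powerbi-portal-api-endpoint>",
            "headers": { "X-API-KEY": "<your-access-token>" },
            "queries": { "pageNumber": "1", "pageSize": "100" }
        }
    },
    "FA_Audit_Top10Users": {
        "type": "Function",
        "runAfter": { "HTTP": [ "Succeeded" ] },
        "inputs": {
            "body": "@body('HTTP')",
            "function": { "id": "<resource-id-of-FA_Audit_Top10Users>" }
        }
    }
}
```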
Repeat the same steps, this time for the FA_Audit_Top10Reports function.
Select New step. In the search box, enter Variables, and from the result panel select the Variables, Initialize variable action and provide the following information:
Name: varEmailBody
Type: String
Value: provide the HTML email body template and add the result of the functions to that template
Note: this is a small part of the HTML body template code. You should customize it according to your needs.
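As a purely illustrative sketch (not the original template), the varEmailBody value could combine a basic HTML skeleton with the outputs of the two functions:

```html
<html>
  <body>
    <h2>Power BI Portal Daily Report</h2>
    <h3>Top 10 Users</h3>
    @{body('FA_Audit_Top10Users')}
    <h3>Top 10 Reports</h3>
    @{body('FA_Audit_Top10Reports')}
  </body>
</html>
```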
And finally, select New step. In the search box, enter Office 365 Outlook, and from the result panel select the Office 365 Outlook, Send an email (V2) action and provide the following information:
Body: varEmailBody – @{variables(‘varEmailBody’)}
Subject: [DEV] Power BI Portal Daily Report
To: list of your email addresses.
The result, once you try to execute the Logic App, will be a fancy HTML email:
More about Power BI Portal
PowerBI Portal is a web tool that allows users to embed any number of Power BI reports and dashboards on a portal with their organization’s layout, that can be shared with whoever they want, regardless of being in their organization or even having a Power BI account. Know more about it here.
It was with great pleasure that I presented a session at the Azure Lowlands event for the first time on January 29, 2020, this time about How to create robust monitor solutions with PowerShell, Azure Functions, and Logic Apps.
First of all, I want to congratulate the organizers on a very well organized event!
About the session
Session name: How to create robust monitor solutions with PowerShell, Azure Functions and Logic Apps
Abstract: Monitoring your systems or platforms is a crucial aspect of any organization. Based on my experience, all your clients will tell you that all their platforms or applications are being monitored, either by external partners or internally. Nevertheless, when disasters occur or are in the process of happening, guess what? Your team will be the last to know. This session will address and present how you can easily and quickly create a robust monitoring solution on your platforms using PowerShell, Function Apps, and Logic Apps or Power Automate Flows.
How to create robust monitor solutions with PowerShell, Azure Functions and Logic App Slides
How to create robust monitor solutions with PowerShell, Azure Functions and Logic App Video
If, for any reason, you could not be present at this online event, or if you want to review it again, you can now do so here: https://youtu.be/vf9cmfEb3Z8?t=10886
There are times when you need to do data type conversions in a Logic App.
I recently ran into an issue where I was syncing records between CRM Online (CRMOL) and Salesforce. The records coming from CRMOL had NULL values which, when converted to JSON, came through as the string "NULL".
I could use the Logic App Replace function, but when you have to evaluate 30 to 50 fields, it becomes a tedious chore.
I decided to create a Function App. I used a C# webhook so I could pass in the response coming out of CRMOL and return the parsed record to be mapped to Salesforce.
The code is very simple: I loop through each field and replace the "NULL" values with an empty string.
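Ahead of Part 2, here is a minimal sketch of the idea using Json.NET; the field names are made up for illustration:

```csharp
using System;
using Newtonsoft.Json.Linq;

// Sketch: replace "NULL" string values in a flat JSON record with empty strings.
static JObject CleanNulls(JObject record)
{
    foreach (var property in record.Properties())
    {
        // CRMOL nulls arrive as the literal string "NULL" after JSON conversion
        if (property.Value.Type == JTokenType.String &&
            (string)property.Value == "NULL")
        {
            property.Value = string.Empty;
        }
    }
    return record;
}

// Hypothetical record as it might come out of CRMOL
var record = JObject.Parse(@"{ 'FirstName': 'Ada', 'Phone': 'NULL' }");
Console.WriteLine(CleanNulls(record).ToString());
```

This avoids chaining dozens of Replace expressions in the Logic App itself, since the cleanup happens once inside the function.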
In Part 2, I will show you the code and other utility functions that can be used in Logic Apps.