Send Log Analytics error reports by filtering data from an entire Resource Group (video)

A Log Analytics workspace is a unique environment for logging data from Azure Monitor and other Azure services, such as Microsoft Sentinel and Microsoft Defender for Cloud. Each workspace has its own data repository and configuration but might combine data from multiple services. It is also a centralized place where you can store, collect, and analyze data from various sources.

Think of it as a tool that helps you keep track of everything happening across your Azure environment. It allows you to perform advanced analytics and visualize data in real-time to help you identify and troubleshoot issues quickly. You can also use it to create custom queries and alerts, set up automation tasks, and integrate with other Azure services to get even more insights into your data. Overall, the log analytics workspace is a powerful tool that helps you stay on top of your Azure environment and ensure everything runs smoothly.

In this video, we’ll cover the process of sending log analytics error reports by filtering data from an entire resource group.

Send Log Analytics error reports by filtering Data from an entire Resource Group by Luis Rigueira

Hope you find this helpful! So, if you liked the content or found it useful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego! 

Big thanks to my team member Luís Rigueira for creating this video.

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Logic App Consumption: Send Query Reports from Log Analytics Workspace

In this blog post and tutorial, what we intend to do is create a Logic App, in this case a Logic App Consumption, that sends reports based on Log Analytics data, using an Azure Function to process the data dynamically.

To do this, we first need to understand a few things, starting with: what is a Log Analytics Workspace?

Log Analytics Workspace

A Log Analytics workspace is a unique environment for logging data from Azure Monitor and other Azure services, such as Microsoft Sentinel and Microsoft Defender for Cloud. Each workspace has its own data repository and configuration but might combine data from multiple services. It is also a centralized place where you can store, collect, and analyze data from various sources.

Think of it as a tool that helps you keep track of everything happening across your Azure environment. It allows you to perform advanced analytics and visualize data in real-time to help you identify and troubleshoot issues quickly. You can also use it to create custom queries and alerts, set up automation tasks, and integrate with other Azure services to get even more insights into your data. Overall, the log analytics workspace is a powerful tool that helps you stay on top of your Azure environment and ensure everything runs smoothly.

As explained above, we can create custom queries to retrieve data from Azure resources, such as Logic Apps. But how does that work?

First of all, you need to create a Log Analytics Workspace. For that:

  • Search for Log Analytics in the search bar in Azure Portal and click on Log Analytics workspaces.
  • And next, click on Create.
  • After this, populate the fields with the most appropriate information for your scenario, like the Name, Resource group, and Region, then click Review + Create.

Once the resource creation is finished, you will have your Log Analytics Workspace.

And if you click on Logs, it will open a panel that works via queries.

Some built-in queries are already available, but if you close that dialog by clicking on the X, you get a panel where you can write your own queries. Queries work based on a scope: for example, you can apply a query to your entire resource group, or just to a single Logic App. You also have a time range to apply the query only to events that happened in the timespan you have defined.

But how do queries in Azure Log Analytics work, and how do you write them?

In Log Analytics on the Azure portal, queries are used to search and analyze the data that has been collected in the Log Analytics workspace. Think of queries as a way to ask questions about your data, such as "How many times did this event occur?" or "What was the average response time for this API?". Once you write a query, the Log Analytics workspace returns the results in a table format, which you can use to gain insights and make data-driven decisions.

To write a query in Log Analytics on the Azure portal, you will use a query language called Kusto Query Language (KQL). And what is Kusto Query Language (KQL)?

Kusto Query Language (KQL) is a query language used in Azure Data Explorer, Azure Monitor, and Log Analytics for querying large datasets. KQL is a simple yet powerful language that allows you to search, analyze, and visualize data in a flexible way. KQL has a SQL-like syntax but also supports many features that are specific to Azure services, such as functions for querying JSON data, time-series data, and hierarchical data.
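
For instance, the "How many times did this event occur?" question from above maps naturally to a summarize. Here is a minimal sketch, using the AzureDiagnostics table and the Logic App columns we rely on later in this post (the one-day time filter is just an example):

AzureDiagnostics
// count how many Logic App runs failed in the last day
| where TimeGenerated > ago(1d)
| where Category == "WorkflowRuntime" and status_s == "Failed"
| summarize Occurrences = count()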

Some of the key features of KQL include:

  • Support for querying structured, semi-structured, and unstructured data
  • Built-in functions for working with dates, strings, arrays, and other data types
  • Support for aggregations, joins, and subqueries
  • Ability to query data in real-time
  • Integration with Azure services for data collection, storage, and visualization

KQL is a powerful tool that can help you gain insights and make data-driven decisions, especially when dealing with large datasets in Azure.

Here is an example of KQL:

AzureDiagnostics
| where Category == "WorkflowRuntime" and resource_workflowName_s == "my-logic-app"
| where OperationName == "Microsoft.Logic/workflows/workflowRunStarted" or OperationName == "Microsoft.Logic/workflows/workflowRunCompleted"
| project resource_workflowName_s, OperationName, startTime_t, endTime_t, status_s, error_message_s

This query will:

  • Search for all events in the AzureDiagnostics table where the Category is WorkflowRuntime and the workflow name (resource_workflowName_s) is my-logic-app.
  • It will then filter the results to only include events where the OperationName indicates that a workflow run started or completed.
  • The project operator is used to select specific columns to include in the results. In this case, the query will return each event’s workflow name, operation name, start time, end time, status, and error message.
  • | is the pipe operator, which connects the different parts of the query together.

This query can help you monitor the performance and status of your Logic App by tracking when workflows are started and completed, as well as any associated error messages or status codes. You can run this query in Azure Log Analytics or Azure Monitor to gain insights into your logic app’s performance and troubleshoot any issues that arise.
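
If you want to go one step further, a small variation of this idea (a sketch, built on the same AzureDiagnostics columns this post uses) buckets failures over time, which helps you spot when a Logic App started misbehaving:

AzureDiagnostics
// failed Logic App runs per hour, per workflow
| where Category == "WorkflowRuntime" and status_s == "Failed"
| summarize FailedRuns = count() by bin(TimeGenerated, 1h), resource_workflowName_s
| order by TimeGenerated desc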

How to configure a Logic App to send data to the Log Analytics workspace

So, now you have an idea of how it works. As we explained before, Azure Log Analytics collects events from various Azure resources, so to make this possible, we need to create a connection between the resource we want to collect data from and Azure Log Analytics.

To do that, let’s create a Logic App. In doing so, do not forget to use the same Resource Group and Region where your Azure Log Analytics Workspace is stored, give the Logic App a name that makes sense to you, and implement the desired business logic.

Do not forget: Start using Proper names from day one!

Or access an existing Logic App.

  • Next, on your Logic App, click Diagnostic settings and then + Add diagnostic setting.
  • Give your diagnostic setting a name and check the boxes:
    • allLogs
    • AllMetrics
    • and Send to Log Analytics workspace
  • Next, choose the subscription where your Log Analytics Workspace is created and select your Log Analytics Workspace, the one we just created.

And from now on, this Logic App will send data to the Log Analytics!
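
Keep in mind that ingestion is not instantaneous, so wait a few minutes before expecting results. A quick sanity-check sketch in the workspace’s Logs panel, using the same table and columns as the report query below, confirms events are flowing:

AzureDiagnostics
// confirm Logic App diagnostics are arriving, grouped by category
| where ResourceProvider == "MICROSOFT.LOGIC"
| summarize Events = count() by Category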

Create a Logic App to create a report from Log Analytics Workspace

To do that, we need to:

  • Create a new Logic App, add a Recurrence trigger, and set the following configurations:
    • Choose 1 as the Interval and Day as the Frequency
  • Next, choose the Azure Monitor Logs operation and the action Run query and list results.
  • Next, you will have some fields to populate, like:
    • Subscription
    • Resource Group
    • Resource Type
    • Resource Name
    • Query
    • Time Range
  • This is the query we will be using:
AzureDiagnostics
// only Logic App (Consumption) events
| where ResourceProvider == "MICROSOFT.LOGIC"
// only workflow run-time events
| where Category == "WorkflowRuntime"
// only failed runs
| where status_s == "Failed"
// keep just the workflow name and the error message
| project LogicAppName = tostring(resource_workflowName_s), message = tostring(error_message_s)
// count failures per Logic App and per error message
| summarize count() by LogicAppName, message

In our case, we are dealing with a Logic App that has given us some errors already.

This is the raw output of the query when we run our Logic App:

{
  "statusCode": 200,
  "headers": {
    "Pragma": "no-cache",
    "Transfer-Encoding": "chunked",
    "Vary": "Accept-Encoding",
    "Cache-Control": "no-store, no-cache",
    "Set-Cookie": "ARRAffinity=eac69d9633b62a80172d43feba694263b4d9fccb8b9d953b364b8fc058f6e946;Path=/;HttpOnly;Secure;Domain=azuremonitorlogs-we.azconn-we-002.p.azurewebsites.net,ARRAffinitySameSite=eac69d9633b62a80172d43feba694263b4d9fccb8b9d953b364b8fc058f6e946;Path=/;HttpOnly;SameSite=None;Secure;Domain=azuremonitorlogs-we.azconn-we-002.p.azurewebsites.net",
    "x-ms-request-id": "e8945bb2-f438-4ee9-9b22-58ae9971e462",
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Timing-Allow-Origin": "*",
    "x-ms-apihub-cached-response": "false",
    "x-ms-apihub-obo": "false",
    "Date": "Mon, 13 Mar 2023 12:47:12 GMT",
    "Content-Type": "application/json; charset=utf-8",
    "Expires": "-1",
    "Content-Length": "2536"
  },
  "body": {
    "value": [
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "",
        "count_": 8
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "An action failed. No dependent actions succeeded.",
        "count_": 17
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "The execution of template action 'For_each' failed: the result of the evaluation of 'foreach' expression '@{body('Get_Logic_App_Consumption_Failures_By_Type_From_Log_Analytics')?['value']} ------@{items('For_each_2')}---@{items('For_each_2')?['count_']}---@{items('For_each_2')?['message']}' is of type 'String'. The result must be a valid array.",
        "count_": 7
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "Unable to process template language expressions in action 'DecodedContent' inputs at line '0' and column '0': 'The template language function 'decodeBase64' was invoked with a parameter that is not valid. The value cannot be decoded from base64 representation.'.",
        "count_": 2
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "The 'from' property value in the 'table' action inputs is of type 'Object'. The value must be of type 'Array'.",
        "count_": 1
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "The provided 'Http' action inputs are not valid. A request body must not be included for 'GET' requests.",
        "count_": 1
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "Unable to process template language expressions in action 'Initialize_variable_2' inputs at line '0' and column '0': 'The template language function 'decodeBase64' expects one parameter: the string to decode from base64 representation. The function was invoked with '0' parameters. Please see https://aka.ms/logicexpressions#decodeBase64 for usage details.'.",
        "count_": 1
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "Unable to process template language expressions in action 'Initialize_variable_2' inputs at line '0' and column '0': 'The template language function 'decodeBase64' was invoked with a parameter that is not valid. The value cannot be decoded from base64 representation.'.",
        "count_": 1
      },
      {
        "LogicAppName": "LA-AppInsights-POC",
        "message": "The 'from' property value in the 'table' action inputs is of type 'String'. The value must be of type 'Array'.",
        "count_": 1
      },
      {
        "LogicAppName": "LA-Monitoring-Telemetry",
        "message": "",
        "count_": 2
      },
      {
        "LogicAppName": "LA-Monitoring-Telemetry",
        "message": "An action failed. No dependent actions succeeded.",
        "count_": 3
      },
      {
        "LogicAppName": "LA-Monitoring-Telemetry",
        "message": "The 'from' property value in the 'table' action inputs is of type 'Null'. The value must be of type 'Array'.",
        "count_": 1
      }
    ]
  }
}

If we use this same query in our Log Analytics Workspace, we get these results:

As you can see, we have a report of the last 7 days, including the Logic App name, the error message, and the count.

This is useful when you, your company, or the project you are working on is dealing with a lot of data, and you need to know what failed, where it failed, and how many times it failed. So, since we can query this information in the Log Analytics Workspace, we can do the same in our Logic App, as we explained before, using the Azure Monitor Logs action Run query and list results.

Now, going back to our Logic App, we have two options. The first is to send this report as a plain HTML table, as it is, using a Data Operations action, Create HTML Table, and then send an email with this information.
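
A practical note on that first option: the Create HTML Table action expects an array in its From property, so you should pass it the value array of the query result; feeding it an object or a string produces exactly the "'from' property value in the 'table' action inputs" errors visible in the sample output above. A hedged sketch of the expression (the action name must match the one in your Logic App):

@body('Run_query_and_list_results')?['value']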

And what you will receive is something like this (in this example, we are not including the Logic App name):

But this data delivery seems a bit old-fashioned, so why not create something more appealing? Something to be presented in the body of an email, like this:

And to achieve this, we used an existing HTML base document and an Azure Function to map the rows into an HTML table. (We are not going to address this topic in depth in this tutorial.)
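
Still, to give a rough idea, here is a minimal sketch of such a Function: an HTTP-triggered .NET function (run.csx) that receives the rows from the query and returns the <tr> elements to inject into the HTML template. All names here are hypothetical, not the actual solution; only the LogicAppName, message, and count_ columns come from the query output above.

#r "Newtonsoft.Json"

using System.IO;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Http;
using Newtonsoft.Json.Linq;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    // The Logic App posts the "value" array from Run query and list results
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    JArray rows = JArray.Parse(requestBody);

    // Map each result row to an HTML table row
    var sb = new StringBuilder();
    foreach (var row in rows)
    {
        sb.Append("<tr>")
          .Append($"<td>{row["LogicAppName"]}</td>")
          .Append($"<td>{row["message"]}</td>")
          .Append($"<td>{row["count_"]}</td>")
          .Append("</tr>");
    }

    // The Logic App concatenates this fragment into the HTML base document
    return new OkObjectResult(sb.ToString());
}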

Finally, in our Logic App, we may want to validate if the query got results or if it is empty and then send an email with the report.

  • On the body of the email, add the variable containing the HTML template (and the Azure Function call result), and add a subject and the recipient email address.
  • Save the Logic App once it is finished.

And now you will get the report from Log Analytics Workspace!

Remember that if, for example, you set the recurrence as daily and the query is also looking for events that occurred in the last 24 hours, you might not receive an email, since the condition prevents the email from being sent when the body is null. But this also means there are no failures in your Logic Apps, which is always nice!
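
As a sketch, that guard can be a Condition action with an expression like the following, where the action name must match the Azure Monitor Logs action in your Logic App:

@greater(length(body('Run_query_and_list_results')?['value']), 0)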

Thanks to my team member Luís Rigueira for helping me realize and implement this idea.

Hope you find this useful! So, if you liked the content or found it useful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego! 


How to audit Power BI access on PowerBI Portal using Azure Logic App and Function App

Today I’m going to go over how we solved a real scenario from one of our PowerBI Portal clients. For those who aren’t familiar with it, PowerBI Portal is a web tool that allows organizations to host an unlimited number of Power BI reports and dashboards on an online portal, and give access to it to any number of users, regardless of being in their organization or even having a Power BI account. PowerBI Portal is mainly used by organizations looking to share data with their clients, partners and suppliers, but there have been numerous entrepreneurial cases of people using it as a platform, selling their clients access to the portal or charging for the time they spent on PowerBI Portal.

Other interesting points about PowerBI Portal are the tool’s double layer of row-level security (user and role), which allows data managers to specify who has access to what, and the ability to only consume Power BI Embedded capacity when there’s activity on the platform, which can severely reduce an organization’s consumption bill.

Finally, it’s worth mentioning how flexible the PowerBI Portal API is, allowing for custom solutions such as the one we’ll cover in this blog post.

Challenge

Our PowerBI Portal client wanted a daily report of the top 10 vendors that accessed their organization’s portal, along with the most viewed dashboards/reports, to better understand how the tool was being used and by whom. The PowerBI Portal API is actually very powerful and straightforward to use, but it didn’t have this capability out-of-the-box, so we proactively extended the product’s capabilities by using Azure Integration Services.

This proof-of-concept will explain how you can extend the product by using the existing APIs, creating a fancy audit report of Power BI access on the PowerBI Portal.

Create Function App

If you don’t yet have a Function App with the .NET runtime stack created, the first step is to create one, and for that, you need to:

  • From the Azure portal menu or the Home page, select Create a resource.
  • In the Create a resource page, select Compute > Function App.
  • On the Create Function App Basics page, use the following function app settings:
    • Subscription: Select the subscription under which this new function app is created.
    • Resource Group: Select an existing Resource Group or create a new one in which your function app will be created.
    • Function App name: Name that identifies your new function app. 
    • Publish: Select Code.
    • Runtime stack: Select the option .NET
    • Version: Choose the version of your installed runtime, in this case, 6
    • Region: Choose a region near you or near other services your functions access.
  • Select Next : Hosting. On the Hosting page, enter the following settings:
    • Storage Account: Create a storage account used by your function app or select an existing one
    • Operating system: I chose to use Windows since I feel more comfortable with it.
    • Plan: Hosting plan that defines how resources are allocated to your function app. In this case, you need to select the Consumption plan.
  • You can customize the other options according to your needs or leave the default values. For this demo, we will now select Review + create to review the app configuration selections.
  • On the Review + create page, review your settings, and then select Create to provision and deploy the function app.

Create HTTP trigger function

The next step is to create two HTTP trigger Functions:

  • FA_Audit_Top10Reports
  • FA_Audit_Top10Users

For that we need to:

  • From the left menu of the Function App window, select Functions, then select Create from the top menu.
  • From the Create Function window, leave the Development environment property as Develop in portal and select the HTTP trigger template.
  • Under Template details, give the New Function a proper name, choose Function from the Authorization level drop-down list, and then select Create.
  • On the FA_Audit_Top10Reports window, select Code + Test, then in the run.csx file add the following code:
    • This function will return a list of the top 10 reports in an HTML table format
...
// Read the request body (the access-history array sent by the Logic App)
string requestBody = new StreamReader(req.Body).ReadToEnd();
JArray data = (JArray)JsonConvert.DeserializeObject(requestBody);

var apiReport = new JArray();

// Group the accesses by report/dashboard name and keep the 10 most viewed
var groups = data
	.GroupBy(s => s["name"])
	.Select(s => new
	{
		Dashboard = s.Key,
		Count = s.Count()
	})
	.OrderByDescending(s => s.Count).Take(10);
...

Note: this is a small part of the code. Click on the button below to download a simplified version of the source code from the overall solution.

  • On the FA_Audit_Top10Users window, select Code + Test, then in the run.csx file add the following code:
    • This function will return a list of the top 10 users in an HTML table format
...
// Read the request body (the access-history array sent by the Logic App)
string requestBody = new StreamReader(req.Body).ReadToEnd();
JArray data = (JArray)JsonConvert.DeserializeObject(requestBody);

var apiReport = new JArray();

// Group the accesses by user email and keep the 10 most frequent visitors
var groups = data
	.GroupBy(s => s["userEmail"])
	.Select(s => new
	{
		User = s.Key,
		Count = s.Count()
	})
	.OrderByDescending(s => s.Count).Take(10);
...

Note: this is a small part of the code. Click on the button below to download a simplified version of the source code from the overall solution.
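
To give a rough idea of how the elided part of both functions might continue (a sketch only; the downloadable source is the reference), the grouped results can be rendered into the <tr> rows that the email template below injects:

// Hypothetical continuation of the FA_Audit_Top10Users snippet above
// (StringBuilder is in System.Text; OkObjectResult in Microsoft.AspNetCore.Mvc)
var sb = new StringBuilder();
int rank = 1;
foreach (var g in groups)
{
    // one row per user: email, access count, and overall rank
    sb.Append($"<tr><td>{g.User}</td><td>{g.Count}</td><td>{rank++}</td></tr>");
}
return new OkObjectResult(sb.ToString());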

Create a Logic App

Finally, we need to create a scheduled Logic App to call the PowerBI Portal API, invoke the two Functions, and send the daily report email. To simplify the solution, we will also be using the Azure portal to create the Logic App.

  • From the Azure portal menu or the Home page, select Create a resource.
  • In the Create a resource page, select Integration > Logic App.
  • On the Create Logic App Basics page, use the following Logic app settings:
    • Subscription: Select the subscription under which this new Logic app is created.
    • Resource Group: Select an existing Resource Group or create a new one in which your Logic app will be created.
    • Type: The logic app resource type and billing model to use for your resource, in this case we will be using Consumption
      • Consumption: This logic app resource type runs in global, multi-tenant Azure Logic Apps and uses the Consumption billing model.
      • Standard: This logic app resource type runs in single-tenant Azure Logic Apps and uses the Standard billing model.
    • Logic App name: Your logic app resource name, which must be unique across regions.
    • Region: The Azure datacenter region where to store your app’s information. Choose a region near you or near other services your Logic app access.
    • Enable log analytics: Change this option only when you want to enable diagnostic logging. The default value is No.
  • When you’re ready, select Review + Create. On the validation page, confirm the details that you provided, and select Create.
  • After Azure successfully deploys your app, select Go to resource. Or, find and select your logic app resource by typing the name in the Azure search box.
  • Under Templates, select Blank Logic App. After you select the template, the designer now shows an empty workflow surface.
  • In the workflow designer, under the search box, select Built-In. From the Triggers list, select the Schedule trigger, Recurrence.
  • In the trigger details, provide the following information:
    • Interval: 1
    • Frequency: Day
  • Under the Recurrence trigger, select New step.
  • Select New step. In the search box, enter HTTP, and from the result panel select the HTTP action and provide the following information:
    • Method: GET
    • URI: specify the endpoint of PowerBI Portal API – https://admin.powerbiportal.com/api/AccessHistory
    • Headers: you need to create the X-API-KEY with your access token
    • Queries: you need to specify two query parameters:
      • pageNumber: 1
      • pageSize: 100
  • Select New step. In the search box, enter Azure Functions, and from the result panel select Azure Functions, choose the Function App that contains the Functions we created above, then select the FA_Audit_Top10Users function and provide the following information:
    • Request Body: Result body of the HTTP action – @{body('HTTP')}
  • Do the same steps, this time for the FA_Audit_Top10Reports function
  • Select New step. In the search box, enter Variables, and from the result panel select Variables, choose the Initialize variable action, and provide the following information:
    • Name: varEmailBody
    • Type: String
    • Value: provide the HTML email body template and add the result of the functions to that template
<!DOCTYPE html>
<html>
	<style>
		.notification-body{
			margin:0 auto;
			text-align:center;
			width: 1050px;
			border:1px black;
			border-collapse: collapse;
			background-color:#ffffff;
		}	
...
	</style>
	<body style="background-color: #dfdfdf;">
		<table style='background-color: #dfdfdf; height: 50px; width: 100%; line-height: 50px;'>
			<tr>
				<td>
					<p style='font-size:5.0pt;'>
						<span> </span>
					</p>
				</td>
			</tr>
		</table>
		...
		<table style='width:992px;margin:0 auto;border-collapse:collapse;'>
			<tr style='height: 15px;'>
				<td>
					<table style="width:100%;">
						<tr>
							<td class="notification-table-header" style='width:100%; height:9pt'>
								<p style='text-align:left; font-size:14.0pt;margin-left:5px; margin-bottom:10px;'><b>TOP 10 VIEWERS:</b></p>
							</td>
							<td style="width:auto;"></td>
						</tr>
					</table>
				</td>
			</tr>
			<tr>
				<td class="notification-table-text">
					<br/>
					<table class="table-list" style="width:70%; text-align:left;">
						<tr class="table-list-header">
							<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>USER</td>
							<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>NUMBER ACCESSES</td>
							<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>OVERALL</td>
						</tr>
		@{body('FA_Audit_Top10Users')}
						</table>
				</td>
			</tr>
        ...
		<tr>
			<td class="notification-table-text">
				<br/>
				<table class="table-list" style="width:70%; text-align:left;">
					<tr class="table-list-header">
						<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>DASHBOARD</td>
						<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>NUMBER VIEWS</td>
						<td style='margin-left:10px; margin-top:5px; margin-bottom:5px;'>OVERALL</td>
					</tr>
		@{body('FA_Audit_Top10Reports')}
					</table>
			</td>
		</tr>
		...
		</table>
		...
	</body>
</html>

Note: this is a small part of the HTML body template code. You should customize it according to your needs.

  • And finally, select New step. In the search box, enter Office 365 Outlook, and from the result panel select the Office 365 Outlook action Send an email (v2) and provide the following information:
    • Body: varEmailBody – @{variables(‘varEmailBody’)}
    • Subject: [DEV] Power BI Portal Daily Report
    • To: list of your email addresses.

The result, once you try to execute the Logic App, will be a fancy HTML email:

More about Power BI Portal

PowerBI Portal is a web tool that allows users to embed any number of Power BI reports and dashboards on a portal with their organization’s layout, that can be shared with whoever they want, regardless of being in their organization or even having a Power BI account. Know more about it here.
