Microsoft Integration Weekly Update: April 15, 2019

Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It's a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

 

Microsoft Announcements and Updates

 

Community Blog Posts

 

Videos

 

Podcasts

 

How to get started with iPaaS design & development in Azure?

Feedback

I hope this is helpful. Please feel free to reach out to me with your feedback and questions.

MVPDays Microsoft Flow Conference 2018 | How we use Microsoft Flow and PowerApps: Real cases scenarios | Video and slides are available

To finalize this series of shared session resources, at least for now, because I'm still waiting for other resources to become available, here is one event I even forgot to mention on my blog that I was speaking at: the MVPDays Microsoft Flow Conference 2018, which took place online on December 12 of last year (http://www.mvpdays.com/?page_id=11493). This was my first session about real case scenarios using PowerApps and Microsoft Flow.

About my session

Session Name: How we use Microsoft Flow and PowerApps: Real cases scenarios

01-Microsoft-Flow-PowerApps-Real-cases-scenarios

Session Overview: We know that any business problem can be solved with a variety of technologies and different solutions. However, developing those types of solutions has traditionally been too costly and time-consuming for many of the needs teams and departments face, especially for projects that are for internal use within organizations or only needed for a short time period. As a result, many of these projects or solutions stay on the shelf, or only in the imagination of the collaborators.

They are in Dynamics 365, Office 365, on premises, on the cloud… they are everywhere, and they are fantastic! Developers can do it; IT can do it… you can do it!

Microsoft Flow and PowerApps, sometimes together and sometimes in isolation, are here to help you, and in this session we will show you real live scenarios of how we use these two technologies at our customers and internally at DevScope.

  Microsoft Flow and PowerApps: Real cases scenarios

Slides and Video

MVPDays Microsoft Flow Conference 2018 | How we use Microsoft Flow and PowerApps: Real cases scenarios

The post MVPDays Microsoft Flow Conference 2018 | How we use Microsoft Flow and PowerApps: Real cases scenarios | Video and slides are available appeared first on SANDRO PEREIRA BIZTALK BLOG.

Integration User Group | The NoS-addin – your (free) BizTalk Dev buddy! | Video and slides are available

Another day, another resource shared! This time it's my session delivered at the Integration User Group, also known as Integration Monday, about the BizTalk Server NoS add-in – BizTalk NoS Ultimate – a Visual Studio add-in for BizTalk developers that aims to improve the experience of developing BizTalk projects. It is an extension to Microsoft Visual Studio that offers lots of useful functionality, mainly for developers, with which BizTalk users can save valuable time in their day-to-day activities and improve productivity.

About my session

Session Name: The NOS-addin – your (free) BizTalk Dev buddy!

BizTalk Server NoS Ultimate add-in BizTalk Dev Buddy

Session Overview: The NOS add-in is a tool specifically developed for BizTalk developers. It contains all kinds of features to make the life of a BizTalk developer easier and thereby less time-consuming.

In this session, I will show the different capabilities of this tool.

BizTalk Server NoS Ultimate add-in BizTalk Dev Buddy

About Integration User Group

The Integration User Group aims to educate, evangelize, and inform the community about various integration technologies, and about how developers and architects can share and learn about the evolving integration and messaging capabilities of the Microsoft platform.

Website: http://www.integrationusergroup.com/

The post Integration User Group | The NoS-addin – your (free) BizTalk Dev buddy! | Video and slides are available appeared first on SANDRO PEREIRA BIZTALK BLOG.

Group Email using SMTP Notification Channel in BizTalk360

Monitoring and alert notification are among the core functionalities of BizTalk360. To improve usability, in the upcoming release v9.0 we are enhancing alert notification by adding the option to send alert notifications to a group of email ids. We always listen to our customers' voice and take customer feedback into account when picking the features for every release; this feature, too, has been implemented based on customer feedback.

Email-Distribution-List-Feedback

This feature not only addresses the group email list; it also has the additional capability to add recipients based on the UP Alert and Auto Correct Alert, and an option to copy in people (CC) on notification emails. Those functionalities were also highly requested in customer feedback, as shown below.

Group-Email-for-UP-alert-Auto-Correct-alert

Adding-Email-CC-Feedback

We have addressed all the above feedback in one powerful feature called the "SMTP Notification Channel".

What is SMTP Notification channel in BizTalk360?

To send email notifications, you simply need to specify the recipients' email addresses. You can add multiple email id(s), separated by semicolons. The SMTP Notification channel emails are sent using the configured SMTP server.

SMTP-Notification-Channel-Configuration

The following table provides descriptions of the configuration fields and indicates whether they are required. The section that follows this table provides example configurations.

Field | Required? | Description
Email To | Required | Email address of the notification recipients for all types of alert
CC | Optional | Email address of the notification recipients to be copied (CC)
Up Alert | Optional | Email address of the notification recipients for Up alerts
Auto Correct Alert | Optional | Email address of the notification recipients for Auto Correct alerts

The user can create multiple email distribution lists by configuring multiple SMTP notification channels and mapping them to the same or different alarms, based on the business needs.

SMTP-Notification-Channel-Alarm-Mapping

Note: To receive alerts through the SMTP notification channel, the user needs to configure the SMTP settings in BizTalk360 (BizTalk360 -> Settings -> Monitoring and Notification -> SMTP). The SMTP Notification channel will take the server connection details from the configured SMTP settings.

SMTP-Settings-Configuration

BizTalk360 Alarm Configuration Methods

As you know, in BizTalk360 you can configure the alarm notifications in two ways: one is by configuring the alarm directly with email ids, and the other is by mapping notification channels (ServiceNow, Slack, PowerShell, Webhook, Microsoft Teams).

Now, in the notification channels, we have added an additional option for email configuration, called the SMTP Notification channel. This behaves the same as the native email configuration method and is very easy to use and maintain.

BizTalk360 native Email Configuration

In earlier versions of BizTalk360, a user could configure an alarm by directly providing the email id(s), as shown below.

Alarm-Email-Configuration

 

If the user wants to use the same recipients in another alarm, they need to copy the email addresses and paste them into the new alarm, or type them again manually. This is a very time-consuming and tedious process when you want to configure multiple email ids.

Also, in earlier BizTalk360 versions, there was no option to group the alerts, like the UP Alert and AutoCorrect Alert, or to add a CC for the admin or any other recipients. To overcome all of this, we have introduced the "SMTP Notification Channel".

Cons of using the native Email configuration

  • It's hard to configure multiple recipients for multiple alarms
  • Any user can change the recipients of any alarm, which is a security risk
  • All alerts (Up alerts, Down alerts, AutoCorrect alerts) are sent to all the configured emails
  • No CC option was available

Grouping Email using SMTP Notification Channel

Using the SMTP Notification channel, a professional can effortlessly create multiple email contact groups and effectively send, or automate sending, BizTalk360 alerts to thousands of recipients in multiple email groups at the same time. In addition, they can group the email recipients based on the type of alert, like the Up Alert and Auto Correct Alert.

Pros of using the SMTP Notification channel

  • It is an easy and effective way to create and manage multiple mailing lists
  • Users can group the email recipients according to their business needs, based on alerts like the UP alert and Auto Correct alert
  • It allows adding a CC to the alerts, which helps to notify an admin or other professional as per their business norms
  • Any user can use a configured SMTP channel for their respective alarms
  • Only an admin or Super user can create an SMTP Notification channel; other users cannot create channels, which improves security

How can we effectively use the SMTP Notification channel for our business?

Creating an Email Distribution List with the SMTP Notification Channel: Most customers use BizTalk360 for its monitoring capability. The user can monitor BizTalk artifacts through BizTalk360 alarms and gets notified when any artifact goes down. The user can configure any number of alarms for their business needs.

For instance, suppose a user is monitoring multiple BizTalk artifacts with more than 100 BizTalk360 alarms that share the same set of recipients. If the admin wants to add or remove a recipient, they need to go and manually change all the alarms, which is time-consuming. This becomes much simpler with the SMTP channel: it is enough to provide the email recipients only in the SMTP notification channel configuration. The same channel can then be used across all the alarms, just by enabling it. And if any recipient id changes, it's enough to change it in the SMTP notification channel, and it will be reflected in all the alarms. The user can also map multiple SMTP channels to multiple alarms, or to a single alarm, as below; this reduces the manual effort.

Mapping-Multiple-SMTP-Notification-channel-in-Alarm

Personalize alerts to notify different groups of users: As you know, users get notified through different types of BizTalk360 alerts, such as the Down Alert, UP Alert, AutoCorrect Alert, Regular Alert and Data Monitoring Alert. The Down Alert is triggered when the current state of an artifact is different from the expected state; the UP Alert is triggered when all the configured artifacts are in a healthy state; the AutoCorrect Alert is triggered when the system tries to rectify a violation, to make the BizTalk environment healthy.

In case an admin wants a Down alert to be sent to everyone in the team, but the Up Alert or Auto Correct alert to be sent only to specific members of the team, that can easily be achieved by configuring the selected recipient ids in the Up Alert/Auto Correct fields while configuring the SMTP channel.

UP-Alert-and-Down-Alert

Create the SMTP channel with the Email To field, which indicates the Down Alert recipients, and then configure the UP Alert and Auto Correct Alert recipients in the respective fields. Then map the channel to the respective threshold alarm. When the artifacts are down, the alert will be triggered only to the configured recipients, and the same applies for the UP Alert and AutoCorrect Alert.

Send a copy to different users: Within an organization, a user may need to provide a copy of an alert to the admin or some other authority. In this case, they can configure the CC in the SMTP channel, and the alert will also be sent to the admin.

Customizing the SMTP Notification Channel: The SMTP Notification channel can be customized based on business requirements using the GitHub project file. For instance, if the user wants to send the Data Monitoring alerts to a specific email group, this can be achieved by adding a piece of code to the GitHub project file. For the SMTP Notification channel, the data is retrieved based on the alert notification type.

Notification-Type

You need to perform the following steps:

  • Add a field to take input for the Data Monitoring alert emails in the global properties XML file.

<TextArea Name="DataMonitoring-Alert" DisplayName="Data Monitoring Alert Email" IsMandatory="false" Tooltip="Only Data Monitoring alerts will be triggered to the configured Email Id(s)" DefaultValue="" Value="" Type="email" ValidationPattern="^([\w+-.%]+@[\w-.]+\.[A-Za-z]{2,}(\s*;?\s*)*)+$"/>
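Note that the blog engine tends to strip backslashes, so the validation pattern should read `^([\w+-.%]+@[\w-.]+\.[A-Za-z]{2,}(\s*;?\s*)*)+$` with the `\w` and `\s` escapes restored. As a quick sanity check, the same semicolon-separated email pattern can be exercised in a small snippet (the class name and sample addresses below are illustrative only):

```java
import java.util.regex.Pattern;

public class EmailListCheck {
    // Same pattern as the ValidationPattern attribute, with the escapes
    // restored: one or more addresses separated by optional semicolons.
    static final Pattern EMAIL_LIST =
            Pattern.compile("^([\\w+-.%]+@[\\w-.]+\\.[A-Za-z]{2,}(\\s*;?\\s*)*)+$");

    static boolean isValidList(String value) {
        return EMAIL_LIST.matcher(value).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidList("ops@example.com; admin@example.com")); // true
        System.out.println(isValidList("not-an-email"));                       // false
    }
}
```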

  • Get the email id(s) and assign them to another variable in the SMTPChannel.cs file, as shown:

Get-Email-Id(s)-in-SMTPChannel.cs

  • Add the condition for the Data Monitoring alert, as shown:

Condition-For-DataMonitoring

Then the user can add a separate group of emails for the Data Monitoring alert, as below.

UI-With-Data-Monitoring-Field

Wrap Up

A mailing list or group email can be a blessing of a feature for businesses of any size. The SMTP Notification channel allows you to reach out to as many people as you want without having to re-write or copy-paste the same email multiple times.

Stay tuned!!

Conclusion

If you plan on creating and sending group emails for an organization or a business purpose, then the SMTP Notification channel will ensure a productive, scalable and goal-driven approach to your group email campaigns.

 

The post Group Email using SMTP Notification Channel in BizTalk360 appeared first on BizTalk360.

Integration Down Under | How we are using Microsoft Integration features and related Azure technologies to improve our processes | Video and slides are available

I may be writing on my blog less frequently than I usually do (this will change soon), but this year is being very productive in terms of lectures: 4 in 4 months, and more to come:

  • 3/30/2019 – Real case implementations using Azure Logic Apps and/or Microsoft Flows at Global Integration Bootcamp Madrid
  • 2/14/2019 – Integration Down Under | February 14, 2019 | How we are using Microsoft Integration features and related Azure technologies to improve our processes
  • 2/4/2019 – The NOS-addin – your (free) BizTalk Dev buddy! at Integration Monday
  • 1/30/2019 – XLVIII Porto.Data Community Meeting | How we use Microsoft Flow and PowerApps: Real cases scenarios

Now it is time to share some resources, before I start writing about other things and completely forget about this.

About my session

Session Name: How we are using Microsoft Integration features and related Azure technologies to improve our processes

Microsoft Integration features Session

Session Overview: In this session, I will show you real live scenarios of how we at DevScope are using Microsoft Integration features (Logic Apps, API Management, APIs) and related Azure technologies like PowerApps, Flow and Power BI to:

  • First, improve our internal processes, like expense reports, time reports and so on;
  • And, secondly, show how the first step helps us extend our product and our business by exporting these same approaches and concepts to our clients
Microsoft Logic Apps and SmartDocumentor-Expenses

This will be a lightweight talk addressing some real scenarios and showing them in action.

Integration Down Under – How we are using Microsoft Integration features and related Azure technologies to improve our processes

About Integration Down Under

Integration Down Under serves the Australian / New Zealand community interested in all things Microsoft integration. The group endeavors to hold regular webinar presentations, usually on the 2nd Thursday of each month. Organized by a panel of five Australian and New Zealand integration experts, its guest speakers include various Azure MVPs, members of the Microsoft product teams, and other prominent members of the Microsoft integration community.

Website: http://www.integrationdownunder.com/

Twitter: https://twitter.com/integration_du

YouTube channel: https://www.youtube.com/channel/UC5N-7y5XDeX0IY9mkssqRZQ

The post Integration Down Under | How we are using Microsoft Integration features and related Azure technologies to improve our processes | Video and slides are available appeared first on SANDRO PEREIRA BIZTALK BLOG.

Microsoft Integration Weekly Update: April 8, 2019

Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It's a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

 

Microsoft Announcements and Updates

 

Community Blog Posts

 

Videos

 

Podcasts

 

How to get started with iPaaS design & development in Azure?

Feedback

I hope this is helpful. Please feel free to reach out to me with your feedback and questions.

Connecting your Java microservices to each other? Here’s how to use Spring Cloud Stream with Azure Event Hubs.

You’ve got microservices. Great. They’re being continuously delivered. Neato. Ok … now what? The next hurdle you may face is data processing amongst this distributed mesh o’ things. Brokered messaging engines like Azure Service Bus or RabbitMQ are nice choices if you want pub/sub routing and smarts residing inside the broker. Lately, many folks have gotten excited by stateful stream processing scenarios and using distributed logs as a shared source of events. In those cases, you use something like Apache Kafka or Azure Event Hubs and rely on smart(er) clients to figure out what to read and what to process. What should you use to build these smart stream processing clients?

I’ve written about Spring Cloud Stream a handful of times, and last year showed how to integrate with the Kafka interface on Azure Event Hubs. Just today, Microsoft shipped a brand new “binder” for Spring Cloud Stream that works directly with Azure Event Hubs. Event processing engines aren’t useful if you aren’t actually publishing or subscribing to events, so I thought I’d try out this new binder and see how to light up Azure Event Hubs.

Setting Up Microsoft Azure

First, I created a new Azure Storage account. When reading from an Event Hubs partition, the client maintains a cursor. This cursor tells the client where it should start reading data from. You have the option to store this cursor server-side in an Azure Storage account so that when your app restarts, you can pick up where you left off.

There’s no need for me to create anything in the Storage account, as the Spring Cloud Stream binder can handle that for me.

Next, the actual Azure Event Hubs account! First I created the namespace. Here, I chose things like a name, region, pricing tier, and throughput units.

Like with the Storage account, I could stop here. My application will automatically create the actual Event Hub if it doesn’t exist. In reality, I’d probably want to create it first so that I could pre-define things like partition count and message retention period.

Creating the event publisher

The event publisher takes in a message via web request, and publishes that message for others to process. The app is a Spring Boot app, and I used the start.spring.io experience baked into Spring Tools (for Eclipse, Atom, and VS Code) to instantiate my project. Note that I chose “web” and “cloud stream” dependencies.

With the project created, I added the Event Hubs binder to my project. In the pom.xml file, I added a reference to the Maven package.

<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>spring-cloud-azure-eventhubs-stream-binder</artifactId>
    <version>1.1.0.RC5</version>
</dependency>

Now before going much farther, I needed a credentials file. Basically, it includes all the info needed for the binder to successfully chat with Azure Event Hubs. You use the az CLI tool to generate it. If you don’t have it handy, the easiest option is to use the Cloud Shell built into the Azure Portal.

From here, I did az account list to show all my Azure subscriptions. I chose the one that holds my Azure Event Hub and copied the associated GUID. Then, I set that account as my default one for the CLI with this command:

az account set -s 11111111-1111-1111-1111-111111111111

With that done, I issued another command to generate the credential file.

az ad sp create-for-rbac --sdk-auth > my.azureauth

I opened up that file within the Cloud Shell, copied the contents, and pasted the JSON content into a new file in the resources directory of my Spring Boot app.
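For reference, the credential file produced by `az ad sp create-for-rbac --sdk-auth` is a small JSON document of roughly this shape (the GUIDs and secret below are placeholders, not real values):

```json
{
  "clientId": "00000000-0000-0000-0000-000000000000",
  "clientSecret": "<generated secret>",
  "subscriptionId": "11111111-1111-1111-1111-111111111111",
  "tenantId": "22222222-2222-2222-2222-222222222222",
  "activeDirectoryEndpointUrl": "https://login.microsoftonline.com",
  "resourceManagerEndpointUrl": "https://management.azure.com/",
  "managementEndpointUrl": "https://management.core.windows.net/"
}
```

The binder reads this file (via the `spring.cloud.azure.credential-file-path` property) to authenticate against Azure, so treat it like any other secret and keep it out of source control.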

Next up, the code. Because we’re using Spring Cloud Stream, there’s no specific Event Hubs logic in my code itself. I only use Spring Cloud Stream concepts, which abstracts away any boilerplate configuration and setup. The code below shows a simple REST controller that takes in a message, and publishes that message to the output channel. Behind the scenes, when my app starts up, Boot discovers and inflates all the objects needed to securely talk to Azure Event Hubs.

@EnableBinding(Source.class)
@RestController
@SpringBootApplication
public class SpringStreamEventhubsProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringStreamEventhubsProducerApplication.class, args);
    }

    @Autowired
    private Source source;

    @PostMapping("/messages")
    public String postMsg(@RequestBody String msg) {
        this.source.output().send(new GenericMessage<>(msg));
        return msg;
    }
}

How simple is that? All that’s left is the application properties used by the app. Here, I set a few general Spring Cloud Stream properties, and a few related to the Event Hubs binder.

 #point to credentials
spring.cloud.azure.credential-file-path=my.azureauth
#get these values from the Azure Portal
spring.cloud.azure.resource-group=demos
spring.cloud.azure.region=East US
spring.cloud.azure.eventhub.namespace=seroter-event-hub

#choose where to store checkpoints
spring.cloud.azure.eventhub.checkpoint-storage-account=serotereventhubs

#set the name of the Event Hub
spring.cloud.stream.bindings.output.destination=seroterhub

#be lazy and let the app create the Storage blobs and Event Hub
spring.cloud.azure.auto-create-resources=true

With that, I had a working publisher.

Creating the event subscriber

It’s no fun publishing messages if no one ever reads them. So, I built a subscriber. I walked through the same start.spring.io experience as above, this time ONLY choosing the Cloud Stream dependency. And then added the Event Hubs binder to the pom.xml file of the created project. I also copied the my.azureauth file (containing our credentials) from the publisher project to the subscriber project.

It’s criminally simple to pull messages from a broker using Spring Cloud Stream. Here’s the full extent of the code. Stream handles things like content type transformation, and so much more.

@EnableBinding(Sink.class)
@SpringBootApplication
public class SpringStreamEventhubsConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringStreamEventhubsConsumerApplication.class, args);
    }

    @StreamListener(Sink.INPUT)
    public void handleMessage(String msg) {
        System.out.println("message is " + msg);
    }
}

The final step involved defining the application properties, including the Storage account for checkpointing, and whether to automatically create the Azure resources.

 #point to credentials
spring.cloud.azure.credential-file-path=my.azureauth
#get these values from the Azure Portal
spring.cloud.azure.resource-group=demos
spring.cloud.azure.region=East US
spring.cloud.azure.eventhub.namespace=seroter-event-hub

#choose where to store checkpoints
spring.cloud.azure.eventhub.checkpoint-storage-account=serotereventhubs

#set the name of the Event Hub
spring.cloud.stream.bindings.input.destination=seroterhub
#set the consumer group
spring.cloud.stream.bindings.input.group=system3

#read from the earliest point in the log; default val is LATEST
spring.cloud.stream.eventhub.bindings.input.consumer.start-position=EARLIEST

#be lazy and let the app create the Storage blobs and Event Hub
spring.cloud.azure.auto-create-resources=true

And now we have a working subscriber.

Testing this thing

First, I started up the producer app. It started up successfully, and I can see in the startup log that it created the Event Hub automatically for me after connecting.

To be sure, I checked the Azure Portal and saw a new Event Hub with 4 partitions.

Sweet. I called the REST endpoint on my app three times to get a few messages into the Event Hub.

Now remember, since we’re dealing with a log versus a queuing system, my consumers don’t have to be online (or even registered anywhere) to get the data at their leisure. I can attach to the log at any time and start reading it. So that data is just hanging out in Event Hubs until its retention period expires.

I started up my Spring Boot subscriber app. After a couple moments, it connected to Azure Event Hubs, and read the three entries that it hadn’t ever seen before.

Back in the Azure Portal, I checked and saw a new blob container in my Storage account, with a folder for my consumer group, and checkpoints for each partition.

If I sent more messages into the REST endpoint, they immediately appeared in my subscriber app. What if I defined a new consumer group? Would it read all the messages from the beginning?

I stopped the subscriber app, changed the application property for “consumer group” to “system4” and restarted the app. After Spring Cloud Stream connected to each partition, it pumped out whatever it found, and responded immediately to any new entries.

Whether you’re building a change-feed listener off of Cosmos DB, sharing data between business partners, or doing data processing between microservices, you’ll probably be using a broker. If it’s an event bus like Azure Event Hubs, you now have an easy path with Spring Cloud Stream.

APM (Application Performance Monitoring/Management) Integration in BizTalk360

Introduction

BizTalk360 already has the capability to integrate with New Relic, through which users can get insights into real-time performance data.

Application Performance Management, or Application Performance Monitoring (APM), is an essential practice to help manage and monitor the performance of an application. In today's market, a wide range of tools is available to optimize and monitor the performance of an application. AppDynamics is a well-known, leading application performance monitoring tool when it comes to APM. Knowing the importance of Application Performance Monitoring, BizTalk360 provides integration with AppDynamics from v8.9.6.

Why do we integrate AppDynamics in BizTalk360?

Large organizations invest quite a lot in complex networks for optimizing and monitoring the performance of apps and related issues. AppDynamics is widely used by companies as an enterprise-wide monitoring solution, and it has the capability to provide deep performance analytics of your configured environment. Considering the importance of monitoring the performance of a BizTalk Server environment in a single place, a few of our customers requested us to integrate AppDynamics into BizTalk360.

If you are already using AppDynamics, from BizTalk360 v8.9.6 you can view the performance metrics of the BizTalk Server environment across multiple widgets in the AppDynamics dashboard.

BizTalk360 Analytical Data in AppDynamics

Initially, BizTalk360 provides the capability to push BizTalk Server Analytics information to AppDynamics. Some of the important BizTalk environment performance metric categories are:

  • BizTalk Server Health
  • Host Performance
  • Messaging
  • SQL Server
  • Throttling

All metrics have different counters, which are constantly being collected and pushed over to the AppDynamics Controller.

BizTalk Server Health metrics include:

  • CPU Usage
  • Memory Usage
  • Disk Free Space
  • Average Disk Queue Length

Message Performance

  • Documents Receive/Second
  • Documents Processed/Second
  • Inbound Latency (Sec)
  • Outbound Latency (Sec)
  • Outbound Adapter Latency (Sec)

Host Performance

  • Host Instance performance by CPU and Memory

Throttling Performance

  • Message delivery Throttling State
  • Message Publishing Throttling State
  • Message Delivery Outgoing Rate
  • Message Delivery Incoming Rate
  • Active Instance Count

How does BizTalk360 connect with AppDynamics

Every monitoring tool has a different core architecture. On the surface they might look similar, but when we look at them in detail, it becomes clear how differently all the monitoring tools work.

AppDynamics supports different development languages; we are using .NET, because BizTalk360 is built on top of the .NET Framework.

AppDynamics provides a piece of software called the Agent, which is installed on the server running the application that needs to be monitored. The Agent collects metrics and sends them to a server called the Controller. The Controller processes the metrics and presents them via a web browser.

The BizTalk360 Analytics service includes a sub-service called "AppDynamics", which is responsible for constantly pushing BizTalk Server performance data to the Agent. The AppDynamics sub-service executes every 70 seconds and checks for data in the performance data service (another BizTalk360 Analytics sub-service). The Controller, which processes the data, makes it available to the user through a web browser.
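The push loop described above can be sketched as a simple fixed-interval scheduler. This is an illustrative sketch only, not BizTalk360's actual implementation; the class and method names are hypothetical:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of a fixed-interval metrics sub-service, similar in
// spirit to the AppDynamics sub-service described above (which runs every
// 70 seconds). The real service reads the performance data store and hands
// new data to the AppDynamics agent; here we only count the runs.
public class MetricsPoller {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    final AtomicInteger pushes = new AtomicInteger();

    // One cycle: check the performance-data store and push anything new.
    void collectAndPush() {
        pushes.incrementAndGet();
    }

    // Run collectAndPush on a fixed interval (70_000 ms in the product's case).
    void start(long intervalMillis) {
        scheduler.scheduleAtFixedRate(this::collectAndPush, 0, intervalMillis,
                TimeUnit.MILLISECONDS);
    }

    void stop() {
        scheduler.shutdownNow();
    }
}
```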

BizTalk360 collects the BizTalk Server analytical data, and the AppDynamics Agent constantly picks up this data, with the help of Windows performance counter data, and sends it over to the Controller. In AppDynamics, this data is available under the custom metrics in the Metrics Browser of each application.

 

Configuration of the AppDynamics Agent

For configuring the AppDynamics agent, you need to download and install the .NET Agent (32/64 bit) on the application server (where the BizTalk360 Analytics service is running). Once the installation is successful, the .NET Agent creates the coordinator service and a configuration file which contains the controller and application details.

The default location of the machine agent configuration file is located at:

For Windows Server 2008 or later: %ProgramData%\AppDynamics\DotNetAgent

BizTalk360 collects all the BizTalk Server analytical data, assigns it to Windows performance counters, and updates the performance counters in the AppDynamics config file, as below.

This data is collected by the .NET Agent coordinator service, which passes it to the respective Controller as the custom metrics mentioned in the agent configuration file.

To get the latest performance metrics/counters, the user needs to restart the BizTalk360 Analytics service; by doing this, the new counters will be updated in the AppDynamics config file. The Agent then collects the newly introduced performance counter values and starts updating them in the Metric Browser of the respective application configured in AppDynamics.

Steps to be performed after configuring the AppDynamics .NET Agent

  • By default, when you install or upgrade BizTalk360, the AppDynamics sub-service will be in the paused state. You need to manually start the service by navigating to BizTalk360 Settings -> Analytics Health -> Analytics Service Status -> AppDynamics.

  • Ensure performance data collection is enabled in Manage Analytics, in the BizTalk360 settings, so the analytics data is collected.

Custom Metrics in AppDynamics

Once the AppDynamics agent coordinator service starts pushing data, all the metrics will be available under the metric browser of the respective application in AppDynamics.

  • Once the data is populated, you can create dashboards consisting of different metrics data, as shown below.

BizTalk Server High Availability

To maximize the uptime of a BizTalk Server solution, it is important to monitor the availability of the BizTalk Server environment. By enabling the performance counters in the Analytics section, the BizTalk360 Analytics service will start pushing data to AppDynamics with the help of the Agent coordinator. All data is segregated based on server name and the corresponding metrics and counters.

Multiple Environments

You can configure and manage multiple BizTalk environments in BizTalk360. For adding multiple BizTalk environments, please refer to this link.

For collecting the performance values of the configured environments, just enable performance data collection for each server by selecting the respective server name from the dropdown, as below.

Environment1

Environment2

Once the Analytics service starts collecting the performance data and pushing it to the Agent coordinator, you can monitor it in the AppDynamics dashboard for both environments.

BizTalk360 High Availability

BizTalk360's monitoring services and user interface can be installed in more than one place, which makes BizTalk360 highly available. It is important to monitor BizTalk Server and maximize the uptime of BizTalk360. By default, the BizTalk360 high availability services are available as active and passive on the installed servers, making sure BizTalk360 stays healthy (always up).

To make the AppDynamics data collection highly available as well, you have to install the AppDynamics .NET Agent on the machines where the BizTalk360 Analytics services are installed. Then, whenever the Analytics service changes its availability (active/passive), it will still collect the data and push it to AppDynamics.

Conclusion

With its latest release, v8.9.6, BizTalk360 brings the capability to push BizTalk Server analytical performance data to AppDynamics for optimizing and monitoring BizTalk Server. If you have any feedback or suggestions, please write to us at support@biztalk360.com. You can get started with the AppDynamics integration by downloading the 30-day free trial of BizTalk360.

The post APM (Application Performance Monitoring/Management) Integration in BizTalk360 appeared first on BizTalk360.