Microsoft Flow Buttons Integration


Microsoft supports a number of external services, such as BTTN and Flic, that can trigger a Microsoft Flow at the press of a button. BTTN is a physical button made by The Button Corporation, while Flic is an offering from Shortcut Labs. In this blog, we will take a look at how to trigger a Microsoft Flow by integrating with the BTTN service. Credits to the Microsoft team for offering these hooks to connect to external services.

Scenario

To explain the integration with Microsoft Flow buttons, let’s consider this scenario. We will have a virtual BTTN named Conference Room 1 Bttn which is specific to a conference room at the Contoso company. The purpose of this button is to seek immediate assistance from the IT Help desk team at Contoso for any issues in the conference room (such as WiFi not working, needing an extra VGA/HDMI cable connector, etc.). When the virtual Bttn is pressed, a Microsoft Flow runs in the background that automatically sends an email to the IT Help desk team and notifies them on their Microsoft Teams channel. When an agent picks up the case, the person who triggered the Bttn is notified (via email) about the status of the ticket.

What is BTTN? How to get my BTTN?

BTTN (or BT.TN) is a very simple internet user interface: a button that triggers a specific action when pressed. You can imagine BTTN as a physical button (Ex., the emergency red button on train engines or big machinery) wired up to an action of your choice.

BTTN is available as a physical button for different purposes, such as requesting a laundry pickup in one click, calling a taxi, or seeking assistance when something is not right in the conference room. You can purchase these physical buttons from here.

BTTN has the built-in capability to integrate with the following services – Facebook, Twitter, Email, SMS, Microsoft Flows, IFTTT, Zapier, and HTTP.

Virtual BTTN

BTTN Corporation also has a virtual button (a smart cloud service) that has the capability to execute an action on one of the above-mentioned integrations in a couple of mouse clicks without having to write any code.

Getting your free Bttn

You can easily claim your virtual bttn from here. The virtual bttn is free to use for 30 days. Simply complete the cart payment step (of course without spending any money!) and you will see a confirmation message saying your virtual bttn will be delivered in 2 business days.

Registering your Bttn

However, it usually takes only about 6-8 hours for your virtual bttn to be delivered to your mailbox. The email contains a link to register your bttn. Enter the Bttn device ID and the authentication code, both of which are available in the registration link. These fields will be automatically populated if you click the link from your email. Accept the terms & privacy policy and click Register.

You will be prompted to create a password. Once done, log in with your email address and password. You will see a screen displaying your virtual bttn.

  • Click the Settings (gear) icon to change the name of your button to something meaningful. Ex., Conference Room 1 Button

  • Click the Set Action icon (Lightning icon) to define the trigger for the Bttn — what action should be performed when the bttn is clicked. Ex., when bttn is pressed, execute a Microsoft Flow to send email to technical support team requesting assistance at the conference room.

  • Click the Rocket Launcher icon to launch your virtual bttn. We will hold off on this step until we have created the Flow execution steps.

Creating the Microsoft Flow

Now that we have created the virtual Bttn, let’s move to the next step of creating the Microsoft Flow. Microsoft Flow has pre-built Flow templates readily available for BTTN integration. In the search screen, enter “Bttn” to filter the templates for Bttn.

We will use the predefined “Use Bttn to call technical support for meeting room” Microsoft Flow template for our scenario. Click the template.

To start using the template, you need to define the BTTN and authorize Microsoft Flow to be able to access the Bttn. Similarly, you need to configure Office 365 Outlook to be able to trigger emails when the Bttn is pressed.

Next, we will define the Flow steps to match our scenario.

Step 1: Trigger to execute when the Bttn is pressed

Step 2: Notification to Microsoft Teams Channel of IT Help desk team

Step 3: Trigger email with options (Acknowledge/Reject) to the support engineer

Step 4: Condition to execute depending on the option chosen by the support engineer. The ‘Yes’ block will execute if the support person acknowledges the request.

That’s it! You have now configured the Flow required for the scenario. Let’s see how the scenario will work in real time.

Demo

To get the scenario working, the first step is to trigger the virtual button. Navigate to your button home page and click the Rocket Launcher icon to launch your virtual bttn. You will see a new tab opening with a big red button (which is your virtual bttn that will trigger the Microsoft Flow that we created earlier). The status of the bttn will be in an Idle state.

Press the virtual Bttn once to trigger the Microsoft Flow. You will notice a yellow light glowing around the button, which signifies that the Flow is executing. The status of the bttn will now be “Processing”. Within the next few seconds, the IT Help desk team will first receive the notification on the Microsoft Teams channel with information about the issue.

The IT Help desk team will also receive an email with acknowledgment options.

When the Acknowledge button is pressed, the Flow proceeds to the condition step and executes the ‘Yes’ block, which sends a confirmation email back to the registered bttn email address.

Once the Flow is complete, you will notice that the status of the Flow button will be Green with the status as Positive Result.

Flow Run History and BTTN Logs

You can take a look at the Run History to view the list of runs of the flow and time taken to execute the flow.

Alternatively, you can take a look at the diagnostics information from the Bttn perspective through the Log section on your bttn homepage.

I hope you got a fair idea of how you can take advantage of Microsoft’s investment in integrating different services, which opens up a plethora of opportunities to connect different systems. One small modification to this scenario would be to log the issues directly into the ServiceNow portal (if your organization uses ServiceNow to keep track of issues). You can use the HTTP + Swagger action and define APIs for ServiceNow so that you can log tickets into ServiceNow directly from Microsoft Flow. Kent Weare demonstrated this scenario on Episode 27 of Middleware Friday. You can watch the video recording of the session here –

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

BizTalk Assessment: .NET Runtime Optimization Service causing high CPU usage


It is very important to optimize everything possible so that BizTalk Server achieves the best performance it can, especially if you are dealing with BizTalk Server DEV environments on Azure using Azure Virtual Machines. As I explained in a previous post (here), we normally tend to be careful in choosing the VM disk type and VM size, mostly because of the price, so tuning the machine is very important; otherwise the machines tend to be a little slow, and even the most insignificant things (or so we think) can have a huge impact on BizTalk Server performance. One of the cases that I found recently was the .NET Runtime Optimization Service causing high CPU usage…

.NET Runtime Optimization Service causing high CPU usage

… Indeed, and it needs no explanation that anything that drives CPU load (or usage) high inherently has a huge impact on BizTalk Server performance.

The .NET Framework is regularly optimized by Microsoft using the .NET Runtime Optimization Service to make sure you get the best out of the system libraries, and the same is done for your installed managed applications. The service is, more or less, responsible for precompiling .NET assemblies in the background; once it is done, it goes away. And while you may see CPU usage climb to almost 100%, the compilation happens in a process with low priority, so it tries not to steal the CPU from the work you are doing. Once everything is compiled, assemblies can share pages across different processes, and warm startup is typically much faster.

Again, the .NET Runtime Optimization Service goes away once it finishes its work, but in my case the process had already been running for several hours. So, instead of waiting for the computer to idle, we can force it to compile everything now, solving this “issue” in a fast way (a couple of minutes). For that you should:

  • Open a command prompt as administrator. To do this, press the “Windows key” to open the Start menu, type “cmd” in the Start Search box, right-click “Command Prompt” and select the “Run as administrator” option from the context menu.
  • Change to the following folder: “C:\Windows\Microsoft.NET\Framework\v4.0.30319” or “C:\Windows\Microsoft.NET\Framework64\v4.0.30319” (on a 64-bit operating system)
  • Type “ngen.exe executequeueditems”, and press ENTER
  • Wait until the operation finishes; you should receive the following message at the command prompt: “All compilation targets are up to date.”
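For quick reference, the whole sequence boils down to two commands (a sketch assuming a 64-bit operating system and the default .NET 4.x folder; use the Framework folder instead of Framework64 on 32-bit):

cd /d C:\Windows\Microsoft.NET\Framework64\v4.0.30319
ngen.exe executequeueditems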

This will process all pending work. Once this is done, the service will shut down, as it has nothing else to do, and you will see the CPU usage return to normal.

.NET Runtime Optimization Service causing high CPU usage fixed

This is important for the administration team (or DevOps team) to be aware of while planning your .NET Framework updates, to make sure that everything runs smoothly, especially in the production environment; otherwise this can be one of the reasons (at least for a few minutes or hours) why your production applications are running a little slowly.

There is also a PowerShell script provided by the Microsoft .NET team to speed up the optimization service: DrainNGenQueue.ps (see more here)

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

How to import a Postman collection into your Azure API Management


I have been working with API Management (APIM) for a long time, not only playing around with POCs and testing new features but actually working on and implementing real solutions for clients. I have also delivered several API Management talks here in Portugal, but I realized that I had never written a blog post about this topic. That will change today hehe.

OpenAPI Specification (formerly Swagger Specification) is an API description format for REST APIs. An OpenAPI file allows you to describe your entire API, including:

  • Available endpoints (/users) and operations on each endpoint (GET /users, POST /users)
  • Operation parameters: input and output for each operation
  • Authentication methods
  • Contact information, license, terms of use and other information.

API specifications can be written in YAML or JSON. The format is easy to learn and is both human-readable and machine-readable.
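To make this concrete, here is a minimal Swagger 2.0 (JSON) description of a hypothetical Users API, just enough to show endpoints and operations:

{
  "swagger": "2.0",
  "info": { "title": "Users API", "version": "1.0" },
  "paths": {
    "/users": {
      "get": { "summary": "List users", "responses": { "200": { "description": "OK" } } },
      "post": { "summary": "Create a user", "responses": { "201": { "description": "Created" } } }
    }
  }
}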

This specification is widely used these days, especially in the Microsoft community, but unfortunately not everyone uses it, and there are still quite a few APIs that do not expose any kind of specification/documentation such as Swagger, WADL or even WSDL (the last one for web services).

For REST APIs, I have seen several suppliers/partners use Postman Collections to easily test their APIs by emulating requests, but also to document the APIs by adding names and descriptions to requests, sample requests, headers, and other metadata, performing unit tests on all the operations and sharing them within their teams.

Postman Collections are groups of requests that can be run together as a series of requests, against a corresponding environment. Using scripts, you can build integration test suites, pass data between API requests, and build workflows that mirror your actual use case of APIs.

import a Postman: Postman Collection Sample

See how to create a Postman Collection here: Creating collections
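For context, a stripped-down Postman Collection (v2.1 format) looks roughly like this (a sketch; the API name and URL are made up):

{
  "info": {
    "name": "Users API",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "List users",
      "request": { "method": "GET", "url": "https://api.example.com/users" }
    }
  ]
}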

I have to say, I love Postman, but I had never used it for that purpose. Although it is an excellent tool, in my opinion it has some limitations when it comes to documenting an API.

And in one of my recent projects, the partner delivered me a Postman Collection as a way of documenting their REST API, which led me to a “small” problem:

  • I didn’t want to replicate or manually expose the API inside the API Management…
  • and the API Management by default doesn’t allow me to import a Postman Collection

So, how can I import a Postman collection into my Azure API Management?

Fortunately for us, there are several free tools that allow you to convert (or transform) a Postman Collection (v1.0 – v2.0) into a format of your choice, in our case Swagger. One of these tools is APIMATIC.

For that you just need to:

  • Access the APIMATIC API Transformer page: https://apimatic.io/transformer
  • Upload your Postman Collection (API description) file, or
  • Provide a file description URL
  • And select the desired output format; in our case we selected “Swagger v2.0 (JSON)”

import a Postman: APIMATIC API Transformer Postman Collection to Swagger

  • And finally, click “Convert Now”

The output file will be automatically downloaded to your machine.

import a Postman: APIMATIC API Transformer Postman Collection to Swagger output

Now that I have my Swagger file, I can access my API Management instance in the Azure Portal and then:

  • From the left pane, select “Add API” and then select the “OpenAPI specification” option

import a Postman: Azure API Management Add API

  • Import the file I had created previously in the APIMATIC web tool and click “Create”

import a Postman: Azure API Management Add API OpenAPI

It probably doesn’t do everything for you; you may need to optimize the query parameters or the URL template parameters, but at least this approach will do 90% of the work for you:

import a Postman: Azure API Management Add API OpenAPI result

This approach saved me several working hours on the task of exposing the Partner API in my API Management Portal.

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Microsoft Integration Weekly Update: Aug 28


Do you find it difficult to keep up to date with all the frequent updates and announcements on the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

Feedback

Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


WCF-SQL Adapter: Connecting to the LOB system has failed. A network-related or instance-specific error occurred while establishing a connection to SQL Server


I have been delivering a lot of BizTalk Server training courses for developers and for administrators in the past years, which I normally like to call the “Developing BizTalk Server version Course” or the “BizTalk Server Operations, Administration, and Maintenance (OAM) Course” – yes I know, I am very creative in terms of names. One of the good things about using Azure Virtual Machines for that is that we can easily create several developer machines in a short period of time and manage them so they do not consume resources. One of the tasks that is present in both courses is configuring the BizTalk Server environment, because I believe that all BizTalk developers should know the basics of BizTalk Server in terms of installation, configuration and optimization – tasks that are normally done by BizTalk administrators. I can go further and say that, in my personal opinion, if you don’t know these basic concepts you are not truly a BizTalk developer, because many things can be done with configuration and not code.

One of these tasks is to install and configure the BizTalk Server LOB adapters, especially the SQL Server adapter, since on a standalone developer machine we will need to have SQL Server anyway, so it is easy to test this LOB adapter. However, if we create the Azure BizTalk Server 2016 developer machine, configure BizTalk Server, and then install and configure the LOB adapters without doing anything more, and we try to create a BizTalk Server solution using Visual Studio and generate SQL Server schemas:

  • In the Solution Explorer, right-click your project, click “Add”, and then click “Add Generated Items…”
  • In the “Add Generated Items – <Project Name>” dialog box, select “Consume Adapter Service”, and then click “Add”.
  • Select the sqlBinding and properly configure the URI

We will get the following error message:

Connecting to the LOB system has failed.

A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 – The remote computer refused the network connection.).

Connecting to the LOB system has failed: WCF-SQL Adapter

when we try to connect to the configured SQL Server.

Cause

This error may occur for several reasons: BizTalk Server cannot communicate with the SQL Server machine because of firewall restrictions, SQL Server does not accept remote connections, and so on.

However, in our case, we are talking about a standalone BizTalk machine that is trying to access a database in the local SQL Server.

Our problem was the default configuration of the SQL Server network protocols on the BizTalk Server developer image on Azure… it is actually set up wrongly!

Connecting to the LOB system has failed: SQL Server 2016 Configuration Manager protocols

As I described in my BizTalk Server installation and configuration tutorial (see here), one of the important steps is to configure the SQL Server network protocols, especially ensuring that TCP/IP is enabled and Shared Memory is disabled. You can see how to accomplish this using the SQL Server Configuration Manager tool here.

The actual problem causing the connection to the LOB system – in this particular case SQL Server – to fail is that the TCP/IP protocol is disabled.

Solution

To properly configure the protocols for SQL Server, especially the TCP/IP protocol, you should:

  • Press the “Windows key” to switch to the Metro UI, type “SQL Server 2016 Configuration Manager” and click the “SQL Server 2016 Configuration Manager” option in the Apps menu.
  • In the SQL Server Configuration Manager window, from the left-hand pane expand the “SQL Server Network Configuration” option and then click “Protocols for MSSQLSERVER”
  • Verify that both “TCP/IP” and “Named Pipes” are enabled;
  • If not, right-click the protocol, and then click “Enable”
  • Repeat to enable the other protocol if necessary.
  • Verify that “Shared Memory” is Disabled.
  • If not, right-click Shared Memory, and then click “Disable”
  • In the left-hand pane, click “SQL Server Services”, right-click “SQL Server (MSSQLSERVER)”, and then click “Restart”.

Connecting to the LOB system has failed: SQL Server 2016 restart

  • Click “OK” and then “Apply”, if prompted.
  • Close SQL Server Configuration Manager.

After correctly configuring the SQL Server protocols – again, especially TCP/IP – you will be able to successfully connect and generate the SQL Server schemas that you need for your BizTalk Server solution.
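As a quick sanity check (a hypothetical sketch, not part of the original steps), you can force a TCP connection to the local SQL Server from a small .NET client; if TCP/IP is still disabled, Open() throws the same network-related error shown above:

using System;
using System.Data.SqlClient;

class TcpConnectivityCheck
{
    static void Main()
    {
        // The "tcp:" prefix forces the TCP/IP protocol instead of Shared Memory,
        // which is the path taken when connecting to SQL Server over the network.
        const string connectionString =
            "Data Source=tcp:localhost;Initial Catalog=master;Integrated Security=True";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open(); // fails with the network-related error if TCP/IP is disabled
            Console.WriteLine("TCP/IP connection to SQL Server succeeded.");
        }
    }
}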

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Stef’s Monthly Update – August 2017


Summer holidays are over, and it is back to work. A few weeks later, back in the trenches, I learned a lot more about Azure Cosmos DB, Azure Search and the latest addition to the platform, Event Grid.

Month August

Microsoft launched a new service, Event Grid, to support serverless events with intelligent routing, providing uniform event consumption using a pub-sub model (similar to the pub-sub we know from BizTalk Server). Like some integration-minded folks, I have written three blogs about the service on my own blog:

Besides Event Grid, the Microsoft Pro Integration PG announced a Logic Apps Management (Preview) solution in OMS. I have tried out this service too and wrote a blog about it:

Based on the release of the OMS solution for Logic Apps, I delved into monitoring and operations a bit, and saw many monitoring solutions when it comes to a serverless integration solution in Azure. You can read about this topic on the BizTalk360 blog:

The last blog post was inspired by Saravana’s article on LinkedIn: Challenges Managing Distributed Cloud Applications

To conclude: managing a distributed cloud-native solution built from several Azure services is a challenge!

Codit

On the 1st of October I will join Codit. Why, you might ask? My year contract at Macaw ends on the 30th of September, and I realized that my skills, speaking engagements, passion and focus lie more with Azure, Integration and IoT. This fits with the Codit corporate strategy, plus I will have more than 100 integration-focussed sparring partners.

Books

This month I haven’t read much other than a book about the rise of robots. The message in this book was rather grim, leaving me with the feeling that almost no one will have a job in 10 years or so. A scary, fascinating read.

Music

My favorite albums in August were:

  • Sons Of Crom – The Black Tower
  • Thy Art Is Murder – Dear Desolation
  • Steven Wilson – To The Bone
  • Akercocke – Renaissance In Extremis
  • Leprous – Malina

Running

I decided I wanted to run another marathon and enrolled in the Tokyo Marathon at the end of February 2018. Therefore, I started running 4 to 5 times a week this month, and I am making good progress.

Next month will be my last with my current employer Macaw, before I start at Codit. Moreover, I will be speaking at the end of the month in Oslo for the Norwegian BizTalk User Group together with Eldert and Tomasso.

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field combining it with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.

My new Pluralsight course about coordinating Java microservices with Spring Cloud, is out!


No microservice is an island. Oh sure, we all talk about isolated components and independent teams. But your apps and services get combined and used in all sorts of ways. It’s inevitable. As you consider how your microservices interact with each other, you’ll uncover new challenges. Fortunately, there’s technology that’s made those challenges easier to solve.

Spring Boot is hot. Like real hot. At this point, it’s basically the de facto way for Java developers to build modern apps.

Spring Cloud builds on Spring Boot and introduces all sorts of distributed systems capabilities to your code. Last year I delivered a Pluralsight course that looked at building services with it. But that was only half of the equation. A big portion of Spring Cloud simplifies interactions between services. So, I set out to do a sequel course on Spring Cloud, and I am thrilled that the course came out today.


This course, Java Microservices with Spring Cloud: Coordinating Services, focuses on helping you build a more effective, maintainable microservices architecture. How do you discover services? Can you prevent cascading failures? What are your routing options? How does messaging play a role? Can we rethink data integration and do it better? All those questions get answered in this ~6 hour course. The six modules of the course include:

  1. Introducing Spring Cloud and microservices coordination scenarios. A chat about the rise of microservices, problems that emerge, and what Spring Cloud is all about.
  2. Locating services at runtime using service discovery. The classic “configuration management DB” can’t keep up with the pace of change in a microservices architecture. No, you need a different way to get a live look at what services exist, and where they are. Enter Spring Cloud Eureka. We take a deep look at how to use it to register and discover services.
  3. Protecting systems with circuit breakers. What happens when a service dependency goes offline? Bad things. But you can fail fast and degrade gracefully by using the right approach. We dig into Spring Cloud Hystrix in this module and look at the interesting ways to build a self-healing environment.
  4. Routing your microservices traffic. A centralized load balancer may not be the right fit for a fast-changing environment. Same goes for API gateways. This module looks at Spring Cloud Ribbon for client-side load balancing, and Spring Cloud Zuul as a lightweight microproxy.
  5. Connecting microservices through messaging. Message brokers offer one convenient way to link services in a loosely coupled way. Spring Cloud Stream is a very impressive library that makes messaging easy. Add competing consumers, partition processing, and more, whether you’re using RabbitMQ or Apache Kafka underneath. We do a deep dive in this module.
  6. Building data processing pipelines out of microservices. It’s time to rethink how we process data, isn’t it? This shouldn’t be the realm of experts, and require bulky, irregular updates. Spring Cloud Data Flow is one of the newest parts of Spring Cloud, and promises to mess with your mind. Here, we see how to do real-time or batch processing by connecting individual microservices together. Super fun.

It’s always a pleasure creating content for the Pluralsight audience, and I do hope you enjoy the course!





Why IIS Application Pool automatically stopped when trying to browse or invoke BizTalk WCF Services?


This week I received a call from one of my customers reporting a strange issue. Strange because everything had been running fine and suddenly, for no apparent reason, each time someone tried to invoke or browse the WCF services exposed by BizTalk Server, the IIS application pool configured for those services automatically stopped.

If we tried to browse the services we got the following error message:

Login failed for user ‘Domain/user’
Exception Details: System.Data.SqlClient.SqlException: Login failed for user ‘domain/user’

Application Pool automatically stopped : IIS Login failed for user

Of course, with that type of error description, the first thing we did was reenter the credentials for that user in the application pool, but… the same problem persisted.

When we investigated the event viewer for additional error details, we found the following message in the Security log:

Log Name:      Security
Source: Microsoft-Windows-Security-Auditing
Date: {date}
Event ID: 4625
Task Category: Logon
Level: Information
Keywords: Audit Failure
User: N/A
Computer: {machine name}
Description:
An account failed to log on.
Subject:
Security ID: SYSTEM
Account Name: {account name}
Account Domain: {domain name}
Logon ID: 0x3E7
Logon Type: 4
Account For Which Logon Failed:
Security ID: NULL SID
Account Name: {account name}
Account Domain: {domain name}
Failure Information:
Failure Reason: The user has not been granted the requested logon type at this machine.
Status: 0xC000015B
Sub Status: 0x0

Cause

So, at this point, we knew that this was a security policy problem, and we suspected that some automatic process created by the sysadmins was changing something in the Local Security Settings on the BizTalk Server machine.

After some research, we found out that the user account running the application pool – in which the BizTalk WCF services are exposed – requires the “Log on as a batch job” right.

Note: By definition, this security setting (Log on as a batch job) allows a user to be logged on by means of a batch-queue facility and is provided only for compatibility with older versions of Windows. For example, when a user submits a job by means of the task scheduler, the task scheduler logs that user on as a batch user rather than as an interactive user.

Solution

To grant a user the “Log on as a batch job” right, you need to:

  • Press the “Windows key” to open the Start menu.
  • Type “secpol.msc” and click on “Local Security Policy” option from the Search window.
  • On the left pane, select “Local Policies” and then “User Rights Assignment”

Application Pool automatically stopped : Local Security Policy log on as a batch job

  • Right-click “Log on as a batch job” and click “Properties”
  • Click “Add User or Group…” and add the desired user

Application Pool automatically stopped : Local Security Policy log on as a batch job Add User

  • Click “OK” and then “Apply”

Note: Of course, to perform the above steps you need to have administrator rights on the computer, or be part of specific domain groups in case the security settings policies are governed by AD domain groups.

Note: this problem can also occur with SharePoint, and the same solution can be applied.

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Routing with a custom Event Grid Topic


Event Grid Topic is a part of Event Grid, a new platform service which provides intelligent event routing through filters and event types. Moreover, it offers a uniform publish-subscribe model similar to the model of the BizTalk runtime. However, we are talking about events here, not messaging. Event Grid is a managed service in Azure with Service Fabric underneath. Some of the characteristics of Event Grid are discussed in one of Tom Kerkhove’s latest posts: Exploring Azure Event Grid.

Event Grid offers custom event routing capabilities with an Event Grid Topic. A Topic can be provisioned through the Azure Portal, and once the Topic becomes available you can hook it up with one or more subscribers.

Routing with Event Grid Topic

Custom events can be pushed to an Event Grid Topic, which can have multiple subscribers. A subscription is set on an Event Type and/or filters (Prefix, Suffix). Hence, a broadcast of a single event to multiple handlers can be accomplished, and each handler can operate on the event.

The consumption of the events will be through the custom Event Grid Topic as shown above. Furthermore, consumers can currently be a Function, a Logic App, a WebHook or Azure Automation. The subscription mechanism for Logic Apps and Functions is through WebHooks, which I will elaborate on under event subscribers.

Custom Events to Event Grid Topic

Custom events need to adhere to a schema, which includes five mandatory string properties and a required data object. A .NET client, for instance, can leverage the System.Net.Http namespace using an HttpClient. To be able to send a custom event with a .NET client to the Event Grid Topic, you’ll need to know the endpoint (URL) and the SAS key.

Let me explain. First of all, an Event Grid Topic requires either SAS-key or key authentication; the latter is easier to implement in a .NET client. Hence, you add a default request header with the key “aeg-sas-key” and, as its value, key1 found in the Azure Event Grid Topic Overview.

To actually send the event, you can use the PostAsync method. This method of the HttpClient requires the content (event data) and the endpoint URL, which can also be found in the Azure Event Grid Topic Overview.
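Putting those two pieces together, a minimal .NET client could look like the sketch below (the endpoint and key are placeholders you copy from the Topic’s Overview blade; Json.NET is used for serialization, which is an assumption, as any JSON serializer would do):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

class Program
{
    // Placeholders: both values come from the Event Grid Topic Overview blade.
    const string TopicEndpoint = "https://sampletopic.westus2-1.eventgrid.azure.net/api/events";
    const string SasKey = "<key1>";

    static void Main() => SendEventAsync().GetAwaiter().GetResult();

    static async Task SendEventAsync()
    {
        // One event, shaped like the schema discussed in this post:
        // the mandatory string properties plus a Data object.
        var events = new[]
        {
            new
            {
                Id = Guid.NewGuid().ToString(),
                Subject = "WindDetails",
                EventType = "WindSpeedEvent",
                EventTime = DateTime.UtcNow.ToString("o"),
                Data = new { WindSpeed = 6.2, Beaufort = 0, Type = "WindDetails", Location = "Amsterdam" }
            }
        };

        using (var client = new HttpClient())
        {
            // Key authentication: the Topic validates this header against key1/key2.
            client.DefaultRequestHeaders.Add("aeg-sas-key", SasKey);

            var content = new StringContent(
                JsonConvert.SerializeObject(events), Encoding.UTF8, "application/json");

            var response = await client.PostAsync(TopicEndpoint, content);
            Console.WriteLine(response.StatusCode); // 200 OK means the event was accepted
        }
    }
}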

Event content

The content has to follow the event schema. The payload could therefore look like:

[
  {
    "Data": {
      "WindSpeed": 6.2,
      "Beaufort": 0,
      "Type": "WindDetails",
      "Location": "Amsterdam",
      "Lat": 52.373888888888892,
      "Lng": 4.889444444444444
    },
    "Id": "a72f1473-d763-43c0-ad49-13dacf9158d3",
    "Subject": "WindDetails",
    "EventType": "WindSpeedEvent",
    "EventTime": "2017-08-24T14:32:15.5814874Z"
  }
]

Here you can see the event details (Data) and the Id, Subject, EventType and EventTime mandatory string properties. You might ask yourself now: there are only four string properties and one data object, so where is the topic property? Once the event above is published to the Event Grid Topic, the topic property is added to the event:

[
  {
    "Data": {
      "WindSpeed": 6.2,
      "Beaufort": 0,
      "Type": "WindDetails",
      "Location": "Amsterdam",
      "Lat": 52.373888888888892,
      "Lng": 4.889444444444444
    },
    "Id": "a72f1473-d763-43c0-ad49-13dacf9158d3",
    "Subject": "WindDetails",
    "EventType": "WindSpeedEvent",
    "EventTime": "2017-08-24T14:01:57.4354747Z",
    "topic": "/subscriptions/0bf166ac-9aa8-4597-bb2a-a845afe01415/resourceGroups/RG_EventGridSample/providers/Microsoft.EventGrid/topics/SampleTopic"
  }
]

The Event Grid uses HTTP response codes to acknowledge receipt of events. Hence the event above, sent to a custom event topic, will produce the following response:

{StatusCode: 200, ReasonPhrase: 'OK', Version: 1.1, Content: System.Net.Http.StreamContent, Headers:
{
x-ms-request-id: e6c1fbf3-f295-49b3-ad13-b26c22b60313
Date: Thu, 24 Aug 2017 14:41:19 GMT
Server: Microsoft-HTTPAPI/2.0
Content-Length: 0
}}

The HTTP status code is 200, which is OK (event delivered). In addition, I suggest you read Event Grid message delivery and retry.

Event Subscribers

A custom Event Grid Topic can have one or more subscriptions (event handlers). A subscription is an instruction for the Topic that tells it “I want this event”. In addition, the instruction can contain filters (prefix, suffix) and/or an Event Type. The Event Grid itself supports multiple subscriber types, such as WebHooks, and depending on the subscriber type, the Event Grid has a mechanism to guarantee delivery of the event to the subscriber. For WebHooks it’s a 200 OK, similar to when a .NET client delivers an event to the Event Grid Topic.

An Azure Function or Logic App can use the WebHook mechanism to subscribe to events on a custom Event Grid Topic. As a result, the subscription is created in the Azure Event Grid Topic containing a URL to which the Event Grid Topic can deliver the events (POST). In addition, the Event Type can be specified; by default this is all. And finally, filters can be applied, which are optional. To conclude: a custom event with a certain type can be published to an Event Grid Topic, which has one or more subscribers interested in events of that type.

Note: the subscription of a Logic App to events in the Event Grid Topic is done through a Logic App trigger. However, to subscribe to a specific Event Type you will need to edit the subscription in the Event Grid Topic to change it from all to the required type.

A Function can subscribe to events using a WebHook trigger. You create a subscription in the Event Grid Topic by providing the URL of the WebHook trigger function and the Event Type (and optional filters if necessary).
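A minimal sketch of such a Function (C# script style, written as a generic WebHook; the property names assume the custom schema used in this post, and a production endpoint would also have to answer Event Grid’s subscription-validation event, omitted here):

#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Event Grid delivers events to the WebHook URL as a JSON array via POST.
    var events = JArray.Parse(await req.Content.ReadAsStringAsync());

    foreach (var e in events)
    {
        log.Info($"Received {e["EventType"]} for {e["Subject"]}: wind speed {e["Data"]?["WindSpeed"]}");
    }

    // Returning 200 OK acknowledges delivery to the Event Grid Topic.
    return req.CreateResponse(HttpStatusCode.OK);
}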

Sample Scenario

To get a better understanding of routing custom events with an Event Grid Topic, let us look at how custom events are sent and how they are consumed by multiple subscribers. Therefore, I will discuss the following scenario with you, using several Azure services:

  • An Event Grid Topic
  • An Azure Function
  • A Logic App
  • A .NET Client
  • A WebHook (RequestBin)
  • An Azure Service Bus Queue

The .NET client sends a custom event to a custom Event Grid Topic provisioned in Azure. Subsequently, an Azure Function, a Logic App and a WebHook (RequestBin) subscribe to events of the type WindSpeedEvent. First of all, the Function processes the event by enriching it with a calculated Beaufort value and sends the enriched event to a Service Bus queue. Furthermore, the Logic App evaluates the wind speed and sends an email if the wind speed is higher than a specified value. And finally, RequestBin just consumes the event.

The following diagram shows the event flow of the scenario.

Sending a custom event

A custom event can be sent to an Event Grid Topic using a .NET client. In our scenario the custom event is of the type WindSpeedEvent, containing a few fields, including the wind speed in meters per second and a Beaufort value that is not yet known (0).

The Event Topic has three subscribers:

  • Logic App
  • Function
  • WebHook (Request Bin)

Each will receive the event, as each has subscribed to the Topic for events of the type WindSpeedEvent. Hence, in the Azure Function Monitor pane I observed the consumption of the event.

Subsequently, in the Logic App Run History I observed the consumption of the event.

Finally, when refreshing the RequestBin page, I see the event in its raw format. This is similar to the Event Grid quickstart Create and route custom events with Azure Event Grid.

To conclude, each subscriber receives the event of the type WindSpeedEvent.

The Event Grid Topic in our scenario has three subscribers; see the screenshot of the Event Grid Topic Overview below.

Summary

Custom event handling with an Event Grid Topic is easy to comprehend, and it opens doors to many scenarios, ranging from IoT to website traffic monitoring. In this post I focused only on a custom event handled by several subscribers. However, Event Grid has more to offer in handling events from other sources, such as Azure subscriptions, resource groups, and others, and more publishers and handlers will become available in the future. To conclude, Event Grid in my opinion is a great addition to the other serverless capabilities in Azure; I would like to emphasise that it is the eventing capability in Azure, compatible with other serverless components like Logic Apps and Functions.

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field combining it with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.