Integration Down Under


In February I went on vacation to Australia for almost a month, and decided to throw in some work-related activities as well (fortunately I have a very understanding wife 🙂). I had an amazing time there, catching up with old friends and meeting new ones. I love the integration community, and I always have a lot of fun with these people. In this post, I will go into some of the work-related highlights of my vacation.

Sydney was basically our central hub throughout the vacation; we returned there a couple of times during these weeks. My old colleague Rene Brauwers lives here, as does Mick Badran, so we had a lot of time to catch up and be shown around.

We also did a Meet Up organized by Simon and Rene, where I was joined on stage by Steef-Jan, my good friend and fellow MVP from the Netherlands, as well as Jon and Kevin.

When I booked my vacation, I found out that Ignite would be held while I was in Australia, so I decided this would be a good opportunity to attend it as well. At Ignite, it was confirmed for me that integration is alive and kicking, with many integration sessions. Even Scott Guthrie devoted a nice part of his keynote to it, which confirms for me that Microsoft shares my view that integration keeps getting more important these days.

A large part of the Pro Integration team was represented at Ignite as well, with Jim, Kevin, Jon and Jeff, and they did a great session showing Microsoft’s vision on hybrid integration. Dan was also there, and he gave an awesome session on messaging in Azure, showing the capabilities of the Service Bus stack.

Furthermore, my good friends and fellow MVPs Martin and Daniel did a couple of great sessions as well; it’s really cool to see how much love the integration space is getting. And of course the best thing about events like this is socializing after the sessions are done ;-).

For a full overview of all the sessions at Ignite, you can see all the videos here.

After Ignite, we were invited by Daniel to join him for a tour of Brisbane. I have to say, I really love this place; it’s not too crowded, and it has great scenery.

In Melbourne we did a lot of sight-seeing, and met up with Bill, Jim and Jeff, for some more socializing.

We also did a webcast here at Bill’s house, after having had some of the best BBQ steaks.

As you can tell, we had a great time in Australia, and I want to thank everyone who made this such a great experience for us. Of course, we did not only do work-related stuff, and if you ever happen to find yourself in Australia, I can definitely recommend going to the Blue Mountains, one of the most beautiful places we have been so far.

Using the Application map and Alerts in Application Insights to detect errors in your API App


When you create a Web API or API App, it’s essential to monitor it while it is running. Most importantly, you want to detect failures before most of your customers do. You also want to discover and fix performance issues, and you want to know what users are doing with your Web API, for example whether they are using the latest features.
Application Insights is an extensible Application Performance Management (APM) service for web developers that monitors your running Web API. It tells you about failures and performance issues, and helps you analyze how customers use your app.

Perform the following steps to use the Application map and Alerts in Application Insights:
1.    Add the Application Insights SDK to the API App
2.    Use the Application Insights API in the API App for custom events and metrics
3.    Use the Application Map in Application Insights to drill down errors
4.    Set Alerts in Application Insights

 

Step 1: Add the Application Insights SDK to the API App

Right-click your API app project in Solution Explorer, and choose Add, Application Insights Telemetry.

1 Add Application Insights in Visual Studio project
Note
In Visual Studio 2015, there’s also an option to add Application Insights in the New Project dialog.
 

Continue to the Application Insights configuration page:

  – Select the account and subscription that you use to access Azure.
  – Select the resource in Azure where you want to see the data from your app. Usually you create a separate resource for each app.
  – Click Register to go ahead and configure Application Insights for your web app. Telemetry will be sent to the Azure portal, both during debugging and after you have published your app.
2 Register your app with Application Insights
 
 

Step 2: Use the Application Insights API in the API App for custom events and metrics

In Application Insights, a custom event is a data point that you can use to find out what users are doing with the API App, or to help diagnose issues. The Application Insights API for custom events and metrics is the same API that the standard Application Insights data collectors use.

 
Use the following lines of code in your application to send an event whenever a document is not found:

using Microsoft.ApplicationInsights;

private TelemetryClient telemetry = new TelemetryClient();

telemetry.TrackEvent("[OrderManager.GetOrderById] Document not found");
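For context, here is a minimal sketch of how that call might sit inside a Web API 2 controller action; the controller, route and OrderStore helper are hypothetical names, not part of the Application Insights API:

```csharp
using System.Web.Http;
using Microsoft.ApplicationInsights;

public class OrderManagerController : ApiController
{
    private readonly TelemetryClient telemetry = new TelemetryClient();

    // GET api/ordermanager/{id}
    public IHttpActionResult GetOrderById(string id)
    {
        var order = OrderStore.Find(id); // hypothetical data-access helper
        if (order == null)
        {
            // The custom event appears under Custom Events in the portal.
            telemetry.TrackEvent("[OrderManager.GetOrderById] Document not found");
            return NotFound();
        }
        return Ok(order);
    }
}
```

The 404 response and the custom event then show up side by side in the telemetry for the same operation, which is exactly what Step 3 drills into.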

 
03 custom events
 
 

Step 3: Use the Application Map in Application Insights to drill down errors

There are plenty of ways to explore your telemetry data in Application Insights. One option is to use the Application Map in the Azure Portal. An Application Map is a visual layout of the dependency relationships of your application components. Each component shows KPIs such as load, performance, failures, and alerts, to help you discover any component causing a performance issue or failure.

 

To open the Application Map go to the Azure portal and then navigate to the API App that you created.

  – In the App Service blade, click Application settings.
  – In the overview panel click on VIEW MORE IN APPLICATION INSIGHTS
04 Azure Portal - Application Insights
 
Click on App map from the Application Insights blade to open the Application map.
05 Azure Portal - Application Insights
 
Click on the error or warning to further investigate.
06 Azure Portal - Application Insights - Application map
 
When you click on the error a new blade opens with an overview of the Failed Requests.
07 Azure Portal - Application Insights - Application map - Top Errors
 
Click on the error to see the properties of the failed HTTP request.
08 Azure Portal - Application Insights - Application map - 404 Errors
 
Click on the link “All available telemetry for this operation” to see the telemetry and custom events.
09 Azure Portal - Application Insights - Application map - 404 Errors - Detail
 
 

Step 4: Set Alerts in Application Insights

Application Insights can also alert you to changes in performance or usage metrics in your API App. You can use metric alerts to tell you when any metric crosses a threshold value for some period – such as response times, exception counts, CPU usage, or page views.
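For those who prefer scripting over the portal, the classic AzureRM PowerShell module of that era could create a comparable metric alert. This is a sketch under assumptions: the resource names are placeholders, and the exact metric name for failed requests may differ in your subscription:

```powershell
# Hypothetical sketch using the classic AzureRM.Insights module.
Add-AzureRmMetricAlertRule `
    -Name "FailedRequestAlert" `
    -Location "East US" `
    -ResourceGroup "my-resource-group" `
    -TargetResourceId "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/microsoft.insights/components/my-app-insights" `
    -MetricName "requests/failed" `
    -Operator GreaterThan `
    -Threshold 0 `
    -WindowSize 00:05:00 `
    -TimeAggregationOperator Total
```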

 
Click on Alerts to open the Alert rules blade, and then click on the Add alert button.
10 Azure Portal - Application Insights - Alerts
 
Use the Failed requests metric to set an alert if an HTTP request to the API App fails and returns an error to the client.
If you check the box “Email owners…”, alerts will be sent by email to an administrator for example.
11 Azure Portal - Application Insights - Add Alert
 
You then get a Failed Request Alert email when an alert changes state between inactive and active.
12 Azure Portal - Application Insights - Email about Alert
 
The current state of each alert is also shown in the Alert rules blade.
13 Azure Portal - Application Insights - Fired Alert

   

  

Conclusion

Application Insights is a very good way to monitor your API App. It’s easy to add to your code, and it has many great features, such as alerts and the Application Map. It’s also very mature, and with the Basic pricing option you pay based on the volume of telemetry your application sends, with a 1 GB free allowance per month. This free data allowance gives you a great way to try out Application Insights as you get started!

Here it is: my revamped blog. Thanks, BizTalk360!


Thanks, team BizTalk360! Welcome to my new blog. Yes, I have a new blog; or better yet, it has been styled and created for me. Just before I embarked on my trip down under, Saravana offered to revamp my blog. I happily accepted his kind offer to create a new blog for me and to migrate my content from my old blog to this amazing new one.

Hard working team

The BizTalk360 team took over my blog and worked hard to get it up and running. They migrated the content and built the About Me and Resources pages. Great work, guys!

The old blog

My blog, Azure Thoughts, EAI Challenges, is over 10 years old. I started it on blogger.com, and the style changed a few times over the years until it reached its current state. Saravana offered me a completely new type of blog, similar to the BizTalk360 blog and Sandro’s and Nino’s new blogs. They all look amazing to me, and that’s what I wanted too: a professional-looking blog!

Not a UI/UX guy

My expertise, as many of you know, is Microsoft integration, Azure, and data science; I am not a UI/UX guy, nor a designer. A cool-looking blog has been a long-time wish, and now it has been handed to me.

Thank you all!

Thanks, Saravana and the BizTalk360 team, for this tremendous effort and work. The blog at WordPress.com is a new experience for me, and I love the extra capabilities of this platform, which exceed those of the previous one I worked on. I hope that you, my readers, will enjoy this new and more professional layout, which I hope will be more appealing to you and provide a great user experience. I am looking forward to creating some new content on this blog.

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field combining it with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him a Microsoft MVP for the past 6 years. View all posts by Steef-Jan Wiggers

Stef’s Monthly Update – February 2017


Last month was a busy one; in February I spent most of my time on the road or in the air. Anyway, what has Stef been up to in February?

This month I also wrote a few guest posts for the BizTalk360 blog and did a demo for the Middleware Friday show. The blog posts are:

The show can be found in the 5th Middleware Friday episode, about serverless integration.
During my trip through Australia and New Zealand I did a few short interviews, which you can find on YouTube:

  • Mick Badran
  • Wagner Silveira
  • Martin Abbott
  • Daniel Toomey
  • Bill Chesnut
  • Rene Brauwers

Besides the interviews, a few meetups took place: one in Auckland, another in Sydney, and a live webinar with Bill Chesnut in Melbourne. In Auckland I talked about the integration options we have today. An integration professional in the Microsoft domain had, and still has, WCF and BizTalk Server; with Azure, the capabilities grow to include Service Bus, Storage, BizTalk Services (Hybrid Connections), the Enterprise Integration Pack, the On-Premises Data Gateway, Functions, Logic Apps, API Management, and Integration Accounts.

After my talk in Auckland I headed out to the Gold Coast to meet up with the Pro Integration team (Jim, Jon, Jeff and Kevin) and Dan Rosanova. They were all at the Gold Coast for Ignite Australia, and here’s a list of their talks:

During my stay, we went for a couple of drinks and had a few good discussions. One night Dean Robertson came over and we all had dinner. After the Gold Coast, Dan Toomey took me, Eldert and his wife to Brisbane for a day of sightseeing.

The week after Auckland, the Gold Coast and Brisbane, I returned to Sydney for the meetup organized by Simon and Rene. My topic was “Serverless Integration”, which dealt with the fact that we integration professionals will start building more and more integration solutions in Azure using Logic Apps, API Management and Service Bus. All these services are provisioned, managed and monitored in Azure. In the talk I used a demo, which I also described in Serverless Integration with Logic Apps, Functions and Cognitive Services.

In Sydney I was joined on stage by Jon, Kevin and Eldert. We had about 45 people in the room, and we went for drinks after the event.

The next day, Eldert and I went to Melbourne to meet up with Bill, Jim and Jeff, who were there to do a meetup; the PG had split up to do meetups in both Sydney and Melbourne. In Melbourne we did two things: we visited Nethra, who survived the Melbourne car rampage of 25 January, and did a live webinar at Bill’s house in Beaconsfield.

Overall, the trip to Australia and New Zealand was worthwhile. The meetups, the PG interaction in Australia, the community and the hospitality were amazing. Thanks Rene, Miranda, Mick, Nicki, Simon, Craig, Abhishek, James, Morten, Jim, Jon, Jeff, Kevin, Martin, Dan Rosanova, Bill, Mark, Margaret, Johann, Wagner and many others I met during this trip. It was amazing!!!

Although February was a short month, I was able to find a little time to read. I read a few books on the flights to Australia and New Zealand and back:

  • Together is Better, a little book of inspiration by Simon Sinek. I read this book after sharing an interview with him (Millennials in the Workplace) on Facebook. It tells a short story about three young people escaping from a playground that has a playground king to find a better place. The story is about leadership, with the message that leaders are students: they need to learn, take care of their people, and inspire.
  • Niet de kiezer is gek by Tom van der Meer. On March 15th, we will have a general election for a new government. We as voters are more aware of what each party has to offer than the parties think we are. The access to information that digitalization brings has made voters more informed about the situation in our country and how politicians operate, and more vocal.

My favorite albums that were released in February were:
  • Soen – Lykaia
  • Immolation – Atonement
  • Persefone – Aathma
  • Ex Deo – The Immortal Wars
  • Nailed To Obscurity – King Delusion

In February I did a couple of runs, including a half marathon just before my trip started. During my busy travel schedule I ran with the same frequency, but cut the number of miles to avoid wearing myself out.

There you have it: Stef’s second monthly update, and I can look back with a smile. I accomplished a lot, and exciting moments are ahead of me in March.

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers


Why we want to collect usage data!

BizTalk Server 2016 was our tenth release of the product, bundled with a ton of new functionality.

One of the things we added was enhanced support for Microsoft to collect usage data from the BizTalk environment. You can enable telemetry collection during the installation of BizTalk Server 2016, although the default option is to opt out of sending this data to Microsoft.

Microsoft uses the data to help us improve our products and services, including steering BizTalk in the right direction based on actual usage. Read our privacy statement to learn more.

The data gathered includes overall information and counts of the different artifacts. Data is sent to Microsoft once a day or when you restart one of your host instances.

Two registry keys store this information regarding telemetry data gathering and submission to Microsoft:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\BizTalk Server\3.0\CEIP\Enabled
and
HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\BizTalk Server\3.0\CEIP\Enabled

If these values are set to “1”, telemetry data collection is turned on.

You can also go here and download the registry update as a .reg file to enable telemetry in your BizTalk Server 2016 environment.
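As a sketch, such a .reg file could look like the following, assuming the final path segment, Enabled, is a DWORD value under the CEIP key:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\BizTalk Server\3.0\CEIP]
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\BizTalk Server\3.0\CEIP]
"Enabled"=dword:00000001
```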

We gather only counts of the following BizTalk artifacts:

  • Send Ports
  • Receive Locations
  • Adapters
  • Hosts
  • Partners
  • Agreements
  • Schemas
  • Machines

We appreciate you taking the time to enable telemetry data to help us drive the product going forward.

Oporto City is ready to receive Oporto Global Integration Bootcamp – March 25, 2017 – Oporto, Portugal


I am really excited to announce that all the arrangements for the first Oporto Global Integration Bootcamp are almost finalized, and I can now release 90% of the event agenda. This event will be held at the DevScope offices in Oporto on March 25, 2017, between 09:00 and 17:00.

Oporto Global Integration Bootcamp: BizTalk Server 2016, Logic Apps, Service Bus, Enterprise Integration Pack, API Management, On-Premise Gateway, Hybrid Integration, Microsoft Flow

What is Global Integration Bootcamp?

Global Integration Bootcamp: BizTalk Server 2016, Logic Apps, Service Bus, Enterprise Integration Pack, API Management, On-Premise Gateway, Hybrid Integration, Microsoft Flow

This is a free event, driven by user groups and communities around the world, backed by Microsoft, for anyone who wants to learn more about Microsoft’s integration story. In this full-day boot camp we will deep-dive into Microsoft’s integration stack with hands-on sessions and labs, delivered to you by experts and community leaders. In this boot camp, we will focus on:

  • BizTalk Server 2016: BizTalk Server 2016, what’s new, and using the new Logic Apps adapter.
  • Logic Apps: Creating Logic Apps using commonly-used connectors.
  • Service Bus: Build reliable and scalable cloud messaging and hybrid integration solutions
  • Enterprise Integration Pack: Using the Enterprise Integration Pack (EIP) with Logic Apps
  • API Management: How does API management help you organize your APIs and how does it increase security?
  • On-Premise Gateway: Connecting to on-prem resources using the On-Premise Gateway
  • Hybrid Integration: Hybrid integrations using BizTalk Server and Logic Apps
  • Microsoft Flow: Learn to compose flows with Microsoft Flow

And there is much more. Porto will be joining locations all over the globe holding this event on the same day. Check out the global website for information about the global organizers and other locations, or follow the Twitter hashtag #integrationbootcamp.

Oporto Global Integration Bootcamp Agenda

09:00 Registration opens and welcome

10:00 BIZTALK 2016 IN A HYBRID WORLD
The integration landscape has definitely evolved to be a hybrid one: significant on-premises investment has accumulated over the years, while at the same time cloud computing has brought new challenges and new ways of implementing integration. Let’s navigate through the innovations in both worlds and see how BizTalk and the cloud currently live together. – Ricardo Torre

11:00 BIZTALK OCTOPUS DEPLOY
How to deploy BizTalk solutions with Octopus – José António Silva & Pedro Sousa, DevScope

11:30 Coffee-break

11:45 UNLEASH THE POWER OF IOT WITH SHAREPOINT 
SharePoint is becoming modern, there are modern sites ready for mobile, a modern framework to develop web parts, but what about embracing modern concepts?

The Internet of Things will be everywhere in the blink of an eye, and you are probably already dealing with it every day without even knowing it.

In this session, we will explain how to collect data from sensors, send it to SharePoint, and display it in a modern dashboard using modern SharePoint sites.

The possibilities are endless: from temperature sensors to access control devices, you can have all this data inside your SharePoint intranet with a modern look and feel. – João Ferreira, BindTuning

12:45 THE SPEAKER NIGHTMARE: Eval Forms & OCR & Logic Apps & Power BI
In this session, I will show and explain a live demo of how we can easily build a robust solution for processing evaluation forms, using OCR software, and integrate the results with Power BI to present them in an interactive and beautiful way. Most importantly: how you can help your enterprise developers and IT pros extend capabilities for power users, who understand their business challenges best, allowing them to use familiar tools, like OCR software for processing evaluation forms, and to quickly build and deliver Power BI solutions with interactive data dashboards; and, at the same time, how to integrate these tools, platforms and systems in a very quick and robust way using the integration features on Azure, such as Logic Apps, API Apps and Azure Functions. In short: how to start from a simple solution and evolve it by enabling new functionality. – Sandro Pereira, Microsoft Integration MVP

13:15 Lunch and Networking

Register

The agenda is not completely final: some sessions are still to be announced, and the order of the sessions may change.

Thanks to sponsorship from DevScope, this event will be free of charge, including the catering. However, venue capacity is limited, so don’t delay and reserve your ticket at https://www.eventbrite.com/e/oporto-global-integration-bootcamp-tickets-31508629158. I’m looking forward to welcoming you to the Oporto Global Integration Bootcamp on March 25, 2017!

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community. View all posts by Sandro Pereira

BizTalk Server advanced WCF custom binding techniques and challenges


Working with secure channels in SOAP and WCF can sometimes be a very complex activity.
BizTalk Server provides many adapters able to cover most requirements, and for very complex challenges we can use the WCF-Custom adapter to implement more specific binding settings with very high granularity.

The biggest issues are normally related to the binding (security), customization and troubleshooting.

Sometimes we face very complex security challenges, and the strategy we use to solve them as quickly as possible is critical.

For complex bindings, the best strategy is to start with a plain .NET approach and switch to BizTalk afterwards.

We can use a classic example: mutual certificate authentication over SOAP 1.2 with TLS encryption against a Java service.

I see two main advantages in using the .NET configuration file approach:

  1. Intellisense

Visual Studio provides very useful IntelliSense, which makes it easy to extend and change the binding and test it very quickly.

  2.    Documentation and support

In a security challenge, the ability to lean on resources on the web is crucial; most of the documentation relates to the WCF .NET approach, and you will find a lot of samples using the Web.config or App.config file approach.
For that reason, a .NET approach is faster and easier to use and test.

A binding section for mutual certificate via TLS looks as below.

<bindings>
  <customBinding>
    <binding name="MyBinding">
      <security requireSignatureConfirmation="false"
                authenticationMode="MutualCertificate"
                enableUnsecuredResponse="true"
                allowSerializedSigningTokenOnReply="false"
                defaultAlgorithmSuite="Basic256Sha256"
                messageSecurityVersion="WSSecurity10WSTrustFebruary2005WSSecureConversationFebruary2005WSSecurityPolicy11BasicSecurityProfile10"
                securityHeaderLayout="Lax">
        <secureConversationBootstrap requireSignatureConfirmation="false" />
      </security>
      <textMessageEncoding messageVersion="Soap12" writeEncoding="utf-8" />
      <httpsTransport requireClientCertificate="true"
                      authenticationScheme="Negotiate"
                      useDefaultWebProxy="true"
                      manualAddressing="false" />
    </binding>
  </customBinding>
</bindings>

And below the behaviour section.

<behavior name="MyBehaviour">
  <clientCredentials>
    <clientCertificate findValue="mydomain.westeurope.cloudapp.azure.com"
                       storeLocation="LocalMachine"
                       storeName="My"
                       x509FindType="FindBySubjectName" />
    <serviceCertificate>
      <defaultCertificate findValue="mydomain-iso-400"
                          storeLocation="LocalMachine"
                          storeName="TrustedPeople"
                          x509FindType="FindBySubjectName" />
    </serviceCertificate>
  </clientCredentials>
</behavior>

When our tests pass and everything is running, we can easily switch to BizTalk Server and create the custom bindings.

The WCF-Custom adapter provides, in general, the same sections. What we need to do is create a WCF-Custom adapter and a static solicit-response send port; after that we can easily insert our bindings and behaviors.

For specific settings we can import the bindings as well. A great feature offered by BizTalk is the possibility to import and export bindings; this way we can quickly experiment with any complex binding and import it into our WCF-Custom adapter afterwards.

Sometimes external services require very complex customization, and we need to override protocol or messaging behaviour in the channel. For instance, some services don’t accept mustUnderstand in the SOAP header, or we need to impersonate a specific user by certificate in the header, or simply manage a custom SOAP header.
In my experience, the best strategy is to develop the custom behaviour in a plain WCF .NET project; this is the fastest way to test the WCF behavior without having to manage GAC deployments, host instance restarts and so on.
When the WCF behavior works, we can easily configure it in the BizTalk port.
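As a sketch of what such a project might contain (all type names here are mine, and actually clearing mustUnderstand in practice usually means rebuilding the affected headers):

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Configuration;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

// Hypothetical client message inspector: a hook to inspect or rewrite
// outgoing SOAP headers, e.g. for services that reject mustUnderstand.
public class HeaderFixupInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Inspect or rewrite request.Headers here before the message goes out.
        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState) { }
}

// Endpoint behavior that plugs the inspector into the client runtime.
public class HeaderFixupBehavior : IEndpointBehavior
{
    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
        clientRuntime.ClientMessageInspectors.Add(new HeaderFixupInspector());
    }

    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection parameters) { }
    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher dispatcher) { }
    public void Validate(ServiceEndpoint endpoint) { }
}

// The BehaviorExtensionElement makes the behavior visible in .config files
// and, once GAC-deployed, in the BizTalk WCF-Custom behavior tab.
public class HeaderFixupBehaviorExtension : BehaviorExtensionElement
{
    public override Type BehaviorType { get { return typeof(HeaderFixupBehavior); } }
    protected override object CreateBehavior() { return new HeaderFixupBehavior(); }
}
```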

Using a .NET approach, we add the WCF behavior by reference, configure it in the .config file, and test/debug it.
When everything is working, we can add the behavior in BizTalk by installing the component in the GAC and adding the behavior to the BizTalk port.
The WCF-Custom BizTalk Server adapter offers a very good level of customization through the bindings and behavior tabs.

The most complex areas here are security and message inspection. For troubleshooting I recommend two things: Fiddler or Wireshark on one side, and WCF logging on the other. Use them together, as they complement each other.

Fiddler is a very powerful free tool and easy to use: just run it.
With BizTalk Server we need to configure the framework to use Fiddler, and BizTalk offers several easy ways to do that:

By the port, if we want to affect that port only.

By the adapter host handler, if we want to affect all the artefacts under it.

For deeper sniffing, or when we need to sniff Net.TCP or other protocols, I recommend Wireshark; it is a bit more complex to use, but it is the tool for the job.
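As a sketch, routing the .NET stack through Fiddler typically means a defaultProxy section like the one below in the relevant .config file (assuming Fiddler is listening on its default port, 8888):

```xml
<system.net>
  <defaultProxy>
    <proxy proxyaddress="http://127.0.0.1:8888" bypassonlocal="false" />
  </defaultProxy>
</system.net>
```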

To configure WCF logging, we simply add the section below to the BizTalk configuration file (to affect BizTalk services only), to the machine.config file (to affect all services on the entire machine), or to the Web.config (to affect a specific service).

<!-- DIAGNOSTICS -->
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel.MessageLogging">
      <listeners>
        <add type="System.Diagnostics.DefaultTraceListener" name="Default">
          <filter type="" />
        </add>
        <add initializeData="c:\logs\messagesClient.svclog"
             type="System.Diagnostics.XmlWriterTraceListener"
             name="messages"
             traceOutputOptions="None">
          <filter type="" />
        </add>
      </listeners>
    </source>
    <source propagateActivity="true" name="System.ServiceModel" switchValue="Error,ActivityTracing">
      <listeners>
        <add type="System.Diagnostics.DefaultTraceListener" name="Default">
          <filter type="" />
        </add>
        <add name="ServiceModelTraceListener">
          <filter type="" />
        </add>
      </listeners>
    </source>
  </sources>
  <sharedListeners>
    <add initializeData="c:\logs\app_tracelogClient.svclog"
         type="System.Diagnostics.XmlWriterTraceListener, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
         name="ServiceModelTraceListener"
         traceOutputOptions="Timestamp">
      <filter type="" />
    </add>
  </sharedListeners>
</system.diagnostics>

Author: Nino Crudele

Nino has deep knowledge and experience delivering world-class integration solutions using the Microsoft Azure stack and Microsoft BizTalk Server, integrating many different technologies such as AS2, EDI, RosettaNet, HL7, RFID and SWIFT. View all posts by Nino Crudele

Microsoft Integration Weekly Update: March 13


Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to integration: enterprise integration, robust and scalable messaging capabilities, and citizen integration capabilities, empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

  • C#/BizTalk Developer Novo Technologies Modesto, CA, US
  • Sr Biztalk Developer Stafflabs Inc Princeton, NJ, US
  • BizTalk Developer First Tech Federal Credit Union Rocklin, CA, US
  • Biztalk Developer Jobspring Partners Los Angeles, CA, US
  • EDI Integration Developer Seaboard Foods Shawnee, KS, US

Feedback

Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


Getting started with Live Unit Testing in Visual Studio 2017


Visual Studio 2017 has a new feature called Live Unit Testing. Live Unit Testing is currently in the Enterprise edition of Visual Studio 2017, and it’s available for C# and VB projects that target the .NET Framework. Live Unit Testing automatically runs the impacted unit tests in the background as we edit code, and visualizes the results and code coverage live in the editor, in real time.

In this blog, we will discuss how Live Unit Testing in Visual Studio 2017 works.

How to start Live Unit Testing

Enabling Live Unit Testing is as simple as going to the Test command in the top-level menu bar and starting it, as shown in the image below.

NewLiveUnitTestingWindow

Live Unit Testing works with three popular unit testing frameworks: MSTest, xUnit and NUnit. When using these, we will need to ensure that the adapters and frameworks meet or exceed the minimum versions given below:

  • For xUnit we will need xunit.runner.visualstudio version 2.2.0-beta3-build1187 and xunit 2.0 (or higher versions)
  • For NUnit we will need NUnit3TestAdapter version 3.5.1 and NUnit version 3.5.0 (or higher versions)
  • For MSTest we will need MSTest.TestAdapter 1.1.4-preview and MSTest.TestFramework 1.0.5-preview (or higher versions)

Live Unit Testing experience

Once we have enabled Live Unit Testing, it helps us quickly see whether the code we are writing is covered, and whether the tests that cover it are passing, without leaving the editor. Unit test results and coverage visualizations appear on a line-by-line basis in the code editor, as shown in the sample image below:

StartingLiveUnitTesting

Note: The dash indicates that the code does not have any test coverage. The red × signifies that the line was executed by at least one failing unit test. The green check mark signifies that the code was executed and all tests that ran passed.

The live feedback also serves to notify us instantly if the change has broken the program – if inline visualizations shift from green “✓”s to red “×”s, we know we broke one or more tests.

At any point in time we can hover over the “✓” or “×” to see how many tests are hitting the given line as seen in image below.

CoveredUnitTest

We can click on the check or “×” to see what tests are hitting the given line as shown in image below.

UnitTestHittingPage

When hovering over the failed test in the tool tip, it expands to provide additional info to give more insight into the failure as shown in image below.

TestInsight

At any time, we can temporarily pause or completely stop Live Unit Testing; for example, when we are in the middle of a refactoring and know that our tests will be broken for a while. It is as simple as going to the Test command in the top-level menu bar and clicking the desired action, as shown below.

LiveUnitTestStop

Conclusion

Live Unit Testing will improve our developer productivity, test coverage and the quality of our software.

The post Getting started with Live Unit Testing in Visual Studio 2017 appeared first on BizTalk360.

Mulesoft IPO and what it means for Microsoft System Integrators

Mulesoft IPO and what it means for Microsoft System Integrators

A friend recently shared a couple of links with me about the Mulesoft IPO, which is happening soon, and it got me thinking about how this might affect us in the Microsoft integration world. First off, those links:

It is really interesting to see a major move like this by one of the big iPaaS players. Mulesoft is a company I have followed for some time, in particular during the years up until 2015 when Microsoft were making such a marketing disaster around their integration offering. At that point Microsoft had lost their way in the integration space, and there was a serious consideration about switching my focus to Mulesoft for my customers. The main driver for this wasn’t technical. It was all driven by the fact that Mulesoft made such a great marketing story, and such a lot of noise in the industry, that it was difficult not to want to follow it. At the time I held fire because I wanted to make a major bet on the Microsoft Azure cloud, and I felt that the changes Microsoft were looking to make around their integration technologies would eventually pay off in the long run, and that the marketing as part of the Azure brand would fix the problems Microsoft were experiencing pre-2015.

If we take a look at some of the interesting points from the Mulesoft articles, however, we notice that:

  • Mulesoft is growing at 70%
  • Subscription revenue is at $150m per year
  • Professional Services is $35m per year
  • Their growth looks to have dropped off slightly in 2016 compared to 2015
  • They now have 1071 customers (I assume this is subscription paying customers)
  • New customer value nearly doubled from $77k to $169k (assume per year)
  • Mulesoft is expected to go public at something north of $1.5 billion

The interesting bit between those articles, which is a bit of a conflict, is that the first one suggests Mulesoft is “rapidly approaching cashflow from operations breakeven and net income profitability”, whereas the article from CNBC suggests Mulesoft “lost $50 million of $188 million in revenue in 2016 and $65 million in 2015”. I’m not sure which is right, but let’s assume it’s somewhere in the middle.

On paper Mulesoft would seem to be an interesting investment for customers and investors, however I feel there are a few threats to Mulesoft, arising from the current industry positioning and the various things Microsoft are doing, which make for an interesting discussion.

Threat 1 – Pricing Model

I think one thing has been missed in the analysis, however, and that is how the iPaaS landscape is starting to change. A similar thing happened with API Management (APIM) a couple of years ago. If you think back to 2013-2015, people were going crazy about API Management, and it was viewed as a premium service which companies were paying $100K+ per year for: a proxy in front of their API which offered added-value features. The problem was that those APIM vendors were typically niche vendors who only did APIM. Eventually along came Microsoft and Amazon, who offered APIM as a commodity service. They didn’t need to charge a high premium, because if you use their APIM then you are likely to be using many of their other services too. This cross-sell ability of the big cloud players meant you could now get APIM for five times cheaper in some cases. You might sacrifice some nice-to-have features, but the core capability was there. If you compare this to the iPaaS world today and look at some of the main players on Gartner or Forrester, you will see a similar pattern in their pricing models. The pricing is often quite vague, with things like “per connection” or “for N connections” with no real definition of what a connection is. Some examples are below:

In all of these cases the pricing model boils down to having to contact the vendor and get their sales team involved before you can start. Not very “cloudy” in my opinion.

If we now consider how iPaaS is changing from a premium offering to a commodity, in particular driven by Microsoft with their Logic Apps offering the pricing model has these key differences:

  1. The price is publicly displayed
  2. The price is charged on a per-action basis, which is genuinely pay as you go, rather than per “compute unit”, where the customer is usually paying for more capacity than they need, just like in the old on-premises server capacity models

If we consider the typical new Mulesoft customer who is spending approximately $169,000 per year for Mulesoft compute units, then for the equivalent spend on Logic Apps you would get in the region of 4,000,000,000 actions per year. I would consider the costing model of Logic Apps to be a proper per-usage cost model, versus the per-compute-unit cost model used by most of the other vendors, and perhaps this is going to be the evolution of Generation 2 iPaaS as other vendors follow this trend, moving to a more Serverless model.
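As a rough sanity check on that comparison, here is a tiny Python sketch. The spend and action figures are the ones used in this post; the implied per-action rate is derived from them, not taken from any vendor’s published price list:

```python
# Back-of-the-envelope check of the per-action comparison (figures from the post,
# not from official pricing pages; real Azure pricing varies by region and tier).
mulesoft_annual_spend = 169_000        # typical new-customer spend per year ($)
logic_apps_actions = 4_000_000_000     # equivalent Logic Apps actions per year

# Implied cost of a single action at that annual spend.
implied_price_per_action = mulesoft_annual_spend / logic_apps_actions
print(f"implied price per action: ${implied_price_per_action:.7f}")
```

At a few thousandths of a cent per action, only workloads that genuinely run billions of actions a year approach that spend, which is exactly the point of a per-usage model: a small customer running one interface pays almost nothing.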

While Mulesoft has been in a great position for the last couple of years and made great progress, I wonder if the public offering is timed against this new threat of Generation 2 iPaaS, which is genuinely Serverless, whereas Generation 1 may look like Platform as a Service but is clearly tied to underlying server infrastructure which is abstracted from the end customer. I would guess, from my playing around with the product, that it would take a reasonably big re-architecture of their product to be able to support a similar cost model to Logic Apps, in particular when Mulesoft seems to run on AWS rather than its own cloud fabric.

Threat 2 – Cross Sell

Mulesoft has a limited set of products focused around the integration space. They cover:

  • iPaaS
  • API Management
  • Connectors
  • Message Queue

These are the core bits of most integration platforms, and they state that when you need something they don’t do, you should use “best of breed”. This is a valid approach, and one used for a long time by vendors, but when competing against the big cloud vendors who have other stuff on their cloud, the question is, do I want:

  • Option A – Go to another vendor, start a whole procurement process, evaluate options and N months later I can start using another product
  • Option B – Click 3 buttons and have the capability on the cloud I’m already using

I would argue that in today’s world of agility and speed, option B is much more popular than an IT procurement exercise.

If we think about the Azure offering, the secret sauce to the Microsoft Integration platform is that you have the rest of the Azure cloud to use as illustrated below.

The key difference with this cross-sell capability is that companies like Amazon & Microsoft can make a platform play, where they provide the platform for holistic solutions for the entire enterprise. This ranges from classic infrastructure, through PaaS, to innovative stuff like machine learning, bot frameworks, Big Data and blockchain. Mulesoft is not in this platform-level cloud game and can only offer a specialised niche around integration.

Threat 3 – Democratisation of Integration

One of the big themes in integration today is the democratization of integration. Two of the key elements within this are:

  1. Allowing the citizen integrator to be involved in the integration solutions the organisation uses
  2. Opening insights into the integration solutions your organisation has

In the first case, at present Mulesoft has no offering, and no visibility of any offering that I’ve seen, aimed at the Citizen Integrator. This is slightly strange, as their marketing and blogging teams are usually all over big industry themes in integration, but they seem to be giving this one a wide berth. The only stuff I’ve seen is forums which suggest teaching the citizen integrator to be a developer. Compare this to the Microsoft offering, where you have a very solid offering around Power Apps and Flow, which are part of the integration suite but specifically aimed at empowering the Citizen Integrator.

In the second case, analytics, insights and interesting stuff from your integration solutions are one of the best things about modern integration, and can allow the business to get real added value from integration. Mulesoft has the analytics you would expect for their APIM offering, and it also has a business events capability within the management console. While this ticks the basic boxes of reporting and insights, it is lacking the things the other major cloud vendors can offer. For example, with Microsoft you have the ability to use Operations Management Suite, Power BI, Cortana Analytics Suite and Application Insights alongside your integration solution, to give you deep insights which can be targeted at different audiences such as an IT Pro, a Business User or even a customer. The power to build a much richer solution is there.

Threat 4 – Target Customers

The fourth big threat is also associated with the pricing model. Mulesoft is only really relevant for big enterprise customers. They have 1000 of those, but they are in a place where lots of established names such as Oracle, Tibco, Microsoft and various others already have major products with much higher customer numbers – e.g. Microsoft BizTalk Server has 10,000+ customers. While Mulesoft may have made some inroads in winning customers by replacing their traditional integration platform, or more likely complementing it with an iPaaS capability, this is a competitive area. I said a few years ago at the Integrate conference that I felt the next big place for System Integrators would be with iPaaS products which could offer solutions for SME companies. This is exactly where the Microsoft Logic Apps offering hits the nail on the head. With Azure, an SME can set up a cloud-scale, enterprise-ready integration product and spend next to $0. They could build one simple interface to begin with and pay as they go. Over time it’s feasible they could grow significantly and just pay more as they use more. This opens up a world where Microsoft could conceivably have hundreds of thousands of SME customers using their iPaaS offering, in a way none of the above vendors could compete with.

It’s difficult to see how Mulesoft could compete in this space with their current cost model, and it’s difficult to see how the cost model could change with the current product architecture.

Threat 5 – Questionable Innovation

If you look at the Mulesoft product offering over the last few years and consider how it has evolved, changed and how they have innovated then you could argue the answer is “not that much”. In the last couple of years the main new features are:

  • A new mapper
  • Anypoint MQ – JMS
  • Monitoring

The reality is those 3 key areas are basic product capabilities required of ESB/iPaaS offerings, so I’d hardly call that innovation.

Instead, in the last couple of years Mulesoft have focused on getting as much return as possible for the product they had, through fantastic marketing and PR creating awareness in the industry. While they have had lots of success, you could argue that their ability to execute and completeness of vision has been overtaken by other vendors, while the integration world has kept evolving.

In a post IPO world, is it likely that investment in R&D will see many new innovations when there will be drivers to reduce losses?

Threat 6 – Security Story

Following on from the innovation question, I also wonder about the positioning of Mulesoft’s product stack in terms of collaboration with the security products that are out there. If you consider the Microsoft world for a moment, we see security in fundamental places like Azure Active Directory, Azure Active Directory B2C, role-based access control, the API Management security stories, and multi-factor authentication, all giving Azure customers a fantastic hybrid security model covering the enterprise and customers. You then add to the mix Azure offerings such as Security Centre, Azure Advisor and Operations Management Suite, which all look at your solutions and tell you how they are doing against good practices, whether there are any vulnerabilities, and other good things like that.

The Integration Platform from Microsoft inherits all of this good stuff.

In the Mulesoft space, outside of the security used by its connectors to talk to an application there is a very limited security or governance story. I believe in the coming years this is one of the key areas customers will focus on much more when their cloud maturity increases.

Threat 7 – Post IPO Changes

I would suggest this is the biggest threat to Mulesoft, after an IPO many companies change in various ways. Some examples might include:

  • Some of your good staff who were there for the IPO opportunity may move on to the next opportunity
  • You now have to change from an attractive-looking proposition to a business that makes a profit
  • It’s not so easy to go back to the industry for additional rounds of funding, like Mulesoft have done a few times in recent years

I feel the biggest challenge is that the company now has to start being profitable. The customers you already have are paying a lot per customer for the services (based on what the articles above suggest), so it’s probably difficult to sell more compute to your existing customers. This leaves 2 avenues:

  1. Sell to more customers
  2. Reduce costs

Selling to more customers is going to be difficult: 3 years ago few people had heard of Mulesoft, but today you see their ads everywhere, and it’s hard to come across an organisation who hasn’t already heard of Mulesoft. Based on the numbers being mentioned in those articles, I’m not sure if even doubling their customer numbers from 1000 to 2000 would get them into regular profit. That’s before considering that customers are becoming wiser to the fact that iPaaS is not all about marketing blagware and buzzwords, and realising that integration today doesn’t always need to be expensive.

This leaves reducing costs as a likely course of action and that means less noise and activity from Mulesoft.

Prediction

While I think it’s a great time to go public, I do wonder if the future for Mulesoft could be similar to what happened to Apigee when they went public (http://uk.businessinsider.com/why-google-spent-625-million-on-apigee-2016-11?r=US&IR=T). The problem is they are a niche company and cannot easily cross-sell other services, as they don’t have the platform that the big cloud players have.

My prediction is that as vendors need to move towards being a Generation 2 iPaaS vendor, this is where Mulesoft will struggle. They have had major investment so far, but will they need to rearchitect their core product to compete in the future? If you look at their products, they have spent a lot of time over the last couple of years trying to bolt bits on to meet their sales commitments, with lots of investment around sales, marketing and promotion, but there has been limited real product innovation in this time.

If Mulesoft have a similar journey to Apigee then one thing is for sure: Amazon and Google have not got much of an iPaaS offering, so you could see an obvious acquisition target which would boost their cloud offerings. The only question is whether, with major investors like Salesforce, Cisco and ServiceNow, that would be feasible.

From a Microsoft integration person’s perspective, all of this is fantastic news. Times have been tough for us for a few years, competing with the marketing power of Mulesoft when Microsoft had, up until recently, been investing so little in marketing their integration stack and equally little effort in selling it. Since that time, though, they now have a far superior integration suite and a genuine cloud platform offering which suits most customers. If Mulesoft turn out to have a bunch of internal challenges as a result of the transition from private to public company, then this will make life a lot simpler, and perhaps the LinkedIn ads will eventually stop spamming me with Mulesoft 10 times per day 🙂

One thing is for sure: it will certainly be interesting to watch the journey of Mulesoft in the public world as they take it to the next level, and I wish them all the best.

The post Mulesoft IPO and what it means for Microsoft System Integrators appeared first on Microsoft Integration & Cloud Architect.