2nd Annual Global Integration Bootcamp a Success!

Last Saturday I had the great privilege of organising and hosting the 2nd annual Global Integration Bootcamp in Brisbane. This was a free event hosted by 15 communities around the globe, including four in Australia and one in New Zealand!

It’s a lot of work to put on these events, but it’s worth it when you see a whole bunch of dedicated professionals give up part of their weekend because they are enthusiastic to learn about Microsoft’s awesome integration capabilities.

The day’s agenda concentrated on Integration Platform as a Service (iPaaS) offerings in Microsoft Azure. It was a packed schedule with both presentations and hands-on labs:

It wasn’t all work… we had some delicious morning tea, lunch and afternoon tea catered by Artisan’s Café & Catering, and there was a bit of swag to give away as well thanks to Microsoft and also Mexia (who generously sponsored the event).

Overall, feedback was good and most attendees were appreciative of what they learned. The slide decks for most of the presentations are available online and linked above, and the labs are available here if you would like to have a go.

I’d like to thank my colleagues Susie, Lee and Adam for stepping up into the speaker slots and giving me a couple of much-needed breaks! I’d also like to thank Joern Staby for helping out with the lab proctoring and for writing an excellent post-event article.

Finally, I would be remiss not to mention the global sponsors who were responsible for getting this worldwide event off the ground and providing the lab materials:

  • Martin Abbott
  • Glenn Colpaert
  • Steef-Jan Wiggers
  • Tomasso Groenendijk
  • Eldert Grootenboer
  • Sven Van den brande
  • Gijs in ‘t Veld
  • Rob Fox

Really looking forward to next year’s event!

Integration Down Under… is UP & RUNNING!

For years now Integration Monday has been faithfully giving us webinars almost every week. There have been some outstanding sessions from international leaders in the integration space including MVPs, members of the Microsoft product team, and other community members. For the Asia Pacific community, however, it has always been a challenge to participate in the live sessions due to the unfriendly time zone.  (I certainly know what a struggle it was to present my own session last October at 4:30am!)  Even from the listener’s perspective, it is usually nicer to be able to join a live webinar and ask questions rather than to consume the recordings afterwards.

Thanks to the initiative of veteran MVP Bill Chesnut (aka “BizTalk Bill”) and the sponsorship of his employer SixPivot, we now have a brand new webinar series starting up in a friendlier time slot for our APAC community! Integration Down Under is launching its inaugural webinar session on Thursday, 8th February at 7:00pm AEST. You can register for this free event here.

This initial session will introduce the leaders and allow each of us to present a very short talk on a chosen topic:

There are already more than twenty registrations even though the link has been live for only a few days. I hope that this is a good sign of the interest within the community!

Feeling really fortunate to be part of this initiative, and looking forward to delivering my intro to Event Grid talk! It will be a slightly scaled down version of what I presented at the Sydney Tech Summit back in November. Hope to see you there!

Serverless Logging & Alerting with Service Fabric & Azure Event Grid

(This post was originally published on Mexia’s blog on 1st September 2017)

Microsoft recently released the public preview of Azure Event Grid – a hyper-scalable serverless platform for routing events with intelligent filtering. No more polling for events – Event Grid is a reactive programming platform for pushing events out to interested subscribers. This is an extremely significant innovation, for as veteran MVP Steef-Jan Wiggers points out in his blog post, it completes the existing serverless messaging capability in Azure:

  • Azure Functions – Serverless compute
  • Logic Apps – Serverless connectivity and workflows
  • Service Bus – Serverless messaging
  • Event Grid – Serverless Events

And as Tord Glad Nordahl says in his post From chaos to control in Azure, “With dynamic scale and consistent performance Azure Event grid lets you focus on your app logic rather than the infrastructure around it.”

The preview version not only comes with several supported publishers and subscribers out of the box, but also supports custom publishers and (via WebHooks) custom subscribers:

EventGridPubsSubs

In this blog post, I’ll describe the experience of building a sample logging mechanism for a service hosted in Azure Service Fabric. The solution not only logs all events to table storage, but also sends alert emails for any error events:

image

Creating the Event Grid Topic

This was an extremely simple process executed in the Azure Portal. Create a new item by searching for “Event Grid Topic”, and then supply the requested basic information:

image

Once the topic is created, the key items you will need are the Topic Endpoint and the associated key:

image

Creating the Event Publisher

As mentioned previously, there are a number of existing Azure services that can publish events to Event Grid including Event Hubs, resource groups, subscriptions, etc. – and there will be more coming as the service moves toward general availability.  However, in this case we create a custom publisher which is a service hosted in Azure Service Fabric. For this sample, I used an existing Voting App demo which I’ve written about in a previous blog post, modifying it slightly by adding code to publish logging events to Event Grid.

The first requirement was storing the topic endpoint and key in the parameter files, and of course creating the associated configuration items in the ServiceManifest.xml and ApplicationManifest.xml files (this article provides information about application configuration in Service Fabric):

image

Note that in a production situation the TopicKey should be encrypted within this file – but for the purposes of this example we will keep it simple.
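
For anyone recreating this without the original screenshots, reading those settings from within the service code looks roughly like the sketch below. Treat it as a hedged example only: the section name "EventGridConfig" and the parameter names are assumptions rather than the exact names used in my solution.

    // Inside the Service Fabric service class (this.Context is the ServiceContext).
    // Section and parameter names below are assumed for illustration.
    var configPackage = Context.CodePackageActivationContext
        .GetConfigurationPackageObject("Config");
    var section = configPackage.Settings.Sections["EventGridConfig"];

    string topicEndpoint = section.Parameters["TopicEndpoint"].Value;
    string topicKey = section.Parameters["TopicKey"].Value;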

Next step was creating a small class library in the solution to house the following items:

  • The Event class which represents the Event Grid events schema
  • A LogEvent class which represents the “Data” element in the Event schema
  • A utility class which includes the static SendLogEvent method
  • A LogEventType enum to define logging severity levels  (ERROR|WARNING|INFO|VERBOSE)

To see an example of how to create the Event class, refer to fellow Azure MVP Eldert Grootenboer’s excellent post.  The only changes I made were to assign the properties for my custom LogEvent, and to add a static method for sending a collection of Event objects to Event Grid (notice how the Event.Subject field is a concatenation of the Application Name and the LogEventType – this will be important later on):

image

image
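
Since the original code screenshots haven’t survived, here is a rough sketch of the shape of those classes and the send method. The property names mirror the Event Grid event schema, but the LogEvent members, the helper name SendEventsAsync and the use of HttpClient are illustrative assumptions rather than the exact code from my solution.

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Serialization;

    // Represents the "Data" payload carried inside each event
    public class LogEvent
    {
        public string ApplicationName { get; set; }
        public string ServiceName { get; set; }
        public string Message { get; set; }
    }

    // Mirrors the Event Grid event schema (id, subject, eventType, eventTime, data)
    public class Event
    {
        public string Id { get; set; }
        public string Subject { get; set; }      // "<ApplicationName>-<LogEventType>"
        public string EventType { get; set; }
        public DateTime EventTime { get; set; }
        public LogEvent Data { get; set; }
    }

    public static class EventGridPublisher
    {
        private static readonly HttpClient httpClient = new HttpClient();

        // POSTs a batch of events to the custom topic endpoint, authenticating
        // with the topic key via the "aeg-sas-key" header
        public static async Task SendEventsAsync(
            string topicEndpoint, string topicKey, IEnumerable<Event> events)
        {
            var json = JsonConvert.SerializeObject(events, new JsonSerializerSettings
            {
                ContractResolver = new CamelCasePropertyNamesContractResolver()
            });

            var request = new HttpRequestMessage(HttpMethod.Post, topicEndpoint)
            {
                Content = new StringContent(json, Encoding.UTF8, "application/json")
            };
            request.Headers.Add("aeg-sas-key", topicKey);

            var response = await httpClient.SendAsync(request);
            response.EnsureSuccessStatusCode();
        }
    }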

The utility method that creates the collection and invokes this static method is pretty straightforward:

image
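
Building on the classes sketched above, the utility method would look something like this (again, the names and the "-" separator in the Subject are assumptions):

    public enum LogEventType { ERROR, WARNING, INFO, VERBOSE }

    public static class Logger
    {
        public static Task SendLogEvent(
            string topicEndpoint, string topicKey,
            string applicationName, LogEventType eventType, string message)
        {
            var logEvent = new Event
            {
                Id = Guid.NewGuid().ToString(),
                // The Subject concatenates the application name and severity; it is used
                // later as the table partition key and for the Logic App suffix filter
                Subject = $"{applicationName}-{eventType}",
                EventType = "LogEvent",
                EventTime = DateTime.UtcNow,
                Data = new LogEvent
                {
                    ApplicationName = applicationName,
                    Message = message
                }
            };

            return EventGridPublisher.SendEventsAsync(
                topicEndpoint, topicKey, new List<Event> { logEvent });
        }
    }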

This all makes it simple to embed logging calls into the application code:

image
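
For example, a hypothetical call site in the voting service might look like this (the repository and field names are made up for illustration):

    // Hypothetical voting method showing the embedded logging calls
    public async Task CastVoteAsync(string candidate)
    {
        try
        {
            await votingRepository.AddVoteAsync(candidate);
            await Logger.SendLogEvent(topicEndpoint, topicKey,
                "VotingApp", LogEventType.INFO, $"Vote recorded for {candidate}");
        }
        catch (Exception ex)
        {
            await Logger.SendLogEvent(topicEndpoint, topicKey,
                "VotingApp", LogEventType.ERROR, ex.Message);
            throw;
        }
    }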

Creating the Event Subscribers

Capturing All Events

The first topic subscription will be an Azure Function that will write all events to Azure table storage. Provided you’ve created your Function App in a region that supports the Event Grid preview (I’ve just created everything aside from the Service Fabric solution within the same resource group and location), you will see that there is already an Event Grid Trigger available to choose. Here is my configured trigger:

image

As you can see, I’ve also configured a Table Storage output. The code within this function creates a record in the table using the Event.Subject as the partition key and the Event.Id as the row key:

image
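
Since the function code screenshot is gone too, here is an approximate reconstruction as a C# script, assuming the preview-era run.csx style with a JObject trigger payload and a table output binding named outputTable; the entity shape is illustrative:

    #r "Newtonsoft.Json"

    using Newtonsoft.Json.Linq;

    public class LogEntity
    {
        public string PartitionKey { get; set; }
        public string RowKey { get; set; }
        public string EventType { get; set; }
        public string Message { get; set; }
    }

    public static void Run(JObject eventGridEvent, ICollector<LogEntity> outputTable, TraceWriter log)
    {
        // Partition by the event Subject (e.g. "VotingApp-ERROR"), use the event Id as the row key
        outputTable.Add(new LogEntity
        {
            PartitionKey = (string)eventGridEvent["subject"],
            RowKey       = (string)eventGridEvent["id"],
            EventType    = (string)eventGridEvent["eventType"],
            Message      = (string)eventGridEvent["data"]?["message"]
        });

        log.Info($"Logged event {eventGridEvent["id"]}");
    }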

Using the free Azure Storage Explorer tool, we can see the output of our testing:

image

Alerting on ERROR Events

Now that we’ve completed one of the two subscriptions for our solution, we can create the other subscription, which will filter on ERROR events and raise an alert by sending an email notification.

The first step is to create the Logic App (in the same region as the Event Grid) and add the Event Grid Trigger. There are a few things to watch out for here:

  • When you are prompted to sign in, the account that your subscription belongs to may or may not work. If it doesn’t, try creating a Service Principal with contributor rights for the Event Grid topic (here is an excellent article on how to create a service principal)
  • The Resource Type should be Microsoft.EventGrid.topics
  • The Suffix field contains “ERROR” which will serve as the filter for our events
  • If the Resource Name drop-down list does not display your Event Grid topic at first, type something in, save it and then click the “x”; the list should hopefully appear. It is important to select from the list as just typing the display name will not create the necessary resource ID in the topic field and the subscription will not be created.

You can then follow this with an Office365 Email action (or any other type of notification action you prefer). There are four dynamic properties that are available from the Event Grid Trigger action (Subject, ID, Event Type and Event Time):

image

After saving the Logic App, check for any errors in the Overview blade, and then check the Overview blade for the Event Grid Topic – you should see the new subscription created there:

image

Finally, we can test the application. My Voting demo service generates an exception (and an ERROR logging event) when a vote is cast for a null/empty candidate (see the ERROR entry in the table screenshot above). This event now triggers an email notification:

image

Summary

So this example may not be the niftiest logging application on the market (especially considering all of the excellent logging tools that are available today), but it does demonstrate how easy it is to get up and running with Event Grid. You’ve seen an example of using a custom publisher and two built-in subscribers, including one with intelligent filtering. To see how to write a custom subscriber, have a look at Eldert’s post “Custom Subscribers in Event Grid” where he uses an API App  subscriber to write shipping orders to table storage.

Event Grid is enormously scalable and its consumption pricing model is extremely competitive. I doubt there is anything else quite like this on offer today. Moreover, there will be additional connectors coming in the near future, including Azure AD, Service Bus, Azure Data Factory, API Management, Cosmos DB, and more.

For a broader overview of Event Grid’s features and the capabilities it brings to Azure, have a read of Tom Kerkhove’s post “Exploring Event Grid”. And to understand the differences between Event Hub, Service Bus and Event Grid, Saravana Kumar’s recent post sums it up quite nicely. Finally, if you want to get your hands dirty and have a play, Microsoft has provided a quickstart page to get you up and running.

Happy Eventing!

Great Experience at INTEGRATE 2017 in London!

Last week I had the privilege not only of attending the INTEGRATE 2017 conference in London, but of presenting as well. A huge thanks to Saravana Kumar and BizTalk360 for inviting me as a speaker – what a tremendous honour and thrill to stand in front of nearly 400 integration enthusiasts from around the world and talk about Hybrid Connectivity! Also, a big thanks to Mexia for generously funding my trip.

With 380+ attendees from 52 countries around the globe, this is by far the biggest Microsoft integration event of the year. Of those 380, only four of us that I know of came from APAC: fellow MVP speakers Martin Abbott from Perth and Wagner Silveira from Auckland NZ, as well as Cameron Shackell from Brisbane who manned his ActiveADAPTER sponsor stand. Wagner would have to take the prize for the furthest travelled with his 30+ hour journey!

I’ve already published one blog post summarising my take on the messages delivered by Microsoft (which accounted for half of the sessions at the event). This will soon be followed by a similar post with highlights of the MVP community presentations, which in addition to BizTalk Server, Logic Apps, and other traditional integration topics also spanned into the new areas of Bots, IoT and PowerApps.

Aside from the main event, Saravana and his team also arranged a few social events, including networking drinks after the first day, a dinner at Nando’s for the speakers, and another social evening for the BizTalk360 partners. They also presented each of the BizTalk360 product specialists with a beautiful award – an unexpected treat!

You have to hand it to Saravana and his team – everything went like clockwork, even keeping the speakers on schedule. And I thought it was a really nice touch that each speaker was introduced by a BizTalk360 team member. Not only did it make the speakers feel special, but it provided an opportunity to highlight the people behind the scenes who not only work to make BizTalk360 a great product but also ensure events like these come off. I hope all of them had a good rest this week!

As with all of these events, one of the things I treasure the most is the opportunity to catch up with my friends from around the globe who share my passion for integration, as well as meeting new friends. In my talk, I commented about how strong our community is, and that we not only integrate as professionals but integrate well as people too.

01-SundayNight-Mikael  21-Pizza_TomCanter

Arriving a day and a half before the three day event, I had hoped to conquer most of the jet-lag early on. But alas, the proximity to the solstice in a country so far North meant the sun didn’t set until past 10:30pm while rising just before 4:30am – which is the time I would involuntarily wake up each day no matter how late I stayed up the night before! Still, adrenalin kept me going and the engaging content kept me awake for every session.

And no matter what… there was always time for a beer or two!

22-PostPizza_TomCanter

I look forward to the next time I get to meet up with my integration friends! If you missed the event in London, you’ll have a second chance at INTEGRATE 2017 USA which will be held in Redmond on October 25-27. And of course, if you keep your eyes on the website, the videos and slides should be published soon.

(Photos by Nick Hauenstein, Dan Toomey, Mikael Sand, and Tom Canter)

The New Azure Hybrid Connections

(This post was originally published on Mexia’s blog on 19th June 2017)

Microsoft recently announced that Azure BizTalk Services (MABS) is officially being retired. This was no great surprise, as those who actually used this service and its VETER pipelines to build integrations were well aware that the tooling was cumbersome, the DevOps story was terrible, scalability was severely limited, and the management capabilities left much to be desired. Logic Apps and the Enterprise Integration Pack have already far surpassed the capabilities of MABS for cloud-based integration and B2B (EDI) scenarios. However, the one really useful feature of MABS was the free Hybrid Connections capability – free because this feature never made it out of preview mode.

Hybrid Connections allowed you to easily connect your Web App or Mobile Service to an on-premises resource without making any changes to your corporate network, traversing NATs, routers, firewalls etc. with a purely codeless solution. In fact, you could literally “lift & shift” your existing on-prem website to Azure and not even have to alter the connection string to your database. Moreover, it worked at the transport layer so there was no dependency on WCF or .NET. I was so intrigued by the capabilities of this service that I authored a Pluralsight course on it, as well as creating a webcast and writing several blog posts.

With the obvious signs over the past year or so that MABS was on its way out, this had us wondering what would happen to Hybrid Connections. Other non-network-related technologies like Service Bus Relay and the newer On-Premises Data Gateway certainly offer some viable alternatives, but nothing that permitted the same flexibility as Hybrid Connections. Fortunately, late last year we got our answer – the new Azure Relay.

A New Offering

Azure Relay became generally available on 27 March 2017, less than five months after the preview was announced. This service actually comprises two capabilities: the WCF Relay (which is the new name of the existing Service Bus Relay), and the new version of Hybrid Connections. The new Hybrid Connections is everything that the former version was, but much more:

  • It is no longer hosted in a sunsetted technology (lives in Azure Service Bus)
  • A published API means that the capability is no longer confined to Azure Web Apps and Mobile Services
  • Reliance on web sockets means it is truly a cross-platform solution

In my Pluralsight course and in my previous webcast, I showed how easy it was to enable a single Azure-hosted web site to talk to two separate on-premises resources (a web service and a SQL Server database). That capability exists in the new Hybrid Connections and can be set up in exactly the same way; a convenient downloadable manager agent can be installed in seconds, completing the listener setup and allowing you to flow messages into your network. I was easily able to recreate the same demo scenario from my webcast with the new version.

But even more compelling was the experience of using the API to build a more flexible solution, for example connecting an Azure-hosted VM to an on-premises resource. Here, the supplied samples on GitHub (conveniently available for both .NET and Node) really prove the extensive capabilities of this service.

Starting Simple

The simple sample introduces you to the Hybrid Connections API by building a simple relay that bi-directionally exchanges blocks of text over a connection. I had this up & running in minutes just by deploying the server app on my local Hyper-V hosted VM and deploying the client application on an Azure-hosted VM. There were a few minor glitches to get over, for example how to correctly format the SQL Server connection string to talk to the local SQLEXPRESS instance I had running. This post helped me realise that I needed to explicitly include the port number in the connection string, as well as ensure that the TCP/IP settings for SQLEXPRESS used a static IP address (which it does not by default).
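
Setting those glitches aside, here is a much-reduced sketch of what the listener and sender sides of the simple sample boil down to. The namespace, connection name and key values are placeholders, and the code is a hedged approximation of the GitHub sample rather than a copy of it:

    using System;
    using System.IO;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Azure.Relay;

    class RelayTextDemo
    {
        // Placeholder values: use your own relay namespace, hybrid connection name and key
        const string RelayNamespace = "myrelay.servicebus.windows.net";
        const string ConnectionName = "simple-hc";
        const string KeyName = "RootManageSharedAccessKey";
        const string Key = "<primary-key>";

        static void Main(string[] args) =>
            (args.Length > 0 && args[0] == "listen" ? RunListenerAsync() : RunSenderAsync())
                .GetAwaiter().GetResult();

        // Listener side: runs on the on-premises machine and echoes a line of text
        static async Task RunListenerAsync()
        {
            var tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(KeyName, Key);
            var listener = new HybridConnectionListener(
                new Uri($"sb://{RelayNamespace}/{ConnectionName}"), tokenProvider);
            await listener.OpenAsync(CancellationToken.None);

            var relayStream = await listener.AcceptConnectionAsync();
            var reader = new StreamReader(relayStream);
            var writer = new StreamWriter(relayStream) { AutoFlush = true };
            string line = await reader.ReadLineAsync();
            await writer.WriteLineAsync($"Echo: {line}");

            await relayStream.CloseAsync(CancellationToken.None);
            await listener.CloseAsync(CancellationToken.None);
        }

        // Sender side: runs on the Azure-hosted machine and sends a line of text
        static async Task RunSenderAsync()
        {
            var tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(KeyName, Key);
            var client = new HybridConnectionClient(
                new Uri($"sb://{RelayNamespace}/{ConnectionName}"), tokenProvider);

            var relayStream = await client.CreateConnectionAsync();
            var writer = new StreamWriter(relayStream) { AutoFlush = true };
            var reader = new StreamReader(relayStream);
            await writer.WriteLineAsync("Hello from the cloud");
            Console.WriteLine(await reader.ReadLineAsync());

            await relayStream.CloseAsync(CancellationToken.None);
        }
    }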

Of course you need to first create the Hybrid Connection in Azure, which you do in the portal by first creating an Azure Relay:

image

This sets up the Service Bus namespace that is associated with the relay artefacts. Then you can create the Hybrid Connection:

2017-05-29_20-40-59

2017-05-29_20-44-47

By going into the Shared Access Policies menu item for the Relay, you can copy the Primary Key for the “RootManageSharedAccessKey” – a necessary item for configuring the applications that will use the relay. (Of course you can and should configure your own policies for Manage, Send and Listen – but I didn’t for this example).

When you open any of the solutions from the GitHub samples, you will need to update the Microsoft.Azure.Relay NuGet package before compiling the solution:

2017-05-29_20-50-31

Then it’s a matter of copying the solution to your servers, running the Client app on the Azure hosted machine and the Server app on the on-prem machine. I was impressed by the speed of the connection. Messages typed into the client appeared instantly in the server console window with absolutely no perceptible delay! Given the fact that under the covers, the relay uses Service Bus queues to transfer messages, I would have expected and happily tolerated a slight latency here – but there was none! I even opened multiple instances of the client to see how that would work:

2017-05-29_21-16-17

As expected, all of the messages were visible in the server-hosted console, along with logging of sessions opening and closing:

2017-05-29_21-16-46

Although the application in this format isn’t particularly useful for any production scenarios I can think of, it certainly proves that hybrid connectivity is no longer limited to Web Apps.

Port Bridging

The next step was to try out the Port Bridge sample, which allows you to set up multiplexed socket tunnels that bridge TCP and Named Pipe socket connections across the relay. This is an amazingly flexible demonstration: once you deploy the code, you can add SSH-like connections through configuration only, including imposing firewall-like restrictions – all without any involvement of your physical network configuration.

figure1

Again, I was keen to prove that this would allow connectivity across multiple nodes and ports, so I returned to my previous demo example and set up connections to both a locally hosted WCF service (on port 80) and my SQL Server database (on port 1433). The WCF service was really nothing more than a simple echo function that also outputs the name of the local server (same as used in my Guestbook demo in the webcast):

image

And for the database connectivity, all I needed to do was prove that a simple query would return the appropriate number of rows in a table:

image

I created two console test apps:

  1. One that called the WCF service
  2. One that executed a scalar query of the number of rows in the Refreshments table above

After testing both console apps locally on the Hyper-V VM, I then deployed these to my Azure VM, along with the portbridge solution (which was also deployed locally). The next step was to create the Hybrid Connection in Azure, which I did by adding to the existing Relay. The trick here was that the name of the relay (normally anything you like) had to be the same as your target host name (in this case, it was “devdan10”, the name of the VM hosted in my local Hyper-V).

Getting the configuration right took me a few goes, but I eventually got there. Here is the configuration for the on-prem Server Agent console:

image

Notice on Line 10 that I’ve referenced the name of my target host (“devdan10”) and listed the port numbers which I will allow to be opened to the relay.

And here is the configuration for the Client Agent service running in Azure:

image

The local and remote ports happen to be the same here (lines 10 & 16), but they could be different – allowing you to map different ports between the remote client and the target local machine. The firewall rules in this case ensure that the requests come from the client machine – but these can also be adjusted to be as lenient or strict as desired.

There were no changes at all required for the WCF test client; in fact, I was able to use “localhost” as the hostname in the client endpoint definition (line 53)!

image

One thing I did need to do was ensure that the connection string in the SQL test client used the TCP protocol and (again) a reference to the local machine rather than the target host name (this is presumably because the portbridge client service maps everything from the local machine on that port to the target remote server):

image
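
For the curious, the resulting connection string and query looked roughly like this; the server, database and credentials shown are placeholders rather than my actual values:

    using System;
    using System.Data.SqlClient;

    class SqlOverPortBridge
    {
        static void Main()
        {
            // "tcp:" and the explicit port force the request through the local Port Bridge
            // client agent; SQL authentication is used because Windows/AD authentication
            // is not supported across the relay. Database and credentials are placeholders.
            var connectionString =
                "Data Source=tcp:localhost,1433;" +
                "Initial Catalog=GuestbookDb;" +
                "User ID=testuser;Password=<password>;";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var command = new SqlCommand("SELECT COUNT(*) FROM Refreshments", connection))
                {
                    int rowCount = (int)command.ExecuteScalar();
                    Console.WriteLine($"Rows in Refreshments: {rowCount}");
                }
            }
        }
    }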

Note that you do need to use SQL Authentication rather than Windows Authentication, as there is no support for Active Directory authentication here.

With that in place I was able to run tests with both clients to prove connectivity over the ports I had configured for forwarding. The first test screenshot shows the WCF client test app successfully invoking the service remotely. The window in the background is the console for the Port Bridge Client Agent that runs on the Azure machine, forwarding requests to the remote on-prem machine by way of the Hybrid Connection relay:

image

The Port Bridge Service Agent console window runs on the on-prem machine and logs the connections:

image

By the way, it is worth mentioning that both the Port Bridge Service Agent and the Client Agent can be installed and run as a Windows service.

The second test is the same thing but with the SQL Server database test client, correctly retrieving the row count from the on-prem machine:

image

How good is that? You now have the capability to open a port-based bridge between cloud and on-premises machines simply by adding configuration to the client and server agent services, meaning you can connect your IaaS VMs and cloud services [almost] as easily as you connect your Web Apps and Mobile Apps.

In a week’s time I will be speaking about Hybrid Connectivity at the INTEGRATE 2017 conference in London. Unfortunately there won’t be time for demos, but this blog post might make for a handy follow-up reference.

Microsoft MVP – It’s All About Community

(This post was originally published on Mexia’s blog on 07 April 2017)

Last weekend I had the great privilege of attending my first Microsoft MVP event – the MVP Community Connection held in Sydney. This was an invitation to all Australian MVPs to come together over two days to network, receive central communication from the program managers, and learn some best practices in how we can better serve the community.

First I should explain what an MVP is. The Microsoft Most Valuable Professional award is presented to recognised experts in the technical community who regularly and voluntarily share their passion and knowledge with others, including Microsoft itself. The award celebrates an individual’s deep commitment to serving the community and promoting awareness of Microsoft’s great products through a variety of channels – be it running user groups, speaking at events, blogging, mentoring, answering questions on forums, creating and sharing free software, etc. Aside from recognition, the award includes a number of benefits and privileges, including direct access to the Microsoft product teams via email distribution lists and Yammer, invitations to early previews and Product Group Interaction (PGI) meetings, exam vouchers, free software and subscriptions, and more. In return, Microsoft expects us to use these benefits and privileges to facilitate our continued involvement in the community and also to provide valuable feedback to Microsoft to help them design and build better products. The award lasts for one year, after which it may be renewed if the candidate has continued to demonstrate exemplary commitment and service.

The award is always granted in a particular Microsoft technological area or product for which the candidate has demonstrated expertise. My award was presented in February this year in the category of Azure – one of the broadest areas which happens to include enterprise integration. This was presumably in recognition of my efforts in running two user groups, speaking at multiple events (including Ignite Australia), writing posts on both Mexia’s and my personal blog, and promoting Microsoft events and announcements on social media. Yet, I always feel that I should be doing much more to serve an integration community that is always so willing to share and so appreciative of the efforts of those who do share!

This latest MVP event in Sydney was designed to allow us to come together and share ideas about how to improve our community leadership. It began with some fun networking activities on Friday afternoon & evening, followed by a more formal schedule of sessions on Saturday. It was a nice touch to arrive at the venue on Saturday morning and be presented with a personalised appreciation card complete with a custom Lego figurine. It quickly became evident that each Lego figurine was carefully selected to match the physical characteristics of the recipient in terms of hair colour, gender, etc!

image

After a warm welcome by Community Program Manager Lana Montgomery and a program update by Chris Olson from the DX team, there was a keynote about Digital Transformation by Technical Evangelist Vaughan Knight. This was followed by a series of enlightening “Skills Sessions” presented by prominent members of the MVP community including Troy Hunt, Orin Thomas, Marc Kean, Robert Crane, Leon Tribe and David Gardiner, spanning topics from developing your personal brand to blogging like a pro to recharging your user group. These were followed by some round-table discussions on various topics relating to community involvement, such as growing technical communities, embracing emerging technologies, supporting entrepreneurs and start-ups, and assisting young student technologists. The day wrapped up with some closing remarks by Lana and then a trip to the Helm Bar in Darling Harbour for some refreshing drinks before heading to the airport.

The event was not only well organised, informative and enjoyable, but I also found it truly inspiring to meet up with so many other enthusiastic technologists! While you might expect it to be intimidating to be a newbie in the company of so many extraordinarily brilliant and accomplished individuals, I actually found everyone to be extremely friendly and welcoming. I made many new friends that day as well as catching up with some old friends like Bill Chesnut, Mick Badran, Shane Hoey and Martin Abbott. Kudos to Lana for her exceptional organisational skills and her contagious positive energy! Really looking forward to more occasions like these, as well as the continuing support from Microsoft which helps us MVPs better support the wider community.

Australian MVP Community

Photos courtesy of Lana Montgomery

Great Week at Ignite Australia!

Last week I had the opportunity to attend Microsoft Ignite on the Gold Coast, Australia. Even better – I had a free ticket on account of agreeing to serve as a Technical Learning Guide (TLG) in the hands-on labs. This opportunity is only open to Microsoft Certified Trainers (MCTs) and competition was evidently keen this year – so I am glad to have been chosen. Catching up with fellow MCTs like Mark Daunt and meeting new ones such as Michael Schmitz was a real pleasure. Of course the downside was that I missed quite a few breakout sessions during the times I was rostered. Nevertheless, I still got to see some of the most important sessions to me, particularly those that centred around Azure and integration technologies. Please have a read of my summary of these on my employer’s blog.

By far this was my best Australian Ignite/Tech-Ed event experience for many reasons, including:

  1. The Pro-Integration team from Redmond came all the way out to Australia to show everyone what the product group is doing with Logic Apps, Flow, Service Bus, and BizTalk Server
  2. I was chosen to present an Instructor-Led Lab in Service Fabric – my first ever speaking engagement at Ignite
  3. I had the rare opportunity to catch up with some fellow MVPs from Perth and Europe.

It was truly phenomenal to see enterprise integration properly represented at an Australian conference, as it is typically overlooked at these events. In addition to at least four breakout sessions on hybrid integration, Scott Guthrie actually performed a live demo of Logic Apps in his keynote! This was a good shout-out to the product team that has worked so hard to bring this technology up to the usability level it now enjoys. I’m glad that Jim Harrer, Jeff Holland, Jon Fancey and Kevin Lam were there to see it!

Teaching the lab in Service Fabric was a thrilling experience, but not without some challenges. The lab itself was broken and required a re-write of the second half, which I had pre-prepared and uploaded to OneDrive here so the students could progress. The main lab content is only available to Ignite attendees; however, if you want to have a go at a similar lab, you can try these ones available from Microsoft:

Despite the frustration that some attendees expressed about the lab errata and the poor performance of the environment, I was pleased that all the submitted feedback relating to the speaker was very positive!

Finally, perhaps the best part of events like these is the ability to catch up with old friends and meet some new ones. It was a pleasure to hang out with Azure MVP Martin Abbott from Perth and meet a few of his colleagues. It was also great to see Eldert Grootenboer and Steef-Jan Wiggers from the Netherlands, who happened to travel to Australia this month on holidays and to speak at some events. Steef-Jan also took time to include me in a V-Log series he’s been working on with various integration MVPs, recording his 3-minute interview with me at the top of Mount Coot-tha on a sunny Brisbane Saturday! And Mexia’s CEO Dean Robertson and I got to enjoy a nice dinner out with the Microsoft product group and the MVPs.

All good things must come to an end, but it was definitely a memorable week! Now it’s time to start getting ready for the Brisbane edition of the Global Integration Bootcamp on Saturday, 25th March, to be followed not long after by the Global Azure Bootcamp on Saturday 22nd April! I’ve got a few demos and presentations to prepare – but now with plenty of inspiration from Ignite!

Month of Achievements!

“When it rains, it pours.”

Well, I must say I’ve had a pretty remarkable run the past few weeks! I can’t remember any point in my professional life where I’ve enjoyed so much reward and recognition in such a short time span.

For starters, after six weeks and probably 50-60 hours of study, I managed to pass my MS 70-533 Implementing Azure Infrastructure Solutions exam. Since I’d already passed the MS 70-532 Developing Azure Solutions exam a couple of months earlier, this earned me the coveted Microsoft Certified Solutions Associate (MCSA) qualification in Cloud Platform. I now have just one more exam to pass to earn the Microsoft Certified Solutions Expert (MCSE) qualification.

Then just last week I was granted my first Microsoft Most Valuable Professional (MVP) Award in Azure! This is incredibly exciting to say the least, and not only a surprise to me to be counted among so many incredibly accomplished leaders in the industry, but also a surprise to many others because of the timing. Microsoft has revamped the MVP program so that awards will now be granted every month instead of every quarter, and everyone will have their review date on July 1st. I’m looking forward to using my newfound benefits to contribute more to the community.

And if that wasn’t enough, this week Mexia promoted me to the position of Principal Consultant! Aside from the CEO (Dean Robertson), I am the longest-serving employee, and the journey over the last seven and a half years has been nothing short of amazing. Seeing the company grow from just three of us into the current roster of nearly forty has been remarkable enough, but I’ve also watched us become a Microsoft Gold Partner and win a plethora of awards (including ranking #10 in Australia’s Great Place to Work competition) along the way. Working with such an awesome team has afforded me unparalleled opportunities to grow as an IT professional, and I will strive to serve both our team and our clients to the best of my ability in this role.

So what else is on the horizon? Well for starters, next week I join the speaker list at Ignite Australia for the first time delivering a Level 300 Instructor-Led Lab in Azure Service Fabric. I’m also organising and presenting for the first ever Global Integration Bootcamp in Brisbane on 25th March, and doing the same for the fifth Global Azure Bootcamp on 22nd April. And somewhere in there I need to fit in that third Azure exam… It’s going to be a busy year!

Busy Times…Again

You may have noticed that I haven’t been too active on the social media / blogging front of late. It certainly isn’t because there isn’t much to write about…especially when you consider the release of BizTalk Server 2016 (including the Logic Apps Adapter), the General Availability of Azure Functions, and many other integration events leading up to these! And for those on the certification path, there’s news of the refresh of the Azure exams as well.

In fact, it is that very last item that accounts for a good deal of my scarcity in the blogging world of late. My employer is keen for as many of us as possible to earn the Microsoft Certified Solutions Expert (MCSE) accreditation in Azure. I’ve already passed the first of three required exams, MS 70-532 Developing Microsoft Azure Solutions, after several weeks of after-hours study (hours that might have been spent blogging). That accomplishment has earned me this nice little badge:

I’m now studying for the next exam, MS 70-533 Implementing Microsoft Azure Infrastructure Solutions. Passing this exam will earn me a Microsoft Certified Solutions Associate (MCSA) qualification in Cloud Platform. However, it won’t stop there, as I’ll need to pass one more exam – MS 70-534 Architecting Microsoft Azure Solutions – in order to attain the coveted MCSE in Cloud Platform and Infrastructure. All I can say is that I’ll be doing a lot of studying over the Christmas holidays…

Aside from studying for exams, I’ve also been heavily tasked at work as Mexia has had a profoundly successful sales year in 2016 – which translates into an overload of work! No wonder we’re heavily recruiting right now, looking for those “unicorns” who can help us remain the best integration consultancy in Australia. There has been a fair amount of travel lately, and as Mexia’s only Microsoft Certified Trainer (MCT) I will continue to deliver courses in BizTalk Server Development, BizTalk Server Administration, BizTalk360, and Azure Readiness. That means many more hours preparing all of that content.

But it hasn’t stopped me from speaking, at least not entirely. Aside from regular presentations at the Brisbane Azure User Group (including this one on Microsoft Flow), I’ve also been a guest presenter at Xamarin Dev Days in Brisbane where I talked about Connected and Disconnected Apps with Azure Mobile Apps.

Looking forward to writing posts more regularly again after this exam crunch is over. There’s a lot of exciting things happening in the integration world right now!

New Online API Mapping Tool

One of the many challenges with an integration project is typically the mapping of messages from one API to another. The difficulty most often lies not with the technical implementation (although some former projects mapping SAP iDocs to EDI X12 are still giving me nightmares), but rather with forming the specification of the mapping itself, including understanding the semantic meaning behind each element. This is difficult because it requires expert knowledge of both the source and target systems, as well as an analyst who can correctly “draw the connecting line” between the two. The correct end result is only achieved through significant collaboration amongst the relevant parties.

The BizTalk Mapper goes a long way to facilitating this task with its graphical mapping interface. Aside from providing the developer a means of rapidly implementing a transformation, it also serves as a visual representation of the mapping that can be understood by a business analyst (if not too complex):

biztalkmap

(image courtesy of MSDN)

There are two problems with this approach, however:

  1. It requires BizTalk Server, which is not only expensive, but also may be overkill for a solution that can easily be implemented in WCF, REST, or another platform;
  2. The mapping must be implemented by a developer before it can be shown to analysts and business users for discussion and validation. This usually entails a number of iterative cycles until the mapping is correct.

Enter api-map.com – a new free online tool created by my colleague Joseph Cooney specifically to address these particular challenges. api-map provides a medium to formulate, display and share mapping documentation which can eventually be handed over to a developer for implementation on any chosen platform.

As a first step, the tool allows you to upload schemas (either JSON or XML) with the ability to display, edit and annotate them:

image

These schemas can then be used to define mappings, even providing automatic  hints along the way using very clever heuristics. You can specify direct mappings or indicate that a transformation is required – including a description of the necessary condition and/or logic that defines the transformation. You can also map multiple source elements to a single target, and specify constant values to be assigned where appropriate.

Once this is completed, you can then display the mapping in a clear visual diagram that is easily understood by any analyst. Even better, you can combine multiple diagrams into one composite “end-to-end” view – providing traceability which you cannot achieve within BizTalk maps. This is incredibly useful in the situation where canonical business schemas are employed within an ESB (a common scenario for most of my projects). And by selecting any element, you get an independent end-to-end view of all the elements involved in that mapping:

SNAGHTML8a9568c

Finally, when everything has been sorted, you can export the mapping to a handy Excel spreadsheet, serving as documentation within a source repository for developers to work from:

image

A few other nifty features include the ability to tag items to make them searchable, join teams in order to share project artefacts, and an option to attach images of a user interface to clarify the association of an element with a system control.

Watch Joseph’s video to see a live demonstration of the tool. It’s still in beta and Joseph is continually adding new features, but already I believe this will be a handy utility on many of my upcoming projects!