Persistence and Recoverability on the Microsoft Platform

Recently I added an article to the Integration Playbook comparing the approaches used for durability, persistence and retry across the various Microsoft technologies such as BizTalk, Logic Apps, Event Grid, Event Hubs, Service Bus Messaging and Functions.

You can read more here – https://www.integration-playbook.io/docs/durable-messaging-on-the-microsoft-platform

Message Correlation on Microsoft Integration Platform

I've recently added an article to the Integration Playbook talking about how messages can be correlated between processing instances. In particular it compares the approach in BizTalk against the approach you can take by combining Logic Apps and Service Bus.

More info – https://www.integration-playbook.io/docs/message-correlation

Azure AD Set Passwords to Not Expire

This blog post is as much a reminder for myself as anything. I had a need to mark some service accounts in Azure AD so that their passwords don't expire.

We had a few service accounts used in a couple of places and we wanted a controlled process for changing their passwords.

To do this we did the following:

  • Create a group to associate all of the service accounts for our project for easy management
  • Add all of the service accounts to that group
  • Run a script which will check every member of the group and change the password policy so the password doesn't expire

I had a look online and couldn't really find a resource showing how to do this which didn't use the old Office 365 MSOnline PowerShell functionality, so I thought I'd share this for anyone else who might find it useful.

Below is the script I used. I usually run it whenever we add a new service account where we want more granular control over changing the passwords for service accounts.

Set-ExecutionPolicy -ExecutionPolicy Unrestricted

install-module azuread
get-module azuread


function ProcessUsers([string] $groupName)
{
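    # Finds the service accounts group by name, then disables password expiry for every member of that group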
    Write-Host 'Processing Users Function started'
     
    $ServiceAccountsGroup = Get-AzureADGroup -SearchString $groupName -All $true
    Write-Host 'Group Found' $ServiceAccountsGroup.DisplayName
    Write-Host 'Group Found' $ServiceAccountsGroup.ObjectId


    $groupMembers = Get-AzureADGroupMember -ObjectId $ServiceAccountsGroup.ObjectId -All $true

    Foreach ($member in $groupMembers)
    {
        Write-Host $member.DisplayName

        $user = Get-AzureADUser -ObjectId $member.ObjectId
        
        Write-Host 'Pre-update Password Policy: ' $user.PasswordPolicies
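        # DisablePasswordExpiration stops the password expiring; setting PasswordPolicies back to 'None' reverts to the default policy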
        Set-AzureADUser -ObjectId $user.ObjectId -PasswordPolicies DisablePasswordExpiration

        $user = Get-AzureADUser -ObjectId $member.ObjectId
        Write-Host 'Post-update Password Policy: ' $user.PasswordPolicies
        Write-Host 'AccountEnabled: ' $user.AccountEnabled

        Write-Host ''
        Write-Host ''
    }

    Write-Host 'Processing Users Function Ended' 
}


$cred = Get-Credential
Connect-AzureAD -Credential $cred
ProcessUsers -groupName '<Group name goes here>'
Write-Host 'All Done'

Inserting lots of rows into SQL with a Logic App and a Stored Procedure

When you are working with APIs and Logic Apps and there are lots of rows of data involved, you will sometimes come up against the following problems:

  1. An API often pages the data once you go beyond a certain number of records
  2. When you want to insert lots of rows with a Logic App into SQL you will usually have a loop which iterates over a dataset and does inserts
    1. This takes a long time to execute
    2. There is a cost implication to your implementation when you pay for each action

I recently had a scenario in this space and used quite a cool approach to solve the problem which I wanted to share.

Scenario

The scenario I had started in Shopify. When I add products and collections to my online store in Shopify, I wanted a daily extract to synchronise these new products/collections to the Azure SQL database which I use for reporting with Power BI.

To achieve this I would have a Logic App with a nightly trigger which would take the following actions:

  • Clear down the table which records which product is in which collection
  • Extract all products in collections via the Shopify API
  • Insert them all into the SQL table

The end result is I have a table which has all of the products in each collection listed for my analysis.

At a high level the scenario looks like the below diagram:

Implementation

As I mentioned above, the problem is twofold here: when we query Shopify there may be thousands of products, so we need to use a paging approach to query their API; secondly, I want to insert into SQL in batches to minimise the number of action calls on SQL, to improve performance and reduce cost.

Let's look at how I did this.

Paging API calls to Shopify

When it comes to the Shopify API you are able to execute a GET operation against the collection and it will return the products within it. If you have lots of products you can get them in pages. I chose to get 250 at a time, and you need to pass a page index to the API as a query parameter. The below picture shows what a call to Shopify looks like with the paging parameters set.

Once I can make this single call I can then use a loop around the call to Shopify, but before I do this I need to know how many pages there are. I can do this by executing a GET against the collections API with the count extension on the URL. This returns the number of products in the collection. You can see this below.

From the response I can parse the count and then set a variable holding the number of pages, which I work out by dividing the number of products by the number of products I will get per page. I also add 1 to this so I get 1 more page than the division suggests, in case it is not a whole number (for example, 2,500 products at 250 per page gives div = 10, plus 1 = 11 pages). The calculation is shown below.

add(div(body('Parse_JSON_-_Count')?['count'], 250), 1)

Now I know the number of pages I can implement the loop where I will increment the page index each time until we have matched the number of pages. Within the loop we will get the next page of data from the API as shown in the picture below.
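To make the paging approach a bit more concrete, below is a rough PowerShell sketch of the same count-then-loop logic the Logic App implements. The Shopify URL paths, query parameters and auth header are placeholders rather than the exact API details, so check the Shopify API documentation before relying on them.

$baseUrl  = 'https://<your-store>.myshopify.com/admin'
$headers  = @{ 'X-Shopify-Access-Token' = '<access token>' }   # placeholder auth header
$pageSize = 250

# 1. Get the total number of products in the collection (the "count extension" call)
$countResponse = Invoke-RestMethod -Uri "$baseUrl/products/count.json?collection_id=<collectionId>" -Headers $headers

# 2. Work out the number of pages, adding 1 in case the division is not a whole number
$pages = [math]::Floor($countResponse.count / $pageSize) + 1

# 3. Loop over the pages, pulling 250 products at a time
for ($page = 1; $page -le $pages; $page++)
{
    $batch = Invoke-RestMethod -Uri "$baseUrl/products.json?collection_id=<collectionId>&limit=$pageSize&page=$page" -Headers $headers
    # each $batch is then handed to SQL as a single JSON payload (see the stored procedure below)
}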

SQL JSON Insert

It would be possible to just call the insert action for SQL in the Logic App, but if there are say 10,000 products then the loop will do 10,000 iterations, which will take quite a while to run, and there is also a cost associated with that. I wanted to look at options for inserting the data in batches. If I could insert the entire page returned from the API as a batch then, with my 250 records at a time, I could reduce the 10,000 iterations down to 40. That should be a lot less time and a much lower cost.

To do this I developed a stored procedure where I passed the entire JSON string from the API response to the stored procedure as an NVARCHAR(MAX) parameter. In the stored procedure I was fortunate that the format of the JSON in this case was very table/row like, making it easy to do this insert. I used SQL's OPENJSON feature and was able to insert the entire page of data from the API with a single insert statement, as you can see in the SQL below.
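As a rough illustration of the OPENJSON pattern, the stored procedure looks something like the sketch below. The table and JSON property names are made up for the example, and the T-SQL sits in a PowerShell here-string so it can be deployed with Invoke-Sqlcmd from the SqlServer module.

# A sketch of the OPENJSON-based stored procedure (illustrative table and column names only)
$createProc = @'
CREATE OR ALTER PROCEDURE dbo.InsertCollectionProducts
    @Json NVARCHAR(MAX)
AS
BEGIN
    INSERT INTO dbo.CollectionProduct (CollectionId, ProductId, Title)
    SELECT CollectionId, ProductId, Title
    FROM OPENJSON(@Json, '$.products')
    WITH (
        CollectionId BIGINT        '$.collection_id',
        ProductId    BIGINT        '$.product_id',
        Title        NVARCHAR(200) '$.title'
    );
END
'@

# Deploy the procedure (add authentication parameters as appropriate); the Logic App then calls it
# via the SQL connector's execute stored procedure action, passing the whole page of JSON as @Json
Invoke-Sqlcmd -ServerInstance '<server>.database.windows.net' -Database '<database>' -Query $createProc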

Summary

Once it was all put together I was able to run my Logic App to refresh my SQL database each night and the process took 10 seconds to copy across 2500 records. This took 10 iterations of the loop.

That makes for a Logic App which is nice and easy to support and run, and which does a good job in this case.

Accelerating Business Opportunities with Power Apps and Integration

Recently I have been looking at some opportunities to utilise the new model-driven capabilities in Power Apps. I spent some time at Integrate 2018 chatting to Kent Weare about some of its capabilities and realised it was a great fit for some of the architecture challenges we have. Before I go into some of the opportunities in a sample architecture, let's consider an existing setup.

Existing Architecture

In the existing architecture we have a cloud-hosted integration platform which the company uses to integrate partners into Dynamics CRM Online and some existing on-premises line-of-business applications. The cloud integration platform is able to support partners submitting data via multiple channels. In this case we have a traditional SFTP and batch-based mechanism which old-school partners still use. With this pattern we use BizTalk, where it excels, on the IaaS part of the platform to manage multiple partners submitting different file formats, all being converted to a canonical format; messages are then loaded into systems via helper functions on Azure which implement the service façade pattern.

You can see this in the diagram below represented by Partner B.

We also have partners who use more modern approaches to integration where we expose an API via Azure APIM which allows them to submit data which is saved to a queue. BizTalk will process the queue and reuse the existing functionality to load data into our core systems.

The Challenge

While we support 2 example channels in this architecture, we have a massive partner network with different capabilities, and some partners even use person-to-person and email-based interactions. Imagine a person in a call centre who is sent an email with some data, or a form in the post, and then types the data into systems manually.

As the application architecture expanded there were more systems these users would need to work with and we needed to find efficiencies to optimise the user entering data. The more records a user can enter in 1 day the bigger the potential cost savings.

The challenge with this was to provide a new form to enter data that was simple and quick. We initially looked at options like Microsoft Forms and Cognitio Forms, which could allow us to create forms to capture data, but they missed ticking boxes on some of the key non-functional requirements such as security and authentication. These options were good but too simple; we needed something with more features.

We do also have Dynamics CRM, but the key problem with that, like our other applications, is that it is tied to a product backlog, which means our changes and optimisations would need to fit within an agile release process that was delivering change in a complex system. What we really needed was a sandbox-type application where we could build a simple app without many dependencies, which would then integrate with our processes.

Proposed Architecture

Coming back to the discussion with Kent, I could see that model-driven Power Apps is really like a cut-down version of Dynamics, and looking at some of the sample apps and the apps people are building, you could see straight away this could be a great opportunity. The Power Apps environment allowed us to build some forms and a data model very quickly to model the data we need users to capture.

We then implemented a Logic App which would fire on the update of a record and check for a field being set to indicate that the record was ready to be published. The Logic App would extract the data from the Power App. The really cool bit was that I could use the Dynamics connectors in Logic Apps because the Power App is really just a Dynamics instance. The Logic App puts a message on a queue which is then used to reuse our existing integration.

The below picture represents the architecture from the perspective of the new Power App. Please note that to keep the diagram simple I have omitted the existing B2B SFTP and API integrations so that we can focus on the Power Apps bit.

From this point I now have a pretty simple Power App which allows these users to input data manually into our process, which we think can save a few minutes per record compared to manually keying the record in the old way.

The benefits of Power Apps though go way beyond just this. First off, the key to empowering rapid change is that it's an isolated app focusing on just this use case. I don't have to worry about all of the many features within a bigger CRM implementation. When it comes to implementing changes and regression testing, things are much simpler.

At the same time the licensing is slightly different. With Power Apps our users are using P1 licenses, which aren't that expensive and are good for users who just run the Power App. We use P2 Power Apps licenses for those users who need to administer and develop the Power App.

We also get the integration with Azure AD for free, so our users have a good authentication story. This was one of the challenges with our previously considered options. The products we looked at which provided out-of-the-box forms capability seemed to lack the ability to authenticate users, restrict access to just certain users, and then know who filled in which form. This is a key requirement.

When it comes to many of the other security scenarios, as existing Dynamics users we have already gone through the governance around what Dynamics is, how it works, its security, etc. The model-driven Power App seems to be just the same in terms of capabilities.

At one time we were considering building an ASP.NET app for our users, and when you consider everything PaaS on Azure offers for very little cost, that would seem an attractive option. But compared to these new, more powerful Power Apps, once you remove the considerations about hosting, security, custom coding, design experience, etc., you get so much out of the box that it's a compelling argument to try the Power App.

At this point Power Apps seems to be offering a great opportunity for us to build those utility applications and systems of engagement on an enterprise-ready platform but without lots of custom development. With the focus on delivering business value, there seem to be loads of places we could use this.

Hopefully we can provide more info about Power Apps as our journey progresses.

Discussions about BizTalk Support Product Lifecycle at Integrate 2018

At the recent Integrate 2018 summit the Q&A drew some contentious questions from the audience about the next version of BizTalk and when it is going to be released.  What was clear is that the product team's new approach of having a customer-feedback-driven backlog means they have been busy and successful in delivering changes to Logic Apps and the BizTalk feature packs, and having just completed those they have not yet planned the next major release of BizTalk.

Now that being said, the team should have expected these questions because they always come up and I think an answer of “we aren’t ready to talk about that yet and we will get back to you” would have been fine, but there was a bit of fluff around the answers given which resulted in the audience drawing their own conclusions in a negative way.  After such a great conference I found myself wishing the Q&A had never taken place as this miscommunication at the end sent a lot of people away with a degree of confusion.

With that said, in the pub later we were talking about the idea of product support lifecycles, and I have always felt the problem around Microsoft tech was that there is too much info out there on the subject, which is actually detrimental to the intention.  I decided to test this idea by looking at the support lifecycle for some other vendors.  First off let's recap Microsoft's position.

Microsoft BizTalk Server

Let's start with the link below, where community members have a nice, easy-to-follow interpretation of the Microsoft Support Lifecycle for BizTalk.

https://social.technet.microsoft.com/wiki/contents/articles/18709.biztalk-server-product-lifecycle.aspx

Version                 | Release Date | End of Mainstream Support | End of Extended Support
BizTalk Server 2016     | 12/01/2016   | 01/11/2022                | 01/11/2027
BizTalk Server 2013 R2  | 07/31/2014   | 07/10/2018                | 07/11/2023
BizTalk Server 2013     | 06/12/2013   | 07/10/2018                | 07/11/2023
BizTalk Server 2010     | 11/14/2010   | 01/12/2016                | 01/12/2021
BizTalk Server 2009     | 06/21/2009   | 07/08/2014                | 07/09/2019

You can see from the above table that there is still some kind of support available for 5 versions of BizTalk, covering up to 9 years from now.  Even a 9-year-old version of BizTalk still has extended support available for over 1 more year.

Now we have a picture of the Microsoft position, lets take a look at some of the other vendors out there.

Mulesoft

Below I have summarised some information from https://www.mulesoft.com/legal/versioning-back-support-policy

Version                   | Release Date    | End of Standard Support | End of Extended Support
4.1                       | March 20, 2018  | March 20, 2020 or later | March 20, 2022 or later
3.9                       | October 9, 2017 | October 9, 2019         | October 9, 2021
3.8 (long term supported) | May 16, 2016    | November 16, 2018       | November 16, 2021
3.7                       | July 9, 2015    | Nov 16, 2017            | Nov 16, 2019
3.6                       | Jan 15, 2015    | Jan 15, 2017            | N/A
3.5 (long term supported) | May 20, 2014    | July 15, 2016 *         | July 15, 2019 *

Points to note:

  • MuleSoft provides Standard Support for the latest released minor version of the Mule runtime.
  • Once a new minor version for a major version is released, the previous minor version will receive Standard Support for an additional 18 months. All minor versions for a major version will receive Standard Support for a minimum of 2 years.
  • Starting with Mule 3.7 and later, after Standard Support ends, MuleSoft will offer Extended Support for an additional 2 years. Mule 3.5 and 3.8 will receive Extended Support for a total of 3 years.
  • Extended Support versions are only available on CloudHub for applications already deployed on it
  • Once a new major version is released, MuleSoft will continue to offer Standard Support for at least one minor version of the previous major version for a minimum of 3 years.
  • Once a minor version is outside the Standard Support and Extended Support windows, MuleSoft will provide End of Life Support.

My interpretation of the Mulesoft position compared to BizTalk is that the current version of Mule has committed standard support for 2 years less, and extended support for 5 years less, than the current version of BizTalk.

Jitterbit

If we take a look at Jitterbit, their documentation states, “Jitterbit is committed to supporting a version for 12 months from the release date”.  So effectively each release is under support for a committed 12 months only.  It may be longer in reality but if we look at the example of their last version to end of life you can see below it was only supported for 1 year which seems fairly consistent.

Jitterbit Harmony Local Agent | 8.23 | 2017-05-05 | 2017-05-07 | 2018-05-07

You can find more info on the link below.

https://success.jitterbit.com/display/DOC/End-of-Life+Policy

My interpretation of the comparison of BizTalk vs Jitterbit is that Jitterbit are only committing to year on year support versus long term commitments from Microsoft.

Oracle Fusion Middleware

From the below link I was able to find out some basic info about Oracle Fusion Middleware.

http://www.oracle.com/us/support/library/lsp-middleware-chart-069287.pdf

Version                        | Release  | Main Support | Extended Support
Fusion Middleware 12c (12.2.x) | Oct 2015 | April 2016   | Oct 2020

Talend

I could not find any specific information on the current versions or their support lifecycles; however, the below links provide some background info.

https://www.talend.com/legal-terms/us-support-policy/

http://talend.tips/release-history/

Dell Boomi

I could not find any information online about support life cycle commitments or versions released.  I did find some information on the release notes for each release which is available below to give an idea of how often change occurs.

http://help.boomi.com/atomsphere/GUID-0F0CDC3D-855B-411D-BB1F-65DC8042AB88.html

Snap Logic

I could not find any information on the support life cycle policies for Snap Logic.  There are release notes available on the below link:

https://docs-snaplogic.atlassian.net/wiki/spaces/SD/pages/49855/SnapLogic+Release+Notes

Tibco

I found the below Tibco document (dated March 2018) which indicates that the latest version of Tibco is under support until 30-Nov 2020.

http://supportinfo.tibco.com/docs/TIBCOEndofSupportInformation-Integration.pdf

Azure Logic Apps

Interestingly, I cannot find any publicly stated position on the support lifecycle for Logic Apps.

Conclusion

The simple fact is that Microsoft have a publicly stated duration of support for the current version of BizTalk which is 5 years longer than any other vendor I am able to find information on!  That means if “BizTalk is dead”, based on info in the public domain all of the other vendors are going to be dead well before it.

With that said, it is important to consider that iPaaS products may mean we need to think about the lifecycle in a different way, because the idea of upgrading is not really the same thing. With that in mind, we do need to remember that MABS from Microsoft was an iPaaS product which some customers bet on and which was then deprecated. With the volatility in the iPaaS market and the expected consolidation of vendors, with things such as the recent acquisition of Mulesoft by Salesforce, it may be worth considering whether the absence of a communicated lifecycle on iPaaS is a good thing for customers.

I would draw the conclusion that at this stage the recent discussions around BizTalk product lifecycle are really just a case of under par communications and marketing on the Microsoft side which allow the myth to be created that BizTalk product lifecycle is a problem.  If you look at the hard facts actually it has by far the strongest story I can find.

If we compare the 2 biggest competitors, with BizTalk we are 4 years from the end of support of the current version; that's 2 years more than a brand new release of Mule gets, so maybe things aren't so bad after all.

When data in CRM is updated I want to send it to another application

Having worked a lot with Dynamics CRM/365 over the last few years I thought it would be interesting to discuss a common use case and some of the architecture patterns you may consider to implement the solution.

Lets imagine a scenario where the business requirement is as follows:

  • The user will be updating a customer's record in Dynamics 365
  • When the user saves the change we need the change to be synchronised with the billing system

Now at this point I am going to deliberately avoid fleshing out these requirements too much. Any experienced integration person will now be thinking of a number of functional and non-functional questions they would want more information about, but the above is the typical first requirement. We will use this vagueness to allow us to explore some of the considerations when we look at the options available to solve the problem. One thing to note is that I am going to treat this as a one-way interface for this discussion.

Option 1 – CRM Custom Plugin – Synchronous

In option 1 the CRM developer would use the extensibility features of Dynamics. This allows you to write C# code which will execute within the CRM runtime environment as a plugin. With a plugin you can configure when the code will execute. Options include things like:

  • When an entity is updated but before the save is made
  • When the entity is updated but after the save is made
  • As above but on other events such as create/delete

The below picture shows what this scenario will look like.

Good things:

  • This is probably the quickest way you can get the data from the commit in CRM to the other application
  • This is probably the simplest way you can do this integration with the minimum number of network hops
  • This solution probably only needs the skill set of the CRM developer

Things to consider:

  • You would be very tightly coupling the two applications
  • You would have some potential challenges around error scenarios
    • What happens if the save to the other app works but the save to CRM doesn't, or vice versa
  • The custom plugin is probably going to block the CRM user's thread while it makes the external call, which is asking for performance issues
  • You would need to consider if you would do the call to the other application before or after saving the data to CRM
  • You would need to consider where to store the configuration for the plugin
  • There would be error and retry scenarios to consider
  • There would be the typical considerations of tightly coupled apps
    • What if the other app is broken
    • What if it has a service window
  • Errors are likely to bubble up to the end user
  • You will have OOTB (out of the box) CRM plugin tracing diagnostics but this may require some custom code to ensure it logs appropriate diagnostic information

Option 1.5 – CRM Custom Plugin – Asynchronous

In this option the solution is very similar to the above solution with the exception that the developer has chosen to take advantage of the asynchronous system jobs feature in CRM. The plugin that was developed is probably the same code but this time the configuration of the plugin in CRM has indicated that the plugin should be executed out of process from the transaction where the user is saving a change. This means that the commit of the change will trigger a system job which will be added to the processing queue and it will execute the plugin which will send data to the other application.

The below picture illustrates this option.

Good things:

  • The synchronize transaction will no longer block the user's thread when they save data
  • Compared to option 1, the system jobs feature gives a degree of troubleshooting and retry options if the other system is down
  • This only requires CRM developer skills

Things to consider:

  • There may be other things on the processing queue so there is no guarantee how long it will take to synchronize
  • You may get race conditions if another transaction updates the entity and you haven’t appropriately covered these scenarios in your design
    • Also think about the concurrency of system jobs and other plugins
  • I have seen a few times where option 1 is implemented and then flipped to this asynchronous option as a workaround due to performance concerns
    • This needs to be thought about upfront
  • You may struggle to control the load on the downstream system
  • Again there is a tight coupling of systems. CRM has explicit knowledge of the other application and a heavy dependency on it
    • What if the app is down
    • What if there are service windows
  • Error scenarios are highly likely and there could be lots of failed jobs

Option 2 – CRM out of the Box Publishing to Azure Service Bus

Option 1 and 1.5 are common ways a CRM developer will attempt to solve the problem. Typically they have a CRM toolset and they try to use a tool from that toolset to solve the problem as bringing in other things was traditionally a big deal.

With the wide adoption of Azure we are starting to see a major shift in this space. Now many Dynamics projects are also including Azure by default in their toolset. This means CRM developers are also gaining experience with tooling on Azure and have a wider set of options available. This allows a shift in the mindset that not everything has to be solved in CRM and actually doing stuff outside of CRM offers many more opportunities to build better solutions while at the same time keeping the CRM implementation pure and focused on its core aim.

In this solution the CRM developer has chosen to add an Azure Service Bus instance to the solution. This means they can use the OOTB plugin (not a custom one) in CRM, which will publish messages from CRM to a queue or topic when an entity changes. From here the architecture can use some other tools to get messages from Service Bus to the destination application. For simplicity in this case I may choose an Azure Function, which would allow me to write a simple bit of C# to do the job.

The below solution illustrates this:

Good things:

  • No custom coding in CRM
  • The Service Bus plugin will be much more reliable than the custom one
  • The Service Bus plugin will get a lot of messages out to Service Bus very fast by comparison to the custom plugin in 1.5, which will probably bottleneck on the downstream system
  • Service Bus supports pub/sub so you can plug in routing of messages to other systems
  • The Azure Function could be developed by the CRM developer quite easily with a basic C# skillset
  • Service Bus offers lots of retry capabilities
  • The queue offers a buffer between the applications so there is no dependency between them
  • The function could be paused in downtime so that CRM can keep pumping out changes and they will be loaded when the other app is back online
  • The solution will be pretty cheap, you will pay a small cost for the service bus instance and per execution for the function. Unless you have very high load this should be a cheap option

Things to consider:

  • The key thing to remember here is that the solution is near realtime. It is not an instant synch. In most cases it is likely the sync will happen very quickly but the CRM System Jobs could be one bottleneck if you have lots of changes or jobs in CRM. Also the capability of the downstream system may be a bottleneck so you may need to consider how fast you want to load changes
  • The only bad thing is that there are quite a few moving parts in this solution, so you may want to ensure you are using appropriate management and monitoring for the solution. In addition to CRM System Jobs you may want to consider Service Bus 360 to manage and monitor your queues, and Application Insights for your Azure Functions

Option 3 – Logic App Integration

In option 3 the developer has chosen to use a Logic App to detect changes in CRM and to push them over to the other application. This means that the CRM solution is very vanilla, it doesn’t even really know that changes are going elsewhere. In the above options a change in CRM triggered a process to push the data elsewhere. In this option the Logic App is outside CRM and is periodically checking for changes and pulling them out.

Typically the Logic App will check every 3 minutes (this is configurable) and it will pull out a collection of changes and then 1 instance of the logic app will be triggered for each change detected.

The logic app will then use an appropriate connector to pass the message to the downstream application.

The below picture shows what this looks like.

Good things:

  • There is nothing to do in CRM
  • The Logic App will need monitoring and managing separately from CRM
  • The Logic App is not part of the CRM developer's core skill set, but Logic Apps are very simple to use so it should be easy to pick up
  • The Logic App has a lot of features if you run into more advanced scenarios
  • The Logic App has connectors for lots of applications
  • You may be able to develop the solution with no custom code
  • The Logic App has some excellent diagnostics features to help you develop and manage the solution
  • The Logic App has retry and resubmit capabilities
  • The solution will be pretty cheap with no upfront capital cost. You just pay per execution. Unless you have very high load this should be a cheap option
  • This option can also be combined with Service Bus and BizTalk Server for very advanced integration scenarios

Things to consider:

  • Is the polling interval going to be frequent enough
  • Only the most recent change will be extracted; if a particular row has been updated 3 times since the last trigger you will only get the latest state
  • It may require some more advanced patterns to control the load if the downstream system is a bottleneck. This may be beyond the CRM developer's Logic App skills

Option 4 – SSIS Integration

The next option to consider is an ETL-based approach using SSIS. This approach is quite common for CRM projects because they often have people with SQL skills. The solution would involve setting up an SSIS capability and then purchasing the 3rd-party Kingswaysoft SSIS connectors, which include support for Dynamics.

The solution would then pull data out of CRM via the API using a FetchXML query or an OData query. It would then push the changes to the destination system. Often SSIS would be integrating at database level, which is its sweet spot, but it does have the capability to call HTTP endpoints and APIs.

Although the diagrams look similar, the big difference between the Logic App approach and SSIS is that SSIS is treating the records as a batch of data which it is attempting to process in bulk. The Logic App is attempting to execute a separate transaction for each row it pulls out from the CRM changes. Each solution has its own way of dealing with errors which makes this comparison slightly more complex, but typically think of the idea of a batch of changes vs individual changes.

In the SSIS solution it is also very common for the solution to include a staging database between the systems where the developer will attempt to create some separation of concern and create deltas to minimize the size of the data being sent to downstream systems.

Good things:

  • You can process a lot of data very quickly
  • Common approach on CRM projects
  • Kingswaysoft product is mature
  • Predominantly configuration based solution

Things to consider:

  • Capital cost for 3rd party software and probably maintenance too
  • Need to consider where to host SSIS (Azure VM or On Premise VM) – Cost associated with this
  • Possible license cost for SQL depending on organisation setup
  • You will sync on a schedule, so how often does it need to be
    • The more frequent, the less data each time
    • It can't be too frequent
  • How will you monitor and schedule the SSIS package
  • Error scenarios can sometimes be complex

There is no right or wrong answer based on the original 2-line requirement we got, but you can see each solution has a lot to think about.

This emphasises the importance of asking questions, elaborating on the requirements and working out the capabilities of the applications you will integrate with before choosing which option to take. As a general rule I would recommend not jumping too quickly to option 1 or 1.5. As an integration guy I usually frown upon these kinds of options because of the way they couple applications and create long-term problems, even though they might work initially. I think the other 3 options (2-4) will be relatively easy to choose between depending on the requirements elaboration, but I would only choose option 1 or 1.5 in niche cases, and only with full buy-in from your architecture team and a justifiable reason that has been documented well enough to explain later when someone comes along and asks WTF?

One other factor to consider which we didn't touch on much above: I have assumed you have an open toolset on today's typical CRM and Azure project. It may also be the case that your project has some constraints which may influence your decision to choose one option over another. I hope in these cases the above considerations will help you to validate the choice you make, or give you some ammunition if you feel you should challenge the constraint and consider another option.

Can the user modify the Integration solution – Discuss

Recently I had a chat with a few people at a company about integration solutions, and a question came up which I haven't been asked in a while: “Will the business users be able to modify the solution after it's live?” Back in the day you used to get asked this quite often, but I haven't heard it recently. I do know that the technology landscape has changed a lot since I was last asked, and after a very interesting discussion I thought I'd share my thinking on this space.

Why would the user want to modify the solution

There are a few different things that come up as the reason for this desire. In this discussion with a customer it's a good idea to find out their drivers, as they may not always be the same. Some of the common ones include:

  • IT guys are expensive so we don’t want to constantly need to pay a premium for changes
  • IT take too long to make the changes
  • A change may seem very simple yet IT seem to turn it into a project
  • We need to react to customer’s needs and changes quickly

What are the ways in which the user may modify the solution

Different solutions will have different things that can be changed or tuned but if we think of common solutions we see these are some of the areas we can change things:

  • Config settings in the solution which are used to drive decision logic, e.g. a threshold for an order which is considered large. Depending upon where the config settings are stored they may be able to be modified by the business user
  • Business Rules may be editable by a business user
  • Flow logic, the solution may have a workflow which could be edited by the user
  • On boarding new customers/partners is a part of the integration solution the business user may wish to take control of
  • Data mapping and transformation rules

I'm sure there are plenty of other areas, but the above are just a few off the top of my head.

Should the user be allowed to modify the solution

Assuming the reason the business users want to take control of modifying the solution, or parts of it, is valid, then a decision over whether this should be allowed is likely to come down to the desire of the product owner or sponsor and the governance around the solution.

What we need to consider is that if we elevate the business users' permissions to control parts of the integration solution, does this mean we trade off other parts of the solution? In this area we need to consider the aspects the business user will often have low regard for, such as performance, testing, application lifecycle management, security and risk. If, for example, we grant permissions for the business user to modify the workflow of an integration process, then the chances are they will reactively do this in response to a change required by their customer. They may not think to communicate the planned change, they may not test it properly, and they may think their change has worked while not realising that they have now broken 3 things downstream which were affected by this change.

The one thing you can guarantee is that the business user will very rarely have an understanding of dependencies and consequences within the system. I say this quite boldly because often many IT people do not have this understanding either. When it comes to making changes, even if you have a very good IT architecture, treating the change like pulling out a piece of a Jenga puzzle is a good approach. First you need to work out if this is a safe piece to be messing around with or not. Maybe it's pretty safe and you can let your business user get on with it; at other times you need to be very cautious and thorough.

Having a view of the changes the business would like to make, overlaid with some kind of heat map of your architecture, will tell you the safe Jenga pieces for the business to take control of and the areas of risk which need to stay with IT.

When you then let the business make these changes themselves, you still need to implement a change management process for them to follow. You don’t want to be in a position where the business user made a change which wasn’t reflected back into the system source configuration so that next time there is a major release the change is regressed. Teams still need to work together.

Types of solution

Once you have identified areas and rules around the business user making changes in isolation, you have laid out your rules of engagement as you begin democratizing integration in certain areas.

I would expect that you will find solutions fall into certain types which are more or less acceptable for the business to change.

Citizen Integrator

Citizen Integrator solutions are those which were either built by the business, or built by IT but handed over to be looked after day to day by business users. These may be solutions for individuals or solutions for teams.

In this space you will find super users in the business, such as Excel gurus, begin to thrive. They can build solutions for their team and themselves and really innovate.

One of the challenges from an IT perspective is when the Citizen Integrator builds a solution that becomes too important, or one that should be governed by regulatory rules which the business function may not be aware of but which IT has dedicated functions to support.

How do you as an IT function stop Bob, the junior HR intern, from building the mission-critical staff holiday system replacement project without anyone in IT being aware?

I'd expect the business user to be able to make the following changes in this type of solution:

Lightweight Low Risk Integration

Lightweight integration projects may be good candidates for business users to “modify”. I would consider lightweight to mean not a lot of load, not very complex and fairly well isolated. In this scenario you may choose to allow a business super user to make some changes in certain areas, but it is likely some areas will require a more advanced skill set.

I'd expect the business user to be able to make the following changes in this type of solution:

Mission Critical & High Risk

In a mission-critical integration solution I would expect there to be a pretty thorough change management process controlling changes to the system. In these cases the consequences of breaking something outweigh the benefits of quick changes. I would expect most changes to involve a degree of impact analysis followed by a controlled change process: making the change, deploying it in a repeatable way, testing it and making it live.

The intention of the process overhead is to remove the risk of things going wrong, and sometimes to remove the risk of politics in the organisation if something that affected the core business was broken.

I would expect in this case the attitude to the business super user making changes would be:

Those examples above are just some of the ones you might come across. It really depends on the organisation and its attitude to risk and change. One example you might find is a mission-critical system which has certain parts that are very safe for business users to modify. An example might be an architecture where Dynamics CRM is used to provide a lot of the settings and configuration for a customer, and integration processes then use these settings as required. This gives the user a place where they can safely modify some parts of the system which are used by others. I think the key point here is that it comes back to a heat map of your architecture, so you know the safe Jenga pieces and the unsafe ones.

Technology Comparison

Up until this point I have tried to think about things agnostic of technology, but if we also bring in this angle it gets even more interesting. Back a number of years ago when we had a small set of tools available there were only a few limited choices in this space, but now we have many more choices.

Flow & Power Apps

If we know we have requirements for the business user to take an active part in the solution and its maintenance, then Flow and Power Apps give us an ecosystem which is built for these users. In a solution we can incorporate these tools to provide a safe(ish) area for the business user to do their bits. We can even give the business user control of the entire solution if it's appropriate.

Flow and Power Apps should be a part of our integration architecture anyway and give us a big tick in the box of empowering the business user to be an active stakeholder in an integration solution.

Logic Apps

Logic Apps are a really interesting one. They have the features of a high-power, mission-critical integration tool, but the ability to sandbox Logic Apps in resource groups means it is possible to use some Logic Apps for those IT-only use cases and have other Logic Apps where the business user could be granted access to Azure and the resource group to manage and modify things if appropriate.

BizTalk

BizTalk is one of the tools where there are not that many choices for the business user. It is unlikely we would want the business user to make changes to anything which is not then handed over to IT for deployment. That said in an agile environment a BizTalk developer and business subject matter expert working closely on a solution can be a very good way to work.

Rules, Cross Reference Data, port settings and Configuration settings are the most likely candidates for a desire to change but I think the risks of doing this without an ALM process would outweigh any benefits.

One point to note with BizTalk is that BizTalk 360 provides a number of features which can allow a business user to manage their integration solution. While BizTalk might be one of the less friendly tools to allow a business super user to make changes, BizTalk 360 can allow the person to manage their messages and process instances if they want. This can be done in a safe way.

SSIS & Data Factory

Azure Data Factory and SSIS are like BizTalk in that it would be difficult for a business user to do anything with them. They are fairly closed environments and require a significant skill set. I'd see these as IT-only tools.

Service Bus

Service Bus is an interesting one. You could imagine a scenario where the business user might request that all new customer messages are now also sent to a new application they are buying. Conceptually it's not too much of a leap to see an advanced user setting up a new subscription from the new-customer topic to the new application's worker queue. In the real world, however, I can imagine that most of these changes would require some additional work around the applications either sending or receiving messages, so I think the Service Bus change would be unlikely to be done in isolation.

I think a business user may also struggle to understand message flows without an upfront visualization tool.

With this in mind I suspect Service Bus would be unlikely to be a good fit for requirements for business users to modify.

Event Grid

My first thought is that Event Grid would fall into the same space as Service Bus messaging, but maybe in the future, as the service matures, the fact that the data is events rather than messages may mean there are certain scenarios the business may dynamically change their interest in. I wonder about scenarios like university clearing time, where perhaps a super user gets a really cool idea for improving the likelihood of a new student signing up and would like to explore events for just 2 or 3 conditions for a couple of days. Self-service subscription to events in this way could be a really powerful form of experimentation within the business.

Summary

I think the answer to the initial question has changed a lot over the last few years. It's great to be in the position where we have loads of technical options, and rather than being limited in how we could include those requirements in a solution, it's now a case of making sure we don't go too over the top and remember that there is still an important place for governance and best practices in addition to agility and flexibility.

Exciting times really

Documenting a Logic Apps solution

One of the biggest challenges in integration projects over the years is how to manage the relationship between the implementation of your solution and the documentation describing the intention of the solution, how it works and how to look after it.

We have been through paradigm shifts, from projects writing extensive documentation before a single line of code was written through to more agile approaches with leaner documentation done in a just-in-time fashion. Regardless of these approaches, a few common themes still exist:

  • How do we relate the documentation to the implementation?
  • How do we keep the documentation up to date?
  • How do we make it transparent and accessible?

Ever since the guys at Mexia in Brisbane introduced me to Confluence, I have been a huge supporter of using it for integration. I have found the shift from people working over email and in Word documents to truly collaborating in real time in Confluence to be one of the biggest factors for success in projects I have worked on. The challenge still remained: how to create the relationship between the documentation and the implementation.

Fortunately with Azure we have the ability to put custom tags on most resources. This gives us a very easy way to start adding links to documentation to resources in Azure.

In the below article I'll show how we use this with a Logic Apps solution.

The Resource Group + Solution Level Documentation

In this particular case most of our solution is encapsulated within the resource group. This works great as we can have a document describing the solution. Below is the Confluence page we have outlining the solution blueprint and how it works.

This solution documentation can then be added to the tags of the resource group as shown in the picture below.
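The same tag can also be applied from a script rather than through the portal. Below is a small sketch using the Az PowerShell module; the resource group name and Confluence URL are placeholders for whatever your solution uses.

# Assumes the Az module and an authenticated session (Connect-AzAccount)
$docLink = @{ documentation = 'https://<your-confluence-site>/display/INT/Solution+Blueprint' }

# Set-AzResourceGroup -Tag replaces the existing tags, so merge with anything already there
$rg   = Get-AzResourceGroup -Name '<resource group name>'
$tags = if ($rg.Tags) { $rg.Tags + $docLink } else { $docLink }
Set-AzResourceGroup -Name $rg.ResourceGroupName -Tag $tags

# The same idea works for individual resources such as a Logic App or an API connection,
# for example via Set-AzResource -ResourceId <id> -Tag <hashtable> -Force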

The Logic App + Interface Specific Documentation

Next up we have the interface specification from our interface catalogue. Below is an example page from Confluence to show a small sample of what one of these pages may look like. The specification will typically include things like analysis, mappings and data formats, component diagrams, process diagrams, etc.

In the Logic App we are able to use tags against the Logic App to create a relationship between our documentation and the implementation. In the below example I have added a link to the solution blueprint the Logic App implements and also some specific documentation for that interface.

API Connector and Application specific connection documentation

Next up we have application connector and API documentation. In the resource group we have a set of API connectors we have created. In this case we have the SQL one for connecting to our SQL Azure database as shown below.

In its tags we can now add a link to a page in Confluence where we will document specifics related to this connection.

In this particular case one of the interesting things is how we use the MERGE command in SQL to upsert a record to the database. We have documented how this works in Confluence and we can easily link to it from the SQL connector. Below is an example of this documentation.
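For illustration, the MERGE-based upsert described on that page looks roughly like the sketch below; the table and column names are made up, and the statement would normally live in a stored procedure called by the Logic App.

# Illustrative MERGE upsert held in a here-string (hypothetical Customer table)
$mergeSql = @'
MERGE dbo.Customer AS target
USING (SELECT @CustomerId AS CustomerId, @Name AS Name) AS source
    ON target.CustomerId = source.CustomerId
WHEN MATCHED THEN
    UPDATE SET target.Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (CustomerId, Name) VALUES (source.CustomerId, source.Name);
'@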

Summary

In summary, you can see that tags enable us to close the gap between implementation and documentation/guidance.  This is a feature we should be able to use a lot!