Introducing Serverless360 Resource Map for Azure

At Integrate 2020 we announced the release of Resource Map, a new feature in Serverless360. The aim of the feature is to help you organise your cloud estate and keep it structured within a logical model, demystifying the complexity of viewing your estate through the physical deployment model the Azure Portal gives you. Resource Map allows you to group resources into logical scopes which make sense to a non-Azure expert, and it helps your team keep on top of maintaining a clean and well-organised environment.

The below picture shows how you get a tree representation of your estate in which resources can be assigned to a scope. When a scope is tidy it shows up as green, and when clean-up is required it shows as red or yellow, indicating that resources aren't organised.

Once your resources are mapped to a scope, you can then indicate which resources are the dev/test/production versions of each other, covering all of your environments, so you can view a cross-tab showing which resources belong to which environment.

You will also be able to automate the allocation of resources in your map and perform cost analysis, as shown below:

I have also added a couple of videos which extend my demos from Integrate 2020:

Intro to Resource Map

Setting up Resource Map Manually

More info is in the post on the Serverless360 blog below.

Resource Map for Azure in Serverless360


Pragmatic Approach to Configuring Logic App Parameters

Recently I added an article to the Integration Playbook talking about how we handle configuration settings for Logic Apps on the local dev box and in DevOps pipelines using App Configuration, Key Vault and pipeline variables. There are a few videos walking through the approach.

https://www.integration-playbook.io/docs/using-bam-with-azure-integration-services
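
As a very rough sketch of one small piece of that kind of approach (the vault name, secret name and variable name below are made-up placeholders, not the ones from the article), a pipeline script step can pull a secret from Key Vault and surface it as a secret pipeline variable, which can then be fed into the Logic App's ARM template parameters:

# Hypothetical pipeline step: fetch a secret from Key Vault with the Azure CLI and expose it
# as a secret pipeline variable for later deployment steps (e.g. ARM template parameters)
$secret = az keyvault secret show --vault-name 'my-integration-kv' --name 'ServiceBusConnectionString' --query 'value' -o tsv

# Azure DevOps logging command: makes the value available to later tasks as $(ServiceBusConnectionString)
Write-Host "##vso[task.setvariable variable=ServiceBusConnectionString;issecret=true]$secret"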


Service Bus + Terraform

I have worked with Azure Service Bus for years and one of the biggest challenges was always how to manage change on the Queues and Topics within a Service Bus Namespace. You could use a tool like Service Bus Explorer to make the changes by hand, but this approach is manual and error-prone. You can do exports and imports of configuration, but a bit like managing SQL script changes it's tough to work out the delta and deploy just what's needed to your various environments. If only there was a way to do this just like you do for any other Azure component with CI and CD pipelines.

Fortunately there is, and you can read more about it in my article posted on the Serverless360 blog – https://www.serverless360.com/blog/alm-with-azure-service-bus


Persistence and Recoverability on the Microsoft Platform

Recently I added an article to the Integration Playbook which compares the different approaches used for durability, persistence and retry across the various Microsoft technologies such as BizTalk, Logic Apps, Event Grid, Event Hubs, Service Bus Messaging and Functions.

You can read more here – https://www.integration-playbook.io/docs/durable-messaging-on-the-microsoft-platform

Message Correlation on Microsoft Integration Platform

I've recently added an article to the Integration Playbook talking about how messages can be correlated between processing instances. In particular it compares the approach in BizTalk against the approach you can use by combining Logic Apps and Service Bus.

More info – https://www.integration-playbook.io/docs/message-correlation

Azure AD Set Passwords to Not Expire

This blog post is as much a reminder for myself as anything. I had a need to mark some service accounts in Azure AD so that their passwords don't expire.

The aim was that we had a few service accounts used in a couple of places and we wanted to have a controlled process to change their passwords.

To do this we did the following:

  • Create a group to associate all of the service accounts for our project, for easy management
  • Add all of the service accounts to that group
  • Run a script which will check every member of the group and change the password policy so the password doesn't expire

I had a look online and couldn't really find a resource showing how to do this which didn't use the old Office 365 MSOnline (MSOL) PowerShell module, so I thought I'd share this for anyone else who might find it useful.

Below is the script I used; I usually run it each time we add a new service account for which we want more granular control over password changes.

# Allow the script to run and make sure the AzureAD module is available
Set-ExecutionPolicy -ExecutionPolicy Unrestricted

Install-Module AzureAD
Get-Module AzureAD


function ProcessUsers([string] $groupName)
{
    Write-Host 'Processing Users Function started'

    # Find the group which holds our service accounts
    $ServiceAccountsGroup = Get-AzureADGroup -SearchString $groupName -All $true
    Write-Host 'Group Found' $ServiceAccountsGroup.DisplayName
    Write-Host 'Group Found' $ServiceAccountsGroup.ObjectId

    # Get every member of the group
    $groupMembers = Get-AzureADGroupMember -ObjectId $ServiceAccountsGroup.ObjectId -All $true

    Foreach ($member in $groupMembers)
    {
        Write-Host $member.DisplayName

        $user = Get-AzureADUser -ObjectId $member.ObjectId

        # Show the current policy, then disable password expiration for the account
        Write-Host 'Pre-update Password Policy: ' $user.PasswordPolicies
        Set-AzureADUser -ObjectId $user.ObjectId -PasswordPolicies DisablePasswordExpiration

        # Re-read the user to confirm the change was applied
        $user = Get-AzureADUser -ObjectId $member.ObjectId
        Write-Host 'Post-update Password Policy: ' $user.PasswordPolicies
        Write-Host 'AccountEnabled: ' $user.AccountEnabled

        Write-Host ''
        Write-Host ''
    }

    Write-Host 'Processing Users Function Ended'
}


# Connect to Azure AD and process every service account in the group
$cred = Get-Credential
Connect-AzureAD -Credential $cred
ProcessUsers -groupName '<Group name goes here>'
Write-Host 'All Done'

Inserting lots of rows into SQL with a Logic App and a Stored Procedure

When you are working with APIs and Logic Apps and there are lots of rows of data involved, you will sometimes come up against the following problems:

  1. An API often pages the data once you go beyond a certain number of records
  2. When you want to insert lots of rows with a Logic App into SQL you will usually have a loop which iterates over a dataset and does inserts
    1. This takes a long time to execute
    2. There is a cost implication to your implementation when you pay for each action

I recently had a scenario in this space and used quite a cool approach to solve the problem which I wanted to share.

Scenario

The scenario I had started in Shopify. When I add products and collections to my online store in Shopify, I want a daily extract from Shopify to synchronise these new products/collections to my Azure SQL database, which I use for reporting with Power BI.

To achieve this I would have a Logic App with a nightly trigger which would take the following actions:

  • Clear down the table which records which product is in which collection
  • Extract all products in collections via the Shopify API
  • Insert them all into the SQL table

The end result is I have a table which has all of the products in each collection listed for my analysis.

At a high level the scenario looks like the below diagram:

Implementation

As I mentioned above, the problem is two-fold here. When we query Shopify there may be thousands of products, so we need to use a paging approach to query their API; secondly, I want to insert into SQL in batches to minimise the number of action calls to SQL, improving performance and reducing cost.

Let's look at how I did this.

Paging API calls to Shopify

When it comes to the Shopify API, you are able to execute a GET operation against the collection and it will return the products within it. If you have lots of products you can get them in pages. I chose to get 250 at a time, and you need to pass a page index to the API as a query parameter. The below picture shows what a call to Shopify looks like with the paging parameters set.

Once I can make this single call, I can then wrap a loop around the call to Shopify, but before I do this I need to know how many pages there are. I can do this by executing a GET against the collections API with the count extension on the URL. This returns the number of products in the collection. You can see this below.

From the response I can parse the count and then set a variable for the number of pages, which I work out by dividing the number of products by the number of products I will get per page. I also add 1 to this so I get 1 more page than the division suggests, in case it is not a whole number. The calculation is shown below.

add(div(body('Parse_JSON_-_Count')?['count'], 250),1)

Now I know the number of pages, I can implement the loop, incrementing the page index each time until we have processed all of the pages. Within the loop we get the next page of data from the API, as shown in the picture below.
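
To make the paging flow concrete, below is a rough PowerShell sketch of the equivalent calls. The store URL, collection id and endpoint paths are placeholder assumptions based on the older page-based Shopify REST API (authentication headers are omitted), and in the real solution these steps are Logic App actions rather than a script.

# Placeholder values - in the Logic App these come from configuration
$store        = 'https://mystore.myshopify.com'
$collectionId = '123456789'
$pageSize     = 250

# 1) Ask Shopify how many products are in the collection (the count extension on the URL)
$countResponse = Invoke-RestMethod -Method Get -Uri "$store/admin/products/count.json?collection_id=$collectionId"

# 2) Work out the number of pages: products divided by page size, plus 1 in case of a remainder
$pages = [math]::Floor($countResponse.count / $pageSize) + 1

# 3) Loop over the pages, incrementing the page index and pulling 250 products at a time
for ($page = 1; $page -le $pages; $page++)
{
    $products = Invoke-RestMethod -Method Get -Uri "$store/admin/products.json?collection_id=$collectionId&limit=$pageSize&page=$page"

    # Each page of products is then handed to the stored procedure described in the next section
    Write-Host "Page $page returned" $products.products.Count 'products'
}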

SQL Json Insert

It would be possible to just call the insert action for SQL in the Logic App, but if there are, say, 10,000 products then the loop will do 10,000 iterations, which takes quite a while to run and also carries a cost. I wanted to look at options for inserting the data in batches. If I could insert the entire page returned from the API as a batch, then with my 250 records at a time I could reduce the 10,000 iterations down to 40. That should be a lot less time and a much lower cost.

To do this I developed a stored procedure to which I passed the entire JSON string from the API response as an NVARCHAR(MAX) parameter. I was fortunate that the format of the JSON in this case was very table/row-like, making it easy to do this insert. I used SQL's OPENJSON feature and was able to insert the entire page of data from the API in a simple insert statement, as you can see in the SQL below.
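
As a rough sketch of the idea (the table and column names here are hypothetical, not the ones from my database), the shape of such a stored procedure is something like the following, deployed here with Invoke-Sqlcmd:

# A minimal sketch of the kind of stored procedure described above. The table and
# column names are hypothetical - the point is that the whole page of JSON arrives
# as one NVARCHAR(MAX) parameter and OPENJSON shreds it into rows for one INSERT.
$createProc = @'
CREATE OR ALTER PROCEDURE dbo.InsertCollectionProducts
    @CollectionId BIGINT,
    @ProductsJson NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.CollectionProduct (CollectionId, ProductId, Title, Handle)
    SELECT @CollectionId, p.id, p.title, p.handle
    FROM OPENJSON(@ProductsJson, '$.products')
         WITH (
             id     BIGINT        '$.id',
             title  NVARCHAR(255) '$.title',
             handle NVARCHAR(255) '$.handle'
         ) AS p;
END
'@

# Deploy the procedure to the reporting database (server and database names are placeholders; authentication parameters omitted)
Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' -Database 'ShopReporting' -Query $createProc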

Summary

Once it was all put together I was able to run my Logic App to refresh my SQL database each night and the process took 10 seconds to copy across 2500 records. This took 10 iterations of the loop.

That's a Logic App which is easy to support and run, and it does a nice job in this case.

Accelerating Business Opportunities with Power Apps and Integration

Recently I have been looking at some opportunities to utilise the new model-driven capabilities in Power Apps. I spent some time at Integrate 2018 chatting to Kent Weare about some of its capabilities and realised it was a great fit for some of the architecture challenges we have. Before I go into some of the opportunities in a sample architecture, let's consider an existing setup.

Existing Architecture

In the existing architecture we have a cloud-hosted integration platform which the company uses to integrate partners into Dynamics CRM Online and some existing on-premise line-of-business applications. The cloud integration platform is able to support partners submitting data via multiple channels. In this case we have a traditional SFTP and batch-based mechanism which old-school partners still use. With this pattern we use BizTalk, where it excels, on the IaaS part of the platform to manage multiple partners submitting different file formats, all being converted to a canonical format; messages are then loaded into systems via helper functions on Azure which implement the service façade pattern.

You can see this in the diagram below represented by Partner B.

We also have partners who use more modern approaches to integration where we expose an API via Azure APIM which allows them to submit data which is saved to a queue. BizTalk will process the queue and reuse the existing functionality to load data into our core systems.

The Challenge

While we support 2 example channels in this architecture, we have a massive partner network with different capabilities, and some partners even use person-to-person and email-based interactions. Imagine a person in a call centre who is sent an email with some data, or a form in the post, and who then types the data into systems manually.

As the application architecture expanded there were more systems these users would need to work with, and we needed to find efficiencies to optimise data entry. The more records a user can enter in a day, the bigger the potential cost savings.

The challenge was to provide a new form for entering data that was simple and quick. We initially looked at options like Microsoft Forms and Cognito Forms which could allow us to create forms to capture data, but they missed ticking boxes on some of the key non-functional requirements such as security and authentication. We needed something with more features than these options, which were good but too simple.

Above we do have Dynamics CRM, but the key problem with that, like our other applications, is that it is tied to a product backlog, which means our changes and optimisations would need to fit within an agile release process that was delivering change in a complex system. What we really needed was a sandbox-type application where we could build a simple app without many dependencies, which would then integrate with our processes.

Proposed Architecture

Coming back to the discussion with Kent, I could see that model-driven Power Apps is really like a cut-down version of Dynamics, and looking at some of the sample apps and the apps people are building, you could see straight away that this could be a great opportunity. The Power Apps environment allowed us to build some forms and a data model very quickly to model the data we need users to capture.

We then implemented a Logic App which fires on the update of a record and checks for a field being set to indicate that the record is ready to be published. The Logic App extracts the data from the Power App. The really cool bit is that I can use the Dynamics connectors in Logic Apps because the Power App is really just a Dynamics instance. The Logic App puts a message on a queue, which lets us reuse our existing integration.

The below picture represents the architecture from the perspective of the new Power App. Please note that to keep the diagram simple I have omitted the existing B2B SFTP and API integrations so that we can focus on the Power Apps bit.

From this point I now have a pretty simple Power App which allows these users to input data manually into our process, and we think this can save a few minutes per record compared to keying the record in the old ways.

The benefits of Power Apps, though, go way beyond just this. First off, the key to empowering rapid change is that it's an isolated app focusing on just this use case. I don't have to worry about all of the many features within a bigger CRM implementation. When it comes to implementing changes and regression testing, things are much simpler.

At the same time the licensing is slightly different with Power Apps: our users are on P1 licenses, which aren't that expensive and are good for users who just run the Power App. We use P2 Power Apps licenses for those users who need to administer and develop the Power App.

We also get the integration with Azure AD for free, so our users have a good authentication story. This was one of the challenges with the options we previously considered. The products we looked at which provided out-of-the-box forms capability seemed to lack the ability to authenticate users, restrict access to just certain users, and then know who filled in which form. This is a key requirement.

When it comes to many of the other security scenarios, as existing Dynamics users we have already gone through the governance around what Dynamics is, how it works, its security, etc. The model-driven Power App seems to be just the same in terms of capabilities.

At one time we were considering building an ASP.NET app for our users, and when you consider everything PaaS on Azure offers for very little cost it would seem an attractive option. Compared to these new, more powerful Power Apps, though, you remove the considerations about hosting, security, custom coding, design experience, etc., and you get so much out of the box that it's a compelling argument to try the Power App.

At this point Power Apps seems to offer a great opportunity for us to build those utility applications and systems of engagement on an enterprise-ready platform but without lots of custom development. Focusing on delivering business value, there seem to be loads of places we could use this.

Hopefully we can provide more info about Power Apps as our journey progresses.

Discussions about BizTalk Support Product Lifecycle at Integrate 2018

At the recent Integrate 2018 summit the Q&A drew some contentious questions from the audience about the next version of BizTalk and when it is going to arrive. What was clear is that the product team's new approach of having a customer-feedback-driven backlog means they have been busy and successful in delivering changes to Logic Apps and the BizTalk feature pack, and having just completed those they have not yet planned the next major release of BizTalk.

Now, that being said, the team should have expected these questions because they always come up, and I think an answer of "we aren't ready to talk about that yet and we will get back to you" would have been fine; instead there was a bit of fluff around the answers given, which resulted in the audience drawing their own conclusions in a negative way. After such a great conference I found myself wishing the Q&A had never taken place, as this miscommunication at the end sent a lot of people away with a degree of confusion.

With that said, in the pub later we were talking about the idea of product support lifecycles, and I have always felt the problem around Microsoft tech is that there is too much info out there on the subject, which is actually detrimental to the intention. I decided to test this idea by looking at the support lifecycle for some other vendors. First off, let's recap Microsoft's position.

Microsoft BizTalk Server

Let's start with the link below, where community members have a nice, easy-to-follow interpretation of the Microsoft Support Lifecycle for BizTalk.

https://social.technet.microsoft.com/wiki/contents/articles/18709.biztalk-server-product-lifecycle.aspx

Version                  Release Date   End of Mainstream Support   End of Extended Support
BizTalk Server 2016      12/01/2016     01/11/2022                  01/11/2027
BizTalk Server 2013 R2   07/31/2014     07/10/2018                  07/11/2023
BizTalk Server 2013      06/12/2013     07/10/2018                  07/11/2023
BizTalk Server 2010      11/14/2010     01/12/2016                  01/12/2021
BizTalk Server 2009      06/21/2009     07/08/2014                  07/09/2019

You can see from the above table that there is still some kind of support available for 5 versions of BizTalk, covering up to 9 years from now. Even a 9-year-old version of BizTalk still has extended support for over 1 more year.

Now we have a picture of the Microsoft position, let's take a look at some of the other vendors out there.

MuleSoft

Below I have summarised some information from https://www.mulesoft.com/legal/versioning-back-support-policy

Version                     Release Date       End of Standard Support   End of Extended Support
4.1                         March 20, 2018     March 20, 2020 or later   March 20, 2022 or later
3.9                         October 9, 2017    October 9, 2019           October 9, 2021
3.8 (long-term supported)   May 16, 2016       November 16, 2018         November 16, 2021
3.7                         July 9, 2015       Nov 16, 2017              Nov 16, 2019
3.6                         Jan 15, 2015       Jan 15, 2017              N/A
3.5 (long-term supported)   May 20, 2014       July 15, 2016 *           July 15, 2019 *

Points to note:

  • MuleSoft provides Standard Support for the latest released minor version of the Mule runtime.
  • Once a new minor version for a major version is released, the previous minor version will receive Standard Support for an additional 18 months. All minor versions for a major version will receive Standard Support for a minimum of 2 years.
  • Starting with Mule 3.7 and later, after Standard Support ends, MuleSoft will offer Extended Support for an additional 2 years. Mule 3.5 and 3.8 will receive Extended Support for a total of 3 years.
  • Extended Support versions are only available on CloudHub for applications already deployed on it
  • Once a new major version is released, MuleSoft will continue to offer Standard Support for at least one minor version of the previous major version for a minimum of 3 years.
  • Once a minor version is outside the Standard Support and Extended Support windows, MuleSoft will provide End of Life Support.

My interpretation of the MuleSoft position compared to BizTalk is that the current version of Mule has committed standard support for 2 years less than the current version of BizTalk, and extended support for 5 years less.

Jitterbit

If we take a look at Jitterbit, their documentation states, "Jitterbit is committed to supporting a version for 12 months from the release date". So effectively each release is committed to support for 12 months only. It may be longer in reality, but if we look at the example of their last version to reach end of life, you can see below it was only supported for 1 year, which seems fairly consistent.

Jitterbit Harmony Local Agent 8.23 2017-05-05 2017-05-07 2018-05-07

You can find more info on the link below.

https://success.jitterbit.com/display/DOC/End-of-Life+Policy

My interpretation of the comparison of BizTalk vs Jitterbit is that Jitterbit are only committing to year-on-year support versus long-term commitments from Microsoft.

Oracle Fusion Middleware

From the below link I was able to find out some basic info about Oracle Fusion Middleware.

http://www.oracle.com/us/support/library/lsp-middleware-chart-069287.pdf

Version                          Release    Main Support   Extended Support
Fusion Middleware 12c (12.2.x)   Oct 2015   April 2016     Oct 2020

Talend

I could not find any specific information on the current versions or their support lifecycles; however, the below links provide some background info.

https://www.talend.com/legal-terms/us-support-policy/

http://talend.tips/release-history/

Dell Boomi

I could not find any information online about support lifecycle commitments or versions released. I did find the release notes for each release, which are available below and give an idea of how often change occurs.

http://help.boomi.com/atomsphere/GUID-0F0CDC3D-855B-411D-BB1F-65DC8042AB88.html

SnapLogic

I could not find any information on the support lifecycle policies for SnapLogic. There are release notes available at the below link:

https://docs-snaplogic.atlassian.net/wiki/spaces/SD/pages/49855/SnapLogic+Release+Notes

Tibco

I found the below Tibco document (dated March 2018) which indicates that the latest version of Tibco is under support until 30-Nov 2020.

http://supportinfo.tibco.com/docs/TIBCOEndofSupportInformation-Integration.pdf

Azure Logic Apps

Interestingly, I cannot find any public information about the support lifecycle position for Logic Apps.

Conclusion

The simple fact is that Microsoft have a publicly stated duration of support for the current version of BizTalk which is 5 years longer than any other vendor I am able to find information on! That means that if "BizTalk is dead", then based on info in the public domain all of the other vendors are going to be dead well before it.

With that said, it is important to consider that iPaaS products may mean we need to think about the lifecycle in a different way, because the idea of upgrading is not really the same thing. With that in mind, we do need to remember that MABS from Microsoft was an iPaaS product which some customers bet on and which was then deprecated. With the volatility in the iPaaS market and the expected consolidation of vendors, with things such as the recent acquisition of MuleSoft by Salesforce, it may be worth considering whether the absence of a communicated lifecycle for iPaaS is a good thing for customers.

I would draw the conclusion that at this stage the recent discussions around the BizTalk product lifecycle are really just a case of below-par communication and marketing on the Microsoft side, which allows the myth to be created that the BizTalk product lifecycle is a problem. If you look at the hard facts, it actually has by far the strongest story I can find.

If we compare the 2 biggest competitors: with BizTalk we are 4 years from the end of support of the current version; that's 2 years more than a brand new release of Mule gets, so maybe things aren't so bad after all.

When data in CRM is updated I want to send it to another application

Having worked a lot with Dynamics CRM/365 over the last few years I thought it would be interesting to discuss a common use case and some of the architecture patterns you may consider to implement the solution.

Let's imagine a scenario where the business requirement is as follows:

  • The user will be updating a customer's record in Dynamics 365
  • When the user saves the change we need the change to be synchronised with the billing system

Now at this point I am going to deliberately avoid fleshing out these requirements too much. Any experienced integration person will now be thinking of a number of functional and non-functional questions they would want more information about, but the above is the typical first requirement. We will use this vagueness to explore some of the considerations when we look at the options available to solve the problem. One thing to note is that I am going to consider this to be a one-way interface for this discussion.

Option 1 – CRM Custom Plugin – Synchronous

In option 1 the CRM developer would use the extensibility features of Dynamics. This allows you to write C# code which will execute within the CRM runtime environment as a plugin. With a plugin you can configure when the code will execute. Options include things like:

  • When an entity is updated but before the save is made
  • When the entity is updated but after the save is made
  • As above but on other commands such as created/deleted

The below picture shows what this scenario looks like.

Good things:

  • This is probably the quickest way you can get the data from the commit in CRM to the other application
  • This is probably the simplest way you can do this integration with the minimum number of network hops
  • This solution probably only needs the skill set of the CRM developer

Things to consider:

  • You would be very tightly coupling the two applications
  • You would have some potential challenges around error scenarios
    • What happens if the save to the other app works but the save to CRM doesn't, or vice versa
  • The custom plugin is probably going to block the CRM user's thread while it makes the external call, which is asking for performance issues
  • You would need to consider if you would do the call to the other application before or after saving the data to CRM
  • You would need to consider where to store the configuration for the plugin
  • There would be error and retry scenarios to consider
  • There would be the typical considerations of tightly coupled apps
    • What if the other app is broken
    • What if it has a service window
  • Errors are likely to bubble up to the end user
  • You will have OOTB (out of the box) CRM plugin tracing diagnostics but this may require some custom code to ensure it logs appropriate diagnostic information

Option 1.5 – CRM Custom Plugin – Asynchronous

In this option the solution is very similar to the one above, with the exception that the developer has chosen to take advantage of the asynchronous system jobs feature in CRM. The plugin that was developed is probably the same code, but this time its configuration in CRM indicates that it should be executed out of process from the transaction in which the user is saving a change. This means that the commit of the change triggers a system job which is added to the processing queue; the job executes the plugin, which sends the data to the other application.

The below picture illustrates this option.

Good things:

  • The synchronisation will no longer block the user's thread when they save data
  • The system jobs feature gives a degree of troubleshooting and retry options if the other system is down, compared to option 1
  • This only requires CRM developer skills

Things to consider:

  • There may be other things on the processing queue so there is no guarantee how long it will take to synchronize
  • You may get race conditions if another transaction updates the entity and you haven’t appropriately covered these scenarios in your design
    • Also think about the concurrency of system jobs and other plugins
  • I have seen a few times where option 1 is implemented then flipped to option 2 due to performance concerns as a workaround
    • This needs to be thought about upfront
  • You may struggle to control the load on the downstream system
  • Again there is a tight coupling of systems. CRM has explicit knowledge of the other application and a heavy dependency on it
    • What if the app is down
    • What if there are service windows
  • Error scenarios are highly likely and there could be lots of failed jobs

Option 2 – CRM out of the Box Publishing to Azure Service Bus

Options 1 and 1.5 are common ways a CRM developer will attempt to solve the problem. Typically they have a CRM toolset and they try to use a tool from that toolset, as bringing in other things was traditionally a big deal.

With the wide adoption of Azure we are starting to see a major shift in this space. Now many Dynamics projects are also including Azure by default in their toolset. This means CRM developers are also gaining experience with tooling on Azure and have a wider set of options available. This allows a shift in mindset: not everything has to be solved in CRM, and doing stuff outside of CRM offers many more opportunities to build better solutions while keeping the CRM implementation pure and focused on its core aim.

In this solution the CRM developer has chosen to add an Azure Service Bus namespace to the solution. This means they can use the OOTB plugin (not a custom one) in CRM, which will publish messages from CRM to a queue or topic when an entity changes. From here the architecture can use some other tools to get messages from Service Bus to the destination application. For simplicity, in this case I might choose an Azure Function which allows me to write a simple bit of C# to do the job.
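
As a rough sketch of that last hop (the post above suggests a simple bit of C#; this version uses a PowerShell-based Function purely for illustration, and the queue binding name and downstream URL are placeholder assumptions), the Function simply reads each Service Bus message and forwards it to the destination application's API:

# run.ps1 for a Service Bus queue-triggered Azure Function (PowerShell worker).
# The parameter name must match the binding name declared in function.json, and depending on
# the binding configuration the payload may arrive as a string or a parsed object - a raw
# JSON string body is assumed here.
param($QueueItem, $TriggerMetadata)

Write-Host 'Processing Service Bus message' $TriggerMetadata.MessageId

# Forward the entity change published by CRM to the downstream application;
# if this call throws, the Functions runtime lets Service Bus retry and eventually dead-letter the message
Invoke-RestMethod -Method Post `
                  -Uri 'https://billing.example.com/api/customers' `
                  -ContentType 'application/json' `
                  -Body $QueueItem `
                  -ErrorAction Stop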

The below solution illustrates this:

Good things:

  • No custom coding in CRM
  • The Service Bus plugin will be much more reliable than the custom one
  • The Service Bus plugin will get a lot of messages out to Service Bus very fast by comparison to the custom plugin in 1.5, which will probably bottleneck on the downstream system
  • Service Bus supports pub/sub so you can plug in routing of messages to other systems
  • The Azure Function could be developed by the CRM developer quite easily with a basic C# skillset
  • Service Bus offers lots of retry capabilities
  • The queue offers a buffer between the applications so there is no dependency between them
  • The function could be paused in downtime so that CRM can keep pumping out changes and they will be loaded when the other app is back online
  • The solution will be pretty cheap: you will pay a small cost for the Service Bus instance and per execution for the Function. Unless you have very high load this should be a cheap option

Things to consider:

  • The key thing to remember here is that the solution is near real-time. It is not an instant sync. In most cases it is likely the sync will happen very quickly, but the CRM System Jobs could be one bottleneck if you have lots of changes or jobs in CRM. Also the capability of the downstream system may be a bottleneck, so you may need to consider how fast you want to load changes
  • The only bad thing is that there are quite a few moving parts in this solution, so you may want to ensure you are using appropriate management and monitoring for the solution. In addition to CRM System Jobs, you may want to consider Serverless360 to manage and monitor your queues and also Application Insights for your Azure Functions

Option 3 – Logic App Integration

In option 3 the developer has chosen to use a Logic App to detect changes in CRM and to push them over to the other application. This means that the CRM solution is very vanilla; it doesn't even really know that changes are going elsewhere. In the above options a change in CRM triggered a process to push the data elsewhere. In this option the Logic App is outside CRM and is periodically checking for changes and pulling them out.

Typically the Logic App will check every 3 minutes (this is configurable); it will pull out a collection of changes, and then 1 instance of the Logic App will be triggered for each change detected.

The logic app will then use an appropriate connector to pass the message to the downstream application.

The below picture shows what this looks like.

Good things:

  • There is nothing to do in CRM
  • The Logic App will need monitoring and managing separately from CRM
  • The Logic App is not part of the CRM developer's core skill set, but Logic Apps are very simple to use so it should be easy to pick up
  • The Logic App has a lot of features if you run into more advanced scenarios
  • The Logic App has connectors for lots of applications
  • You may be able to develop the solution with no custom code
  • The Logic App has some excellent diagnostics features to help you develop and manage the solution
  • The Logic App has retry and resubmit capabilities
  • The solution will be pretty cheap with no upfront capital cost. You just pay per execution. Unless you have very high load this should be a cheap option
  • This option can also be combined with Service Bus and BizTalk Server for very advanced integration scenarios

Things to consider:

  • Is the polling interval going to be frequent enough
  • Only the most recent change will be extracted; if a particular row has been updated 3 times since the last trigger you will only get the latest state
  • It may require some more advanced patterns to control the load if the downstream system is a bottleneck. This may be beyond the CRM developer's Logic App skills

Option 4 – SSIS Integration

The next option to consider is an ETL-based approach using SSIS. This approach is quite common for CRM projects because they often have people with SQL skills. The solution would involve setting up an SSIS capability and then purchasing the 3rd-party KingswaySoft SSIS connectors, which include support for Dynamics.

The solution would then pull data out of CRM via the API using a FetchXML query or an OData query. It would then push the changes to the destination system. Often SSIS would be integrating at database level, which is its sweet spot, but it does have the capability to call HTTP endpoints and APIs.

Although the diagrams look similar, the big difference between the Logic App approach and SSIS is that SSIS is treating the records as a batch of data which it is attempting to process in bulk. The Logic App is attempting to execute a separate transaction for each row it pulls out from the CRM changes. Each solution has its own way of dealing with errors which makes this comparison slightly more complex, but typically think of the idea of a batch of changes vs individual changes.

In the SSIS solution it is also very common for the solution to include a staging database between the systems where the developer will attempt to create some separation of concern and create deltas to minimize the size of the data being sent to downstream systems.

Good things:

  • You can process a lot of data very quickly
  • Common approach on CRM projects
  • KingswaySoft product is mature
  • Predominantly configuration-based solution

Things to consider:

  • Sometimes error scenarios can be complex
  • Capital cost for 3rd-party software and probably maintenance too
  • Need to consider where to host SSIS (Azure VM or On Premise VM) – Cost associated with this
  • Possible license cost for SQL depending on organisation setup
  • You will sync on a schedule – how often does it need to be
    • The more frequent, the less data each time
    • Can't be too frequent
  • How will you monitor and schedule the SSIS package

There is no right or wrong answer based on the original 2-line requirement we got, but you can see each solution has a lot to think about.

This emphasises the importance of asking questions, elaborating on the requirements and working out the capabilities of the applications you will integrate with before choosing which option to take. As a general rule I would recommend not jumping too quickly to option 1 or 1.5. As integration people we usually frown upon these kinds of options because of the way they couple applications and create long-term problems, even though they might work initially. I think the other options (2-4) will be relatively easy to choose between depending on the requirements elaboration, but I would only choose option 1 or 1.5 in niche cases, and only with full buy-in from your architecture team that you have a justifiable reason for choosing it, documented well enough to explain later when someone comes along and asks WTF?

One other factor to consider which we didn't touch on too much above: I have kind of assumed you have an open toolset on today's typical CRM and Azure project. It may also be the case that your project has some constraints which influence your decision to choose one option over another. I hope in these cases the above considerations will help you to validate the choice you make, or give you some ammunition if you feel you should challenge the constraint and consider another option.