At Integrate 2020 we announced the release of Resource Map, a new feature in Serverless360. The aim of the feature is to help you organise your cloud estate and keep it structured within a logical model, which helps demystify the complexity of viewing your estate through the physical deployment model the Azure Portal gives you. Resource Map allows you to group resources into logical scopes which make sense to a non-Azure expert, and it helps your team keep the environment clean and well organised.
The below picture shows how you get a tree representation of your estate where resources can be assigned to a scope. When a scope is tidy it shows up as green, and when clean-up is required it shows as red or yellow, indicating resources aren't organised.
Once your resources are mapped to a scope, you can then indicate which resources are the dev/test/production versions of each other, covering all of your environments, so you can view a cross tab showing which resources belong to which environment.
You will also be able to do things like automate the allocation of resources in your map and perform cost analysis, as shown below:
I also added a couple of videos which expand on my demos from Integrate 2020:
Intro to Resource Map
Setting up Resource Map Manually
More info is available in this post on the Serverless360 blog below.
Recently I added an article to the Integration Playbook talking about how we handle configuration settings for Logic Apps on the local dev box and in DevOps pipelines with App Config, Key Vault and pipeline variables. There are a few videos walking through the approach.
I have worked with Azure Service Bus for years, and one of the biggest challenges has always been how to manage change on the queues and topics within a Service Bus namespace. You could use a tool like Service Bus Explorer to make the changes by hand, but this approach is manual and error prone. You can do exports and imports of configuration, but much like managing SQL script changes it's tough to work out the delta and deploy just what's needed to your various environments. If only there were a way to do this just like you do for any other Azure component, with CI and CD pipelines.
Recently I added an article to the Integration Playbook which compares the different approaches used for durability, persistence and retry across the various Microsoft technologies such as BizTalk, Logic Apps, Event Grid, Event Hubs, Service Bus Messaging and Functions.
I've recently added an article to the Integration Playbook talking about how messages can be correlated between processing instances. In particular it compares the approach in BizTalk against the approach you can use by combining Logic Apps and Service Bus.
This blog post is as much a reminder for myself as anything else. I needed to mark some service accounts in Azure AD so that their passwords don't expire.
We had a few service accounts used in a couple of places and we wanted a controlled process for changing their passwords.
To do this we did the following:
Create a group to associate all of the service accounts for our project, for easy management
Add all of the service accounts to that group
Run a script which will check every member of the group and change the password policy so the password doesn't expire
I had a look online and couldn't really find a resource showing how to do this which didn't use the old Office 365 MSOL PowerShell functionality, so I thought I'd share this for anyone else who might find it useful.
Below is the script I used; I usually run it each time we need a new service account where we want more granular control over changing the password.
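In outline it looks something like this (a minimal sketch that assumes the AzureAD PowerShell module; the group name is a placeholder):

```powershell
# Sketch only: assumes the AzureAD module is installed and you have signed in with an account
# that can modify user password policies. The group name is a placeholder.
Connect-AzureAD

# Find the group that holds the project's service accounts
$group = Get-AzureADGroup -SearchString "My Project Service Accounts"

# Loop over every member of the group and switch off password expiry
$members = Get-AzureADGroupMember -ObjectId $group.ObjectId -All $true
foreach ($member in $members) {
    Set-AzureADUser -ObjectId $member.ObjectId -PasswordPolicies "DisablePasswordExpiration"
    Write-Output "Password expiry disabled for $($member.UserPrincipalName)"
}
```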
When you are working with API’s and Logic Apps and there is lots of rows of data involved you will sometimes come up with the following problems:
An API often pages the data once you go beyond a certain number of records
When you want to insert lots of rows with a Logic App into SQL you will usually have a loop which iterates over a dataset and does inserts
This takes a long time to execute
There is a cost implication to your implementation when you pay for each action
I recently had a scenario in this space and used quite a cool approach to solve the problem which I wanted to share.
Scenario
The scenario started in Shopify. When I add products and collections to my online store in Shopify, I want a daily extract from Shopify to synchronise these new products/collections to the Azure SQL database which I use for reporting with Power BI.
To achieve this I would have a Logic App with a nightly trigger which would take the following actions:
Clear down the table which records which product is in which collection
Extract all products in collections via the Shopify API
Insert them all into the SQL table
The end result is I have a table which has all of the products in each collection listed for my analysis.
At a high level the scenario looks like the below diagram:
Implementation
As I mentioned above, the problem is two-fold: when we query Shopify there may be thousands of products, so we need to use a paging approach to query their API; secondly, I want to insert into SQL in batches to minimise the number of action calls on SQL, to improve performance and reduce cost.
Let's look at how I did this.
Paging API calls to Shopify
When it comes to the Shopify API you are able to execute a GET operation against the collection and it will return the products within it. If you have lots of products you can get them in pages. I chose to get 250 at a time, and you need to pass a page index to the API as a query parameter. The below picture shows what a call to Shopify looks like with the paging parameters set.
Once I can make this single call I can then use a loop around the call to Shopify, but before I do this I need to know how many pages there are. I can do this by executing a GET against the collections API with the count extension on the URL. This returns the number of products in the collection. You can see this below.
From the response I can parse the count and then set a variable for the number of pages, which I work out by dividing the number of products by the number of products I will get per page. I also add 1 to this so I get one more page than the count, in case the division is not a whole number. The calculation is shown below.
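As a rough illustration, with 250 products per page and the count parsed from the response, the page-count variable can be set with a Logic Apps expression along these lines (the 'Parse_Count' action name is a placeholder, not necessarily what the real Logic App uses):

```
add(div(body('Parse_Count')?['count'], 250), 1)
```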
Now I know the number of pages I can implement the loop where I will increment the page index each time until we have matched the number of pages. Within the loop we will get the next page of data from the API as shown in the picture below.
SQL Json Insert
It would be possible to just call the insert action for SQL in the Logic App, but if there are, say, 10,000 products then the loop will do 10,000 iterations, which will take quite a while to run and also has a cost associated with it. I wanted to look at options for inserting the data in batches. If I could insert the entire page returned from the API as a batch, then with my 250 records at a time I could reduce the 10,000 iterations down to 40. That should be a lot less time and a much lower cost.
To do this I developed a stored procedure where I pass the entire JSON string from the API response to the stored procedure as an NVARCHAR(MAX) parameter. I was fortunate that the format of the JSON in this case was very table/row like, making it easy to do the insert. I used SQL's OPENJSON feature and was able to insert the entire page of data from the API with a simple insert statement, as you can see in the SQL below.
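In outline, the stored procedure looks something like the sketch below (the table, column and JSON property names are illustrative assumptions rather than the exact ones from the real solution):

```sql
-- Sketch only: assumes the API response contains a products array with id and title properties
CREATE PROCEDURE dbo.InsertCollectionProducts
    @CollectionId BIGINT,
    @ProductsJson NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the whole page of JSON into rows and insert it in one statement
    INSERT INTO dbo.CollectionProduct (CollectionId, ProductId, Title)
    SELECT @CollectionId, p.ProductId, p.Title
    FROM OPENJSON(@ProductsJson, '$.products')
        WITH (
            ProductId BIGINT        '$.id',
            Title     NVARCHAR(255) '$.title'
        ) AS p;
END
```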
Summary
Once it was all put together I was able to run my Logic App to refresh my SQL database each night and the process took 10 seconds to copy across 2500 records. This took 10 iterations of the loop.
That's a Logic App which is nice and easy to support and run, and it does a good job in this case.
Recently I have been looking at some opportunities to utilise the new Model-Driven capabilities in Power Apps. I spent some time at Integrate 2018 chatting to Kent Weare about some of its capabilities and realised it was a great fit for some of the architecture challenges we have. Before I go into some of the opportunities in a sample architecture, let's consider an existing setup.
Existing Architecture
In the existing architecture we have a cloud-hosted integration platform which the company uses to integrate partners into Dynamics CRM Online and some existing on-premise line-of-business applications. The cloud integration platform is able to support partners submitting data via multiple channels. In this case we have a traditional SFTP and batch-based mechanism which old-school partners still use. With this pattern we use BizTalk, where it excels, on the IaaS part of the platform to manage multiple partners submitting different file formats, all being converted to a canonical format; messages are then loaded into systems via helper functions on Azure which implement the service façade pattern.
You can see this in the diagram below represented by Partner B.
We also have partners who use more modern approaches to integration where we expose an API via Azure APIM which allows them to submit data which is saved to a queue. BizTalk will process the queue and reuse the existing functionality to load data into our core systems.
The Challenge
While we support 2 example channels in this architecture, we have a massive partner network with different capabilities, and some partners even use person-to-person and email-based interactions. Imagine a person in a call centre who is sent an email with some data, or a form in the post, and types the data into systems manually.
As the application architecture expanded there were more systems these users would need to work with, and we needed to find efficiencies to optimise data entry for these users. The more records a user can enter in a day, the bigger the potential cost savings.
The challenge was to provide a new form to enter data that was simple and quick. We initially looked at options like Microsoft Forms and Cognitio Forms, which could allow us to create forms to capture data, but they missed ticking the boxes on some of the key non-functional requirements such as security and authentication. We needed something with more features than these options, which were good but too simple.
We do have Dynamics CRM, but the key problem with that, like our other applications, is that it is tied to a product backlog, which means our changes and optimisations would need to fit within an agile release process that was delivering change in a complex system. What we really needed was a sandbox-type application where we could build a simple app without many dependencies, which would then integrate with our processes.
Proposed Architecture
Coming back to the discussion with Kent, I could see that a model-driven Power App is really like a cut-down version of Dynamics, and looking at some of the sample apps and the apps people are building, you could see straight away that this could be a great opportunity. The Power Apps environment allowed us to build some forms and a data model very quickly to model the data we need users to capture.
We then implemented a Logic App which fires on the update of a record and checks for a field being set to indicate that the record is ready to be published. The Logic App extracts the data from the Power App. The really cool bit is that I can use the Dynamics connectors in Logic Apps, because the Power App is really just a Dynamics instance underneath. The Logic App puts a message on a queue, which lets us reuse our existing integration.
The below picture represents the architecture from the perspective of the new Power App. Please note that to keep the diagram simple I have omitted the existing B2B SFTP and API integrations so that we can focus on the Power Apps bit.
From this point I now have a pretty simple Power App which allows these users to input data manually into our process, which we think can save a few minutes per record compared to manually keying the record in the old way.
The benefits of Power Apps, though, go way beyond just this. First off, the key to empowering rapid change is that it's an isolated app focusing on just this use case. I don't have to worry about all of the many features within a bigger CRM implementation. When it comes to implementing changes and regression testing, things are much simpler.
At the same time the licensing is slightly different: with Power Apps our users are on P1 licenses, which aren't that expensive and are good for users who just run the Power App. We use P2 Power Apps licenses for those users who need to administer and develop the Power App.
We also get the integration with Azure AD for free, so our users have a good authentication story. This was one of the challenges with our previously considered options. The products we looked at which provided out-of-the-box forms capability seemed to lack the ability to authenticate users, restrict access to just certain users, and then know who filled in which form. This is a key requirement.
When it comes to many of the other security scenarios as existing Dynamics users we have already gone through the governance around what Dynamics is, how it works, its security, etc. The model driven Power App seems to be just the same in terms of capabilities.
At one time we were considering building an ASP.NET app for our users, and when you consider everything PaaS on Azure offers for very little cost it would seem an attractive option. But compared to these new, more powerful Power Apps, once you remove the considerations about hosting, security, custom coding, design experience and so on, you get so much out of the box that it's a compelling argument to try the Power App.
At this point Power Apps seems to be offering a great opportunity for us to build those utility applications and system of engagement applications on an enterprise ready platform but without lots of custom development. Really focusing on delivering business value there seems to be loads of places we could use this.
Hopefully we can provide more info about Power Apps as our journey progresses.
At the recent Integrate 2018 summit the Q&A drew some contentious questions from the audience about the next version of BizTalk and when it is going to be released. What was clear is that the product team's new approach of having a customer-feedback-driven backlog means they have been busy and successful in delivering changes to Logic Apps and the BizTalk feature packs, and having just completed those they have not yet planned the next major release of BizTalk.
Now, that being said, the team should have expected these questions because they always come up, and I think an answer of "we aren't ready to talk about that yet and we will get back to you" would have been fine. Instead there was a bit of fluff around the answers given, which resulted in the audience drawing their own conclusions in a negative way. After such a great conference I found myself wishing the Q&A had never taken place, as this miscommunication at the end sent a lot of people away with a degree of confusion.
With that said, in the pub later we were talking about the idea of product support lifecycles, and I have always felt the problem around Microsoft tech is that there is too much info out there on the subject, which is actually detrimental to the intention. I decided to test this idea by looking at the support lifecycle for some other vendors. First off, let's recap Microsoft's position.
Microsoft BizTalk Server
Let's start with the link below, where community members have a nice, easy-to-follow interpretation of the Microsoft Support Lifecycle for BizTalk.
You can see from the above table that there is still some kind of support available for 5 versions of BizTalk, covering up to 9 years from now. Even a 9-year-old version of BizTalk is still under extended support for over 1 more year.
Now that we have a picture of the Microsoft position, let's take a look at some of the other vendors out there.
MuleSoft
MuleSoft provides Standard Support for the latest released minor version of the Mule runtime.
Once a new minor version for a major version is released, the previous minor version will receive Standard Support for an additional 18 months. All minor versions for a major version will receive Standard Support for a minimum of 2 years.
Starting with Mule 3.7 and later, after Standard Support ends, MuleSoft will offer Extended Support for an additional 2 years. Mule 3.5 and 3.8 will receive Extended Support for a total of 3 years.
Extended Support versions are only available on CloudHub for applications already deployed on it
Once a new major version is released, MuleSoft will continue to offer Standard Support for at least one minor version of the previous major version for a minimum of 3 years.
Once a minor version is outside the Standard Support and Extended Support windows, MuleSoft will provide End of Life Support.
My interpretation of the MuleSoft position compared to BizTalk is that the current version of Mule has committed support for 2 years less than the current version of BizTalk, and extended support for 5 years less than the current version of BizTalk.
Jitterbit
If we take a look at Jitterbit, their documentation states, "Jitterbit is committed to supporting a version for 12 months from the release date". So effectively each release is under committed support for 12 months only. It may be longer in reality, but if we look at the example of their last version to reach end of life, you can see below that it was only supported for 1 year, which seems fairly consistent.
My interpretation of the comparison of BizTalk vs Jitterbit is that Jitterbit is only committing to year-on-year support, versus long-term commitments from Microsoft.
Oracle Fusion Middleware
From the below link I was able to find out some basic info about Oracle Fusion Middleware.
I could not find any information online about support lifecycle commitments or versions released. I did find the release notes for each release, available below, which give an idea of how often change occurs.
Interestingly, I cannot find any public information about the position on the support lifecycle for Logic Apps.
Conclusion
The simple fact is that Microsoft have a publicly stated duration of support for the current version of BizTalk which is 5 years longer than any other vendor I am able to find information on! That means if “BizTalk is dead”, based on info in the public domain all of the other vendors are going to be dead well before it.
With that said, it is important to consider that iPaaS products may mean we need to think about the lifecycle in a different way, because the idea of upgrading is not really the same thing. With that in mind, we do need to remember that MABS from Microsoft was an iPaaS product which some customers bet on and which was then deprecated. With the volatility in the iPaaS market and the expected consolidation of vendors, with things such as the recent acquisition of MuleSoft by Salesforce, it may be worth considering whether the absence of a communicated lifecycle on iPaaS is a good thing for customers.
I would draw the conclusion that at this stage the recent discussions around the BizTalk product lifecycle are really just a case of below-par communications and marketing on the Microsoft side, which allows the myth to be created that the BizTalk product lifecycle is a problem. If you look at the hard facts, it actually has by far the strongest story I can find.
If we compare the 2 biggest competitors: with BizTalk we are 4 years from the end of support of the current version; that's 2 years more than a brand new release of Mule gets, so maybe things aren't so bad after all.
Having worked a lot with Dynamics CRM/365 over the last few years I thought it would be interesting to discuss a common use case and some of the architecture patterns you may consider to implement the solution.
Let's imagine a scenario where the business requirement is as follows:
The user will be updating a customer's record in Dynamics 365
When the user saves the change we need the change to be synchronised with the billing system
Now at this point I am going to deliberately avoid fleshing out these requirements too much. Any experienced integration person will already be thinking of a number of functional and non-functional questions they would want more information about, but the above is the typical first requirement. We will use this vagueness to explore some of the considerations when we look at the options that are available to solve the problem. One thing to note is that I am going to treat this as a one-way interface for this discussion.
Option 1 – CRM Custom Plugin – Synchronous
In option 1 the CRM developer would use the extensibility features of Dynamics. This allows you to write C# code which will execute within the CRM runtime environment as a plugin. With a plugin you can configure when the code will execute. Options include things like:
When an entity is updated but before the save is made
When the entity is updated but after the save is made
As above but on other operations such as create/delete
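To make the option concrete, a minimal sketch of a synchronous plugin along these lines is shown below (the billing endpoint URL, entity and attribute names are illustrative assumptions, not taken from a real implementation):

```csharp
using System;
using System.Net.Http;
using System.Text;
using Microsoft.Xrm.Sdk;

// Sketch only: a plugin registered synchronously on the post-operation stage of the Update message.
public class SyncCustomerToBillingPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        if (!context.InputParameters.Contains("Target") || !(context.InputParameters["Target"] is Entity target))
            return;

        // Build a simple payload from the saved record (the attribute used here is illustrative)
        var name = target.GetAttributeValue<string>("name");
        var payload = "{\"id\":\"" + target.Id + "\",\"name\":\"" + name + "\"}";

        using (var client = new HttpClient())
        {
            tracing.Trace("Sending customer {0} to the billing system", target.Id);

            // This call runs as part of the user's save, so any latency or failure is visible to them
            var response = client.PostAsync("https://billing.example.com/api/customers",
                new StringContent(payload, Encoding.UTF8, "application/json")).Result;
            response.EnsureSuccessStatusCode();
        }
    }
}
```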
The below picture shows what this scenario will look like
Good things:
This is probably the quickest way you can get the data from the commit in CRM to the other application
This is probably the simplest way you can do this integration with the minimum number of network hops
This solution probably only needs the skill set of the CRM developer
Things to consider:
You would be very tightly coupling the two applications
You would have some potential challenges around error scenarios
What happens if the save to the other app works but the save to CRM doesn't, or vice versa
The custom plugin is probably going to block the CRM user's thread while it makes the external call, which is asking for performance issues
You would need to consider if you would do the call to the other application before or after saving the data to CRM
You would need to consider where to store the configuration for the plugin
There would be error and retry scenarios to consider
There would be the typical considerations of tightly coupled apps
What if the other app is broken
What if it has a service window
Errors are likely to bubble up to the end user
You will have OOTB (out of the box) CRM plugin tracing diagnostics but this may require some custom code to ensure it logs appropriate diagnostic information
Option 1.5 – CRM Custom Plugin – Asynchronous
In this option the solution is very similar to the above solution with the exception that the developer has chosen to take advantage of the asynchronous system jobs feature in CRM. The plugin that was developed is probably the same code but this time the configuration of the plugin in CRM has indicated that the plugin should be executed out of process from the transaction where the user is saving a change. This means that the commit of the change will trigger a system job which will be added to the processing queue and it will execute the plugin which will send data to the other application.
The below picture illustrates this option.
Good things:
The synchronisation will no longer block the user's thread when they save data
The system jobs give a degree of troubleshooting and retry options if the other system is down, compared to option 1
This only requires CRM developer skills
Things to consider:
There may be other things on the processing queue so there is no guarantee how long it will take to synchronize
You may get race conditions if another transaction updates the entity and you haven’t appropriately covered these scenarios in your design
Also think about the concurrency of system jobs and other plugins
I have seen a few times where option 1 is implemented and then flipped to option 1.5 as a workaround due to performance concerns
This needs to be thought about upfront
You may struggle to control the load on the downstream system
Again there is a tight coupling of systems. CRM has explicit knowledge of the other application and a heavy dependency on it
What if the app is down
What if there are service windows
Error scenarios are highly likely and there could be lots of failed jobs
Option 2 – CRM out of the Box Publishing to Azure Service Bus
Option 1 and 1.5 are common ways a CRM developer will attempt to solve the problem. Typically they have a CRM toolset and they try to use a tool from that toolset to solve the problem as bringing in other things was traditionally a big deal.
With the wide adoption of Azure we are starting to see a major shift in this space. Now many Dynamics projects are also including Azure by default in their toolset. This means CRM developers are also gaining experience with tooling on Azure and have a wider set of options available. This allows a shift in the mindset that not everything has to be solved in CRM and actually doing stuff outside of CRM offers many more opportunities to build better solutions while at the same time keeping the CRM implementation pure and focused on its core aim.
In this solution the CRM developer has chosen to add an Azure Service Bus namespace to the architecture. This means they can use the OOTB plugin (not a custom one) in CRM, which will publish messages from CRM to a queue or topic when an entity changes. From here the architect can choose some other tools to get messages from Service Bus to the destination application. For simplicity in this case I may choose an Azure Function, which allows me to write a simple bit of C# to do the job.
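For illustration, a minimal sketch of such a function is shown below (the queue name, connection setting and billing endpoint are assumptions; the message body published by CRM is treated as an opaque payload here):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SyncCustomerToBillingFunction
{
    private static readonly HttpClient HttpClient = new HttpClient();

    // Sketch only: fires for each message CRM publishes to the queue and forwards it downstream
    [FunctionName("SyncCustomerToBilling")]
    public static async Task Run(
        [ServiceBusTrigger("crm-customer-changes", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        log.LogInformation("Processing a customer change message from CRM");

        var response = await HttpClient.PostAsync(
            "https://billing.example.com/api/customers",
            new StringContent(message, Encoding.UTF8, "application/json"));

        // Throwing on failure lets Service Bus retry and eventually dead-letter the message
        response.EnsureSuccessStatusCode();
    }
}
```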
The below solution illustrates this:
Good things:
No custom coding in CRM
The Service Bus plugin will be much more reliable than the custom one
The Service Bus plugin will get a lot of messages out to Service Bus very fast by comparison to the custom plugin in 1.5, which will probably bottleneck on the downstream system
Service Bus supports pub/sub, so you can plug in routing of messages to other systems
The Azure Function could be developed by the CRM developer quite easily with a basic C# skillset
Service Bus offers lots of retry capabilities
The queue offers a buffer between the applications so there is no dependency between them
The function could be paused in downtime so that CRM can keep pumping out changes and they will be loaded when the other app is back online
The solution will be pretty cheap, you will pay a small cost for the service bus instance and per execution for the function. Unless you have very high load this should be a cheap option
Things to consider:
The key thing to remember here is that the solution is near real time. It is not an instant sync. In most cases it is likely the sync will happen very quickly, but the CRM system jobs could be one bottleneck if you have lots of changes or jobs in CRM. Also the capability of the downstream system may be a bottleneck, so you may need to consider how fast you want to load changes
The only bad thing is that there are quite a few moving parts in this solution, so you may want to ensure you are using appropriate management and monitoring for the solution. In addition to CRM system jobs, you may want to consider ServiceBus360 to manage and monitor your queues, and Application Insights for your Azure Functions
Option 3 – Logic App Integration
In option 3 the developer has chosen to use a Logic App to detect changes in CRM and to push them over to the other application. This means that the CRM solution is very vanilla, it doesn’t even really know that changes are going elsewhere. In the above options a change in CRM triggered a process to push the data elsewhere. In this option the Logic App is outside CRM and is periodically checking for changes and pulling them out.
Typically the Logic App will check every 3 minutes (this is configurable); it will pull out a collection of changes, and then 1 instance of the Logic App will be triggered for each change detected.
The logic app will then use an appropriate connector to pass the message to the downstream application.
The below picture shows what this looks like.
Good things:
There is nothing to do in CRM
The Logic App will need monitoring and managing separately from CRM
The Logic App is not part of the CRM developer's core skill set, but Logic Apps are very simple to use, so it should be easy to pick this up
The Logic App has a lot of features if you run into more advanced scenarios
The Logic App has connectors for lots of applications
You may be able to develop the solution with no custom code
The Logic App has some excellent diagnostics features to help you develop and manage the solution
The Logic App has retry and resubmit capabilities
The solution will be pretty cheap with no upfront capital cost. You just pay per execution. Unless you have very high load this should be a cheap option
This option can also be combined with Service Bus and BizTalk Server for very advanced integration scenarios
Things to consider:
Is the polling interval going to be often enough
Only the most recent change will be extracted; if a particular row has been updated 3 times since the last trigger you will get the latest state
It may require some more advanced patterns to control the load if the downstream system is a bottleneck. This may be beyond the CRM developer's Logic App skills
Option 4 – SSIS Integration
The next option to consider is an ETL-based approach using SSIS. This approach is quite common on CRM projects because they often have people with SQL skills. The solution would involve setting up an SSIS capability and then purchasing the 3rd-party KingswaySoft SSIS connectors, which include support for Dynamics.
The solution would then pull data out of CRM via the API using a FetchXML query or an OData query. It would then push the changes to the destination system. Often SSIS would be integrating at the database level, which is its sweet spot, but it does have the capability to call HTTP endpoints and APIs.
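As an illustration, the kind of FetchXML used to pull records changed since a watermark might look like this (the entity, attributes and date value are placeholders):

```xml
<fetch>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <attribute name="modifiedon" />
    <filter>
      <condition attribute="modifiedon" operator="on-or-after" value="2018-06-01" />
    </filter>
  </entity>
</fetch>
```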
Although the diagrams look similar, the big difference between the Logic App approach and SSIS is that SSIS is treating the records as a batch of data which it is attempting to process in bulk. The Logic App is attempting to execute a separate transaction for each row it pulls out from the CRM changes. Each solution has its own way of dealing with errors which makes this comparison slightly more complex, but typically think of the idea of a batch of changes vs individual changes.
In the SSIS solution it is also very common for the solution to include a staging database between the systems, where the developer will attempt to create some separation of concerns and create deltas to minimise the size of the data being sent to downstream systems.
Good things:
You can process a lot of data very quickly
Common approach on CRM projects
Kingswaysoft product is mature
Predominantly configuration based solution
Things to consider:
Sometimes error scenarios can be complex
Capital cost for 3rd party software and probably maintenance too
Need to consider where to host SSIS (Azure VM or on-premise VM) and the cost associated with this
Possible license cost for SQL depending on organisation setup
You will sync on a schedule; how often does it need to be
The more frequent the sync, the less data each time
It can't be too frequent though
How will you monitor and schedule the SSIS package
There is no right or wrong answer based on the original 2-line requirement we got, but you can see each solution has a lot to think about.
This emphasises the importance of asking questions, elaborating on the requirements, and working out the capabilities of the applications you will integrate with before choosing which option to take. As a general rule I would recommend not jumping too quickly to option 1 or 1.5. As integration people we usually frown upon these kinds of options because of the way they couple applications and create long-term problems, even though they might work initially. I think the other options (2 to 4) will be relatively easy to choose between depending on how the requirements are elaborated, but I would only choose option 1 or 1.5 in niche cases, and only with full buy-in from your architecture team that you have a justifiable reason for choosing it, documented well enough to explain later when someone comes along and asks WTF?
One other factor to consider which we didn't touch on much above: I have assumed you have an open toolset on today's typical CRM and Azure project. It may also be the case that your project has constraints which influence your decision to choose one option over another. I hope in those cases the above considerations will help you validate the choice you make, or give you some ammunition if you feel you should challenge the constraint and consider another option.