by michaelstephensonuk | Feb 10, 2018 | BizTalk Community Blogs via Syndication
Having worked a lot with Dynamics CRM/365 over the last few years I thought it would be interesting to discuss a common use case and some of the architecture patterns you may consider to implement the solution.
Let's imagine a scenario where the business requirement is as follows:
- The user will be updating a customer's record in Dynamics 365
- When the user saves the change, we need the change to be synchronised with the billing system
Now at this point I am going to deliberately avoid fleshing out these requirements too much. Any experienced integration person will already be thinking of a number of functional and non-functional questions they would want more information about, but the above is the typical first-cut requirement. We will use this vagueness to explore some of the considerations as we look at the options available to solve the problem. One thing to note is that I am going to treat this as a one-way interface for this discussion.
Option 1 – CRM Custom Plugin – Synchronous
In option 1 the CRM developer would use the extensibility features of Dynamics. This allows you to write C# code which will execute within the CRM runtime environment as a plugin. With a plugin you can configure when the code will execute. Options include things like:
- When an entity is updated but before the save is made
- When the entity is updated but after the save is made
- As above, but for other messages such as create and delete
The below picture shows what this scenario will look like
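To make this concrete, a minimal sketch of what such a synchronous plugin might look like is shown below. The billing endpoint URL, the field names and the JSON shape are illustrative assumptions rather than anything from a real project:

    using System;
    using System.Net.Http;
    using System.Text;
    using Microsoft.Xrm.Sdk;

    public class CustomerToBillingPlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            if (!context.InputParameters.Contains("Target"))
                return;
            var customer = context.InputParameters["Target"] as Entity;
            if (customer == null)
                return;

            // Hypothetical billing endpoint - in reality this would come from secure configuration
            var billingUrl = "https://billing.example.com/api/customers";
            var payload = string.Format("{{\"id\":\"{0}\",\"name\":\"{1}\"}}",
                customer.Id, customer.GetAttributeValue<string>("name"));

            tracing.Trace("Sending customer {0} to billing system", customer.Id);

            using (var client = new HttpClient())
            {
                // Blocking call on the user's thread - exactly the performance risk noted below
                var response = client.PostAsync(billingUrl,
                    new StringContent(payload, Encoding.UTF8, "application/json")).Result;

                if (!response.IsSuccessStatusCode)
                    throw new InvalidPluginExecutionException("Billing system returned " + response.StatusCode);
            }
        }
    }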
Good things:
- This is probably the quickest way you can get the data from the commit in CRM to the other application
- This is probably the simplest way you can do this integration with the minimum number of network hops
- This solution probably only needs the skill set of the CRM developer
Things to consider:
- You would be very tightly coupling the two applications
- You would have some potential challenges around error scenarios
  - What happens if the save to the other app works but the save to CRM doesn't, or vice versa
- The custom plugin is probably going to block the CRM user's thread while it makes the external call, which is asking for performance issues
- You would need to consider whether to call the other application before or after saving the data to CRM
- You would need to consider where to store the configuration for the plugin
- There would be error and retry scenarios to consider
- There would be the typical considerations of tightly coupled apps
  - What if the other app is broken
  - What if it has a service window
  - Errors are likely to bubble up to the end user
- You will have OOTB (out of the box) CRM plugin tracing diagnostics, but this may require some custom code to ensure it logs appropriate diagnostic information
Option 1.5 – CRM Custom Plugin – Asynchronous
In this option the solution is very similar to the above solution with the exception that the developer has chosen to take advantage of the asynchronous system jobs feature in CRM. The plugin that was developed is probably the same code but this time the configuration of the plugin in CRM has indicated that the plugin should be executed out of process from the transaction where the user is saving a change. This means that the commit of the change will trigger a system job which will be added to the processing queue and it will execute the plugin which will send data to the other application.
The below picture illustrates this option.
Good things:
- The user's save is no longer blocked while the call to the other application is made, so the interactive performance concern from option 1 largely goes away
Things to consider:
- There may be other things on the processing queue so there is no guarantee how long it will take to synchronize
- You may get race conditions if another transaction updates the entity and you haven't appropriately covered these scenarios in your design
  - Also think about the concurrency of system jobs and other plugins
- I have seen a few times where option 1 is implemented and then flipped to this asynchronous option as a workaround due to performance concerns
  - This needs to be thought about upfront
- You may struggle to control the load on the downstream system
- Again there is a tight coupling of systems; CRM has explicit knowledge of the other application and a heavy dependency on it
  - What if the app is down
  - What if there are service windows
  - Error scenarios are highly likely and there could be lots of failed jobs
Option 2 – CRM out of the Box Publishing to Azure Service Bus
Option 1 and 1.5 are common ways a CRM developer will attempt to solve the problem. Typically they have a CRM toolset and they try to use a tool from that toolset to solve the problem, because bringing in other technologies was traditionally a big deal.
With the wide adoption of Azure we are starting to see a major shift in this space. Now many Dynamics projects are also including Azure by default in their toolset. This means CRM developers are also gaining experience with tooling on Azure and have a wider set of options available. This allows a shift in the mindset that not everything has to be solved in CRM and actually doing stuff outside of CRM offers many more opportunities to build better solutions while at the same time keeping the CRM implementation pure and focused on its core aim.
In this solution the CRM developer has chosen to add an Azure Service Bus instance to the solution. This means they can use the OOTB plugin (not a custom one) in CRM, which will publish messages from CRM to a queue or topic when an entity changes. From here the architect can choose other tools to get messages from Service Bus to the destination application. For simplicity, in this case I might choose an Azure Function, which lets me write a simple bit of C# to do the job.
The below solution illustrates this:
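To give a flavour of the Azure Function end of this option, below is a minimal sketch of a Service Bus triggered function that forwards the change to the billing system. The queue name, connection setting name and billing endpoint are assumptions for illustration:

    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class CrmToBillingFunction
    {
        private static readonly HttpClient Http = new HttpClient();

        [FunctionName("CrmToBilling")]
        public static async Task Run(
            [ServiceBusTrigger("crm-entity-changes", Connection = "ServiceBusConnection")] string message,
            ILogger log)
        {
            // The OOTB CRM Service Bus plugin posts a serialized RemoteExecutionContext;
            // deserialize it with the Dynamics SDK types and map it to the billing format here.
            log.LogInformation("Received CRM change of {Length} bytes", message.Length);

            var response = await Http.PostAsync(
                "https://billing.example.com/api/customers",   // hypothetical endpoint
                new StringContent(message, Encoding.UTF8, "application/json"));

            // Throwing on failure lets Service Bus retry and eventually dead-letter the message
            response.EnsureSuccessStatusCode();
        }
    }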
Good things:
- No custom coding in CRM
- The Service Bus plugin will be much more reliable than the custom one
- The Service Bus plugin will get a lot of messages out to Service Bus very fast compared to the custom plugin in option 1.5, which will probably bottleneck on the downstream system
- Service Bus supports pub/sub so you can plug in routing of messages to other systems
- The Azure Function could be developed by the CRM developer quite easily with a basic C# skillset
- Service Bus offers lots of retry capabilities
- The queue offers a buffer between the applications so there is no dependency between them
- The function could be paused in downtime so that CRM can keep pumping out changes and they will be loaded when the other app is back online
- The solution will be pretty cheap: you pay a small cost for the Service Bus namespace and per execution for the function. Unless you have very high load this should be a cheap option
Things to consider:
- The key thing to remember here is that the solution is near real-time. It is not an instant sync. In most cases the sync is likely to happen very quickly, but CRM System Jobs could be one bottleneck if you have lots of changes or jobs in CRM. The capability of the downstream system may be another bottleneck, so you may need to consider how fast you want to load changes
- The only bad thing is that there are quite a few moving parts in this solution, so you may want to ensure you are using appropriate management and monitoring. In addition to CRM System Jobs, you may want to consider Service Bus 360 to manage and monitor your queues and Application Insights for your Azure Functions
Option 3 – Logic App Integration
In option 3 the developer has chosen to use a Logic App to detect changes in CRM and push them over to the other application. This means that the CRM solution is very vanilla; it doesn't even really know that changes are going elsewhere. In the above options a change in CRM triggered a process to push the data elsewhere. In this option the Logic App sits outside CRM and periodically checks for changes and pulls them out.
Typically the Logic App will check every 3 minutes (this is configurable); it will pull out a collection of changes and then one instance of the Logic App will be triggered for each change detected.
The logic app will then use an appropriate connector to pass the message to the downstream application.
The below picture shows what this looks like.
Good things:
- There is nothing to do in CRM
- The Logic App is not part of the CRM developer's core skill set, but Logic Apps are very simple to use so it should be easy to pick up
- The Logic App has a lot of features if you run into more advanced scenarios
- The Logic App has connectors for lots of applications
- You may be able to develop the solution with no custom code
- The Logic App has some excellent diagnostics features to help you develop and manage the solution
- The Logic App has retry and resubmit capabilities
- The solution will be pretty cheap with no upfront capital cost. You just pay per execution. Unless you have very high load this should be a cheap option
- This option can also be combined with Service Bus and BizTalk Server for very advanced integration scenarios
Things to consider:
- The Logic App will need monitoring and managing separately from CRM
Option 4 – SSIS Integration
The next option to consider is an ETL-based approach using SSIS. This approach is quite common on CRM projects because they often have people with SQL skills. The solution would involve setting up an SSIS capability and then purchasing the 3rd party Kingswaysoft SSIS connectors, which include support for Dynamics.
The solution would then pull data out of CRM via the API using a FetchXML query or an OData query, and push the changes to the destination system. Often SSIS would be integrating at database level, which is its sweet spot, but it does have the capability to call HTTP endpoints and APIs.
Although the diagrams look similar, the big difference between the Logic App approach and SSIS is that SSIS is treating the records as a batch of data which it is attempting to process in bulk. The Logic App is attempting to execute a separate transaction for each row it pulls out from the CRM changes. Each solution has its own way of dealing with errors which makes this comparison slightly more complex, but typically think of the idea of a batch of changes vs individual changes.
In the SSIS solution it is also very common for the solution to include a staging database between the systems, where the developer will attempt to create some separation of concerns and create deltas to minimize the size of the data being sent to downstream systems.
Good things:
- You can process a lot of data very quickly
- Common approach on CRM projects
- Kingswaysoft product is mature
- Predominantly configuration based solution
Things to consider:
- Sometimes error scenarios can be complex
There is no right or wrong answer based on the original two-line requirement we got, but you can see each solution has a lot to think about.
This emphasises the importance of asking questions, elaborating on the requirements and working out the capabilities of the applications you will integrate with before choosing which option to take. As a general rule I would recommend not jumping too quickly to option 1 or 1.5. As integration people we usually frown upon these kinds of options because of the way they couple applications and create long-term problems, even though they might work initially. I think the other three options (2-4) will be relatively easy to choose between once the requirements have been elaborated, but I would only choose option 1 or 1.5 in niche cases, and only with full buy-in from your architecture team and a documented, justifiable reason that you can explain later when someone comes along and asks WTF?
One other factor to consider, which we didn't touch on too much above: I have kind of assumed you have an open toolset on today's typical CRM and Azure project. It may also be the case that your project has constraints which influence your decision to choose one option over another. I hope in those cases the above considerations will help you validate the choice you make, or give you some ammunition if you feel you should challenge the constraint and consider another option.
by michaelstephensonuk | Jul 22, 2017 | BizTalk Community Blogs via Syndication
I wanted to talk a little about the architecture I designed recently for a Dynamics CRM + Portal + Integration project. In the initial stages of the project a number of options were considered for a Portal (or group of portals) which would support staff, students and other users, and which would integrate with Dynamics CRM and other applications in the application estate. One of the challenges I could see coming up in the architecture was the level of coupling between the Portal and Dynamics CRM. I've seen this a few times, where an architecture has been designed in which the portal queries CRM directly and has the CRM SDK embedded in it, which is obviously a highly coupled integration between the two. What I think is a far bigger challenge, however, is the fact that CRM Online is a SaaS application and you have very little control over the tuning and performance of CRM.
Let's imagine you have 1000 CRM user licenses for staff and back office users. CRM is going to be your core system of record for customers, but you want to build systems of engagement to drive a positive customer experience, and creating a Portal which can communicate with CRM is a very likely scenario. When you buy your 1000 licenses from Microsoft you are going to be given the infrastructure to support the load from 1000 users. The problem, however, is that a portal tightly coupled to CRM may introduce a whole extra population of users on top of the 1000 back office users. What's going to happen when you have 50,000 students, or thousands or millions of customers, starting to use your portal? You now have a problem that CRM may become a performance bottleneck, but because it's SaaS you have almost no options to scale the system up or out.
With this kind of architecture you have the choice of rolling your own portal using .NET, with either the Web API or the CRM SDK integrating directly with CRM. There are also options to use products like ADXStudio, which can help you build a portal too. The main reason these options are very attractive is that they are probably the quickest to build and minimise the number of moving parts. From a productivity perspective they are very good.
An illustration of this architecture could look something like the below:
What we proposed to do instead was to leverage some of the powerful features of Azure to build an architecture for a Portal, integrated with CRM Online and other applications, which would scale to a much higher user base without causing performance problems in CRM. It's worth noting that problems in CRM could create a negative experience for portal users, but could also significantly affect the performance of back office staff if CRM was running slow.
To achieve this we decided that using asynchronous approaches with CRM and hosting an intermediate data layer in Azure would allow us, at a relatively low cost, to have a much faster and more scalable data layer to base the core architecture on. We would call this our cloud data layer; it would sit behind an API for consumers but be fed with data from CRM and other applications, both on premise and in the cloud. From here the API would expose this data to the various portals we may build.
The core idea was that the more we could minimize the use of RPC calls to any of our SaaS or on premise applications, the better we would be able to scale the portals we built, and at the same time the more resilient they would be to any of those applications going down.
Hopefully at this point you have an understanding of the aim and can visualise the high level architecture. I will next talk through some of the patterns in the implementation.
Simple Command from Portal
In this pattern we have the scenario where the portal needs to send a simple command for something to happen. The below diagram shows how this works.
Let's imagine a scenario of a user in the portal adding a chat comment to a case.
The process for the simple command is:
- The portal will send a message to the API, which will do some basic processing but then offload the message to a Service Bus topic
- The topic allows us to route the message to many places if we want to
- The main subscriber is a Logic App, which uses the CRM connector to interact with the appropriate entities and create the chat comment as an annotation in CRM
This particular approach is pretty simple and the interaction with CRM is not overly complicated, so it is a good candidate for using a Logic App to process the message.
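As a rough sketch of the portal/API end of this pattern, the snippet below publishes the command to the Service Bus topic. The connection string, topic name and message shape are illustrative assumptions:

    using System;
    using System.IO;
    using System.Text;
    using Microsoft.ServiceBus.Messaging;
    using Newtonsoft.Json;

    public class ChatCommandPublisher
    {
        // Connection string and topic name are placeholders for this sketch
        private readonly TopicClient _client = TopicClient.CreateFromConnectionString(
            "Endpoint=sb://myportal.servicebus.windows.net/;SharedAccessKeyName=Send;SharedAccessKey=<key>",
            "portal-commands");

        public void PublishAddChatComment(Guid caseId, string comment, string userId)
        {
            var body = JsonConvert.SerializeObject(new { caseId, comment, userId });

            var message = new BrokeredMessage(new MemoryStream(Encoding.UTF8.GetBytes(body)), true)
            {
                ContentType = "application/json"
            };

            // Properties let topic subscriptions filter and route without reading the body
            message.Properties["CommandType"] = "AddChatComment";

            _client.Send(message);
        }
    }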
Complex Command from Portal
In some cases the portal would publish a command which requires a more complex processing path. Let's imagine a scenario where a customer or student raises a case from the portal. In this scenario the processing could be:
- Portal calls the API to submit a case
- API drops a message onto a service bus topic
- BizTalk picks up the message and enriches it with additional data from some on premise systems
- BizTalk then updates some on premise applications with some data
- BizTalk then creates the case in CRM
The below picture might illustrate this scenario
In this case we chose to use BizTalk rather than Logic Apps to process the message. As a general rule, the more complex the processing requirements, the more I tend to lean towards BizTalk over Logic Apps. BizTalk's support for more complex orchestration, compensation approaches and advanced mapping just lends itself a little better in this case.
I think the great thing in the Microsoft stack is that you can choose from the following technologies to implement the above two patterns behind the scenes:
- Web Jobs
- Functions
- Logic Apps
- BizTalk
Each have their pro’s and con’s which make them suit different scenarios better but also it allows you to work in a skillset your most comfortable with.
Cloud Data Layer
Earlier in the article I mentioned that we have the cloud data layer as one of our architectural components. I guess in some ways this follows the CQRS pattern to a degree, but we are not specifically implementing CQRS for the entire system. Data in the cloud data layer is owned by some other application, and we are simply choosing to copy some of it to the cloud so it is in a place which allows us to build better applications. Exposing this data via an API means that we can leverage a data platform based on Cosmos DB (Document DB), Azure Table Storage and Azure Blob Storage.
If you look at Cosmos DB and Azure Storage, they are all very easy to use and to get up and running with, but the other big benefit is that they offer high performance if used right. By comparison we have little control over the performance of CRM Online, but with Cosmos DB and Azure Storage we have lots of options over the way we index and store data to suit a high-performing application, without all of the baggage CRM would bring with it.
The main differences in how we use these data stores to make a combined data layer are:
- Cosmos DB is used for a small amount of metadata related to entities, to aid complex searching
- Azure Table Storage is used to store related info for fast retrieval through good partitioning
- Azure Blob Storage is used for storing larger JSON objects
Some examples of how we may use this would be:
- In an Azure table, a student's courses, modules, etc. may be partitioned by the student id so it is fast to retrieve the information related to one student
- In Cosmos DB we may store info to make advanced searching efficient and easy, for example to find all of the students who are on course 123
- In blob storage we may store objects like the details of a KB article which might be a big dataset. We may use Cosmos DB to search for KB articles by keywords and tags but then pull the detail from Blob Storage
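As an example of the Table Storage part of this, below is a rough sketch of how the per-student partition lookup might look using the Azure Storage SDK. The table name, entity shape and keys are assumptions for illustration:

    using System.Collections.Generic;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    // PartitionKey = student id, RowKey = course id (assumed keys for this sketch)
    public class StudentCourseEntity : TableEntity
    {
        public string CourseName { get; set; }
        public string ModuleCode { get; set; }
    }

    public class CloudDataLayerReader
    {
        private readonly CloudTableClient _tables;

        public CloudDataLayerReader(string storageConnectionString)
        {
            _tables = CloudStorageAccount.Parse(storageConnectionString).CreateCloudTableClient();
        }

        // All rows for one student live in a single partition, so the read is fast and cheap
        public IEnumerable<StudentCourseEntity> GetCoursesForStudent(string studentId)
        {
            var table = _tables.GetTableReference("StudentCourses");   // assumed table name

            var query = new TableQuery<StudentCourseEntity>().Where(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, studentId));

            return table.ExecuteQuery(query);
        }
    }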
CRM Event to Cloud Data Layer
Now that we understand that queries of data will not come directly from CRM, but instead via an API which exposes an intermediate data layer hosted on Azure, the question is how this data layer is populated from CRM. We will use a couple of patterns to achieve this, the first of which is event based.
Imagine that in CRM, each time an entity is updated (or created, etc.) we use the CRM plugin for Service Bus to publish that event externally. We can then subscribe to the queue and, with the data from CRM, look up additional entities if required, then transform and push this data somewhere. In our architecture we may choose to use a Logic App to collect the message. Let's imagine a case was updated. The Logic App may then use info from the case to look up related entity data, such as a contact and other similar entities. It will build up a canonical message related to the event and then store it in the cloud data layer.
Let's imagine a specific example. We have a knowledge base article in CRM. It is updated by a user and the event fires. The Logic App will get the event and look up the KB article. The Logic App will then update the metadata of the article in Cosmos DB for searching by apps. The Logic App will then transform the various related entities to a canonical JSON format and save them to Blob Storage. When the application searches for KB articles via the API it will, under the hood, be retrieving data from Cosmos DB. When it has chosen a KB article to display it will retrieve the KB article details from Blob Storage.
The below picture shows how this pattern will work.
CRM Entity Sync to Cloud Data Layer
One of the other ways we can populate the cloud data layer from CRM is via a job that will copy data. There are a few different ways this can be done. The main way involves executing a FetchXML query against CRM to retrieve all of the records from an entity, or all of the records that have changed recently. They are then pushed over to the cloud data layer and stored in one of the data stores, depending on which is used for that data type. It is likely there will be some form of transformation on the way too.
An example of where we may do this is if we had a list of reference data in CRM such as the nationalities of contacts. We may want to display this list in the portal but without querying CRM directly. In this case we could copy the list of entities from CRM to the cloud data layer on a weekly basis where we copy the whole table. There are other cases where we may copy data more frequently and we may use different data sources in the cloud data layer depending upon the data type and how we expect to use it.
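Before looking at the BizTalk and Data Factory options, here is a rough sketch of what such a copy job could look like if written as code against the CRM SDK. The entity and attribute names are hypothetical, and the write to the cloud data layer is omitted:

    using System;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;

    public class NationalityReferenceSync
    {
        private readonly IOrganizationService _crm;

        public NationalityReferenceSync(IOrganizationService crm)
        {
            _crm = crm;
        }

        // Pulls recently changed records from an assumed custom entity so they can be
        // transformed and written to the cloud data layer
        public void SyncChangedSince(DateTime lastRunUtc)
        {
            string fetchXml = string.Format(@"
                <fetch>
                  <entity name='new_nationality'>
                    <attribute name='new_nationalityid' />
                    <attribute name='new_name' />
                    <filter>
                      <condition attribute='modifiedon' operator='on-or-after' value='{0:yyyy-MM-dd}' />
                    </filter>
                  </entity>
                </fetch>", lastRunUtc);

            EntityCollection changed = _crm.RetrieveMultiple(new FetchExpression(fetchXml));

            foreach (Entity nationality in changed.Entities)
            {
                // Transform to the canonical shape and write to Table Storage / Cosmos DB here
            }
        }
    }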
The below example shows how we may use BizTalk to query some data from CRM and then we may send messages to table storage and Cosmos DB.
Another way we may solve this problem is using Data Factory in Azure. In Data Factory we can build a more traditional ETL-style interface which copies data from CRM using the OData feeds and loads it into the target data stores. The transformation and advanced features in Data Factory are a bit more limited, but in the right case this can be done as in the below picture.
These data synchronisation interfaces work best with data that doesn't change very often and where you don't need a real-time event to trigger the update. While I have mentioned Data Factory and BizTalk as the options we used, you could also use SSIS, custom code in a web job, or other options to implement it.
Summary
Hopefully the above gives you some ideas about how you can build a high-performing portal which integrates with CRM Online and potentially other applications. By using a slightly more complex architecture, which introduces asynchronous processing in places and CQRS in others, you can decouple the portal(s) you build from CRM and other back end systems. In this case it has allowed us to introduce a data layer in Azure which will scale and perform better than CRM, and which also gives us significant control rather than having a bottleneck on a black box outside of our control.
In addition to the performance benefits, it's also potentially possible for CRM to go completely offline without bringing down the portal, with only a minimal effect on functionality. The cloud data layer could still have problems, but firstly it is much simpler, and it is also using services which can easily be made geo-redundant, reducing your risks. As a practical example, if CRM was offline for a few hours while a deployment was performed, I would not expect the portal to be affected except for a delay in processing messages.
I hope this is useful for others and gives people a few ideas to think about when integrating with CRM.
by michaelstephensonuk | Mar 9, 2017 | BizTalk Community Blogs via Syndication
When we build integration solutions one of the biggest challenges we face is "sh!t in, sh!t out". Explained more eloquently, we often have line of business systems which contain some poor data, and we have to massage this and work around it in the integration solution so that the receivers of the data don't break when they get it. Sometimes the receiver doesn't break, but its functionality is impaired by the poor data.
Having faced this challenge recently, I'd summarise the problem in many organizations as follows:
- No one knows there is a data quality issue
- If it is known, then it's difficult to work out how bad it is or estimate its impact
- Often no one owns the problem
- If no one owns the problem then it's unlikely anyone is fixing it
Imagine we have a scenario where we have loaded all of the students from one of our line of business systems into our new CRM system, and then we are trying to load course data from another system into CRM and make it all match up. When we try to ask questions of the data in CRM we are not getting the answers we expect, and people lack confidence in the new solution. The thing is, the root cause of the problem is poor data quality in the underlying systems, but the end users don't have visibility of that, so they see the problem as being with the new system, since the old stuff has been around and kind of worked for years.
Dealing with the Issue
There are a number of ways you can tackle this problem, and we saw business steering groups discussing data quality and other such things, but nothing was as effective and cheap as the simple solution we put in place.
Imagine that we use BizTalk to extract the data from the source system and load it into Service Bus, from where we have various approaches to pub/sub the data into other systems. The main recipient of most of the data was Microsoft Dynamics CRM Online. Our idea was to implement some tests of the data as we attempted to load it into CRM. We implemented these in .NET, and the result of the tests was a decimal value representing a percentage score based on the number of tests passed, plus a string listing the names of the tests that failed.
We would then save this data alongside the record as part of the CRM entity so it was very visible. You can see an example of this below:
We implemented tests like the following:
- Is a field populated
- Does the text match a regular expression
- If we had a relationship to another entity can we find a match
For most of the records we would implement 10 to 20 tests of the data coming from other systems. We can then in CRM easily sort and manage records based on their data quality score.
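A rough sketch of how these tests might be implemented in .NET is shown below. The field names and the record shape are illustrative; the real implementation sat in the BizTalk-to-CRM load path and worked against the actual entity attributes:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text.RegularExpressions;

    public class DataQualityResult
    {
        public decimal Score { get; set; }          // percentage of tests passed
        public string FailedTests { get; set; }     // names of failed tests, stored on the CRM record
    }

    public static class StudentDataQuality
    {
        // Each test is a name plus a predicate over the inbound record (field names are illustrative)
        private static readonly Dictionary<string, Func<IDictionary<string, string>, bool>> Tests =
            new Dictionary<string, Func<IDictionary<string, string>, bool>>
            {
                { "Email populated",    r => !string.IsNullOrWhiteSpace(Get(r, "email")) },
                { "Email format",       r => Regex.IsMatch(Get(r, "email"), @"^[^@\s]+@[^@\s]+\.[^@\s]+$") },
                { "Postcode populated", r => !string.IsNullOrWhiteSpace(Get(r, "postcode")) },
                { "Course reference resolves", r => !string.IsNullOrWhiteSpace(Get(r, "courseid")) }
            };

        public static DataQualityResult Evaluate(IDictionary<string, string> record)
        {
            var failed = Tests.Where(t => !t.Value(record)).Select(t => t.Key).ToList();

            return new DataQualityResult
            {
                Score = Math.Round(100m * (Tests.Count - failed.Count) / Tests.Count, 1),
                FailedTests = string.Join("; ", failed)
            };
        }

        private static string Get(IDictionary<string, string> record, string key)
        {
            string value;
            return record.TryGetValue(key, out value) && value != null ? value : string.Empty;
        }
    }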
Making the results visible
At this point, from an operational perspective, we were able to see how good or bad the data coming into CRM was on a per-record basis. The next thing we needed to do was to get some focus on fixing the data. The best way to do this is to provide visualisations to the key stakeholders showing how good or bad the data is.
To do this we used a simple Power BI dashboard pointing at CRM which showed the average data quality score for each entity. This is shown in the below picture.
If I am able to say to the business stakeholders that we cannot reliably answer certain questions in CRM because the data coming into CRM has a quality score of 50%, then this is a powerful statement, backed up by specific tests which show what is good and what isn't. This is highly likely to create an interest among the stakeholders in improving the data quality so that it serves the purpose they require. The great thing is that each time they fix missing or partly complete data which has accrued in the LOB application over the years, and the data is reloaded, we should see the data quality score improve, which means you will get more out of your investment in the new applications.
Summary
The key thing here isn't really how we implemented this solution; we were lucky that adding a few fields to CRM is dead easy, and you could implement this in a number of different ways. What is important about this approach is the idea of testing the data during the loading process, recording the quality score and, most importantly, making it very visible so that everyone has the same view.
by Rene Brauwers | Jan 17, 2012 | BizTalk Community Blogs via Syndication
First things first, at this point in time I assume
- you’ve read the previous post
- downloaded and installed the CRM2011 SDK
- have a working CRM2011 environment at your disposal
- have an account for CRM2011 with sufficient rights (I'd recommend System Administrator)
- have Visual Studio 2010 installed
- have downloaded and extracted my Visual Studio example solution
So you’ve met all the requirements mentioned above? Good; let’s get started.
Note: all code should be used for Demo/Test purposes only! I did not intend it to be Production Grade. So, if you decide to use it, don't use it for Production Purposes!
Building your Custom Workflow Activity for CRM2011
Once you’ve downloaded and extracted my visual studio example solution, it is time to open it and fix some issues.
Ensure your references are correct
Go to the Crm2011Entities project, expand the References folder and remove the following two references
Once done, we are going to re-add these references; so right click on the References folder of the Crm2011Entities Project and click ‘Add Reference’
Now click on the ‘browse’ button, in the add reference dialog window
Now browse to your Windows CRM 2011 SDK bin folder (in my case: B:\Install\Microsoft CRM\SDK CRM 2011\bin) and select the following two assemblies:
- microsoft.xrm.sdk
- microsoft.xrm.sdk.workflow
Now repeat the above mentioned steps for the other project
Generate a strongly typed class of all your existing CRM entities.
Open up the "Crm2011Entities" project, and notice that it does not contain any files except a readme.txt file.
Q&A session
Me: Well let’s add a file to this project, shall we?
You: Hmmm, what file you ask?
Me: Well this project will hold a class file which contains all the definitions of your CRM 2011 Entities.
You: O no, do I need to create this myself?
Me: Well lucky you, there is no need for this.
You: So how do I create this file then?
Me: Well just follow the steps mentioned below
So let’s fix this, and open up a command prompt with administrator privileges.
Now navigate to your CRM 2011 SDK folder (in my case this would be: B:\Install\Microsoft CRM\SDK CRM 2011\bin)
Note: Before you proceed, ensure that you know the URL of the CRM2011 OrganizationService. Just test it by browsing to the address; if everything goes right you should see the following page:
Now type in the following command, replacing the values between <….> with your own values (see readme.txt):
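The command itself is the CrmSvcUtil.exe code generation tool that ships in the SDK bin folder. As a rough guide (the URL, credentials, output path and namespace below are placeholders, so check the readme.txt for the exact values), it will look something like this:

    CrmSvcUtil.exe /url:http://<crmserver>/<organization>/XRMServices/2011/Organization.svc ^
                   /out:c:\Temp\Crm2011Entities.cs ^
                   /username:<username> /password:<password> /domain:<domain> ^
                   /namespace:Crm2011Entities /serviceContextName:XrmServiceContext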
Once completed, you should be presented with the following output:
The actual file should be written to the location you set and in my case this is: c:\Temp
Once the actual class has been generated, open Visual Studio and right click on the CRM2011Entities project and select ‘Add Existing Item’
Browse to the directory in which the generated class was saved, and select the generated class.
At this point you should be able to compile the complete solution, so go ahead and do so.
Note: The source code includes comments which should be self-explanatory.
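For reference, the rough shape of such a custom workflow activity is sketched below. It uses the same input arguments the walkthrough configures later (Export to disk and EndPoint location); the body is simplified and only the export-to-disk path is shown, so treat it as an outline of the idea rather than the actual sample code:

    using System;
    using System.Activities;
    using System.IO;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;
    using Microsoft.Xrm.Sdk.Workflow;

    public class SendCrmEntityToEndPoint : CodeActivity
    {
        [Input("Export to disk")]
        public InArgument<bool> ExportToDisk { get; set; }

        [Input("EndPoint location")]
        public InArgument<string> EndPointLocation { get; set; }

        protected override void Execute(CodeActivityContext executionContext)
        {
            var workflowContext = executionContext.GetExtension<IWorkflowContext>();
            var serviceFactory  = executionContext.GetExtension<IOrganizationServiceFactory>();
            var service         = serviceFactory.CreateOrganizationService(workflowContext.UserId);

            // Retrieve the record the workflow fired for (the Account in this walkthrough)
            Entity entity = service.Retrieve(
                workflowContext.PrimaryEntityName,
                workflowContext.PrimaryEntityId,
                new ColumnSet(true));

            string serialized = SerializeEntity(entity);
            string target = EndPointLocation.Get(executionContext);

            if (ExportToDisk.Get(executionContext))
                File.WriteAllText(Path.Combine(target, entity.LogicalName + ".xml"), serialized);
            // else: POST the serialized entity to the BizTalk web service at 'target'
        }

        private static string SerializeEntity(Entity entity)
        {
            var serializer = new System.Runtime.Serialization.DataContractSerializer(typeof(Entity));
            using (var writer = new StringWriter())
            using (var xmlWriter = System.Xml.XmlWriter.Create(writer))
            {
                serializer.WriteObject(xmlWriter, entity);
                xmlWriter.Flush();
                return writer.ToString();
            }
        }
    }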
Making the custom workflow activity available in CRM2011.
So you’ve successfully compiled the solution, so what’s next? Well now it’s time to import this custom created activity in CRM2011.
In order to do this we will use this nifty application which comes with the CRM2011 SDK. This application is called 'pluginregistration' and can be found in the subdirectory tools/pluginregistration of the CRM2011 SDK (in my case the location is B:\Install\Microsoft CRM\SDK CRM 2011\tools\pluginregistration)
Note: As you will notice, only the source code of pluginregistration is available, so you need to compile it in order to use it.
In the pluginregistration folder, browse to the bin folder and either open the debug or release folder and double click the application PluginRegistration.exe
You will be presented with the following GUI:
Now click on “Create New Connection”
Fill out the connection information, consisting of:
- Label: Friendly name of connection
  - In my case I named it: CRM2011
- Discovery Url: Base URL of CRM
- User Name: Domain account with sufficient rights in CRM 2011
  - In my case I used: LAB\Administrator
Once everything is filled in, press Connect and wait until the discovery is finished. Once finished, double click on the organization name (in my case: Motion10 Lab Environment) and wait for the objects to be loaded.
Once the objects have been loaded; you should see a screen similar to the one depicted here below:
Now let’s add our ‘Custom Activity or plugin’. Do this by selecting the ‘Register’ tab and clicking on ‘Register new Assembly’
The ‘Register New Plugin’ screen will popup and click on the ‘Browse (…)’ button.
Now browse to the bin folder of the example project “SendCrmEntityToEndPoint“ (the one you compiled earlier) and select the SendCrmEntityToEndPoint.dll file and click on ‘Open’
Once done, select the option “None“ at step 3 and select the option “Database“ at step 4 and press the button ‘Register Selected Plugins’
Once done you should receive feedback that the plugin was successfully registered.
Creating a workflow in CRM2011 which uses the custom activity.
Now that we have registered our 'plugin', it is time to put it into action. In order to do so, we will log on to CRM2011 and create a custom workflow.
Once you’ve logged on to CRM2011, click on ‘Settings’
Now find the ‘Process Center’ section and click on ‘Processes’
In the main window, click on ‘New’
A dialog window will pop up; fill in the following details and once done press OK:
- Process Name: Logical name for this workflow
  - I named it: OnAccountProspectStatusExport
- Entity: Entity which could trigger this workflow
  - I used the Account Entity
- Category: Select Workflow
A new window will pop up, which is used to define the actual workflow. Use the following settings:
- Activate as: Process
- Scope: Organization
- Start When:
  - check Record is created
  - check Record fields change and select the field RelationshipType
- Now add the following step: Check Condition
  - Set the condition to be:
    - Select "Account"
    - Select field "RelationshipType"
    - Select "Equals"
    - Select "Prospect"
- Now add our custom activity as the next step: SendCrmEntityToEndPoint
  - Configure this activity like this:
    - Export to disk: True
    - EndPoint location: <Path where entity needs to be written>
      - In my case I used: c:\temp (note this will be written on the C drive on the CRM server!)
- Now once again add our custom activity as the next step: SendCrmEntityToEndPoint
  - Configure this activity like this:
    - Export to disk: False
    - EndPoint location: URL path to your BizTalk web service
      - In my case I used the endpoint which points to my generated BizTalk web service (which we will cover in the next blog post)
Well at this point your workflow should look similar to this:
Now click on the ‘Activate’ button
Confirm the ‘Activation’
Save and close the new workflow
Test if everything works
So now it is time to see if everything works. In order to do so we will create a new Account, and if everything goes OK we should see:
- An Account.xml file somewhere on disk
- A Routing Failure error in BizTalk (as we send a document which is not recognized by BizTalk)
In CRM2011 click on the ‘Work Place’ button
Subsequently click on ‘Accounts’
And finally add a new ‘Account’, by clicking on ‘NEW’
A new window will pop up; fill in some basic details
and don’t forget to set the Relationship type to ‘Prospect’
Once done click on the ‘Save & Close’ button
After a few minutes we can check both our output directory and the BizTalk Administrator, and we should notice that in the output directory a file has been written
and we should have a 'Routing Failure' error in BizTalk.
Closing Note
So this sums up the first part, in which we built our own workflow activity, imported it into CRM2011, constructed a workflow and, last but not least, saw that it worked.
Hope you enjoyed the read
Cheers
René
by Rene Brauwers | Jan 13, 2012 | BizTalk Community Blogs via Syndication
Well, it has been a while since my last post; however, as I stated in my first post, "I'll only try to blog whenever I have something which in my opinion adds value", and the topic I want to discuss today might just add that value.
Please note: this post will supply you with background information; the actual implementation of the solution will be covered in the next blog posts. However, the sample files mentioned in this post can already be downloaded.
Scenario sketch
Let’s say one of your customer’s are considering to replace their current CRM with Microsoft CRM2011.
Now one of the company’s business processes dictates that whenever a new customer or contact has been added to their CRM system, this data has to be send to their ERP system near-real-time. This customer or contact is then added into to ERP system and is assigned an unique account number. This account number then needs to be send back to the CRM system. As an end result the corresponding customer in CRM2011 is updated with the account number from the ERP system.
Their current CRM solution already takes care of this functionality however this has been implemented using a point-to-point solution and therefore replacing their current CRM with Microsoft CRM2011 would break this ‘integration-point’. The customer is aware that in the long-term it would be best to move away from these kind of point-to-point solutions and move more to a Service Oriented Architecture.
At the end of the day it is up to you to convince your customer that it is no problem at all with Microsoft CRM2011 to setup a solution which includes an integration with their ERP system and as you are aware of the fact that the customer wants to move to a Service Oriented Architecture, you see the opportunity fit to introduce the company to BizTalk Server 2010 as well.
So eventually you propose the following Proof of Concept scenario to your customer: ‘You will show to the customer that it is possible with almost no effort to build a solution which connects Microsoft CRM 2011 to their ERP system, whilst adhering to the general known Service Oriented Architecture principles’; once you tell your customer that this POC does not involve any costs for them except time and cooperation; they are more than happy and agree to it.
Preparing your dish
In order to complete the solution discussed in this blog post you will need the following ingredients:
A test environment consisting of:
- 1 Windows Server 2008R2 which acts as Domain Server (Active Directory)
- 1 Windows Server 2008R2 on which Microsoft CRM2011 is installed and configured
- 1 Windows Server 2008R2 on which Microsoft BizTalk Server 2010 is installed and configured.
- One Development Machine with Visual Studio 2010 installed
Step 1: How do I get data out of Microsoft CRM2011?
Well in order to get data (let me rephrase; an entity) out of CRM for our Integration scenario we will need to build a custom activity which can be added as a workflow step within CRM2011.
So which ingredients are required to do this?
- We need to download the CRM2011 SDK; so go and fetch it here
So what are we going to build?
- We will build a custom activity and deploy it to CRM2011 such that it can be used in a workflow, or download my example and install it
Step 2: How do I get data in my custom ERP?
Well for this step I’ve build a custom application which will act as a stub for our custom ERP. This custom ERP system will be exposed by means of a WCF service.
So which ingredients are required to do this?
So what are we going to build?
- Well you could build your own application, or download my example and install it.
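As mentioned above, here is a rough sketch of the kind of WCF contract the stub ERP might expose. The names and fields are illustrative; the downloadable example defines the real contract:

    using System.Runtime.Serialization;
    using System.ServiceModel;

    [ServiceContract]
    public interface IErpCustomerService
    {
        // Accepts a new customer and returns the account number generated by the ERP
        [OperationContract]
        string CreateCustomer(CustomerDto customer);
    }

    [DataContract]
    public class CustomerDto
    {
        [DataMember] public string CrmCustomerId { get; set; }
        [DataMember] public string Name { get; set; }
        [DataMember] public string Email { get; set; }
    }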
Step 3: How do I get data into CRM2011?
Well in order to get data into CRM; we will use the out of the box web services which are exposed by CRM2011.
So which ingredients are required to do this?
- Well if you have not yet downloaded the CRM2011 SDK; go and fetch it here
So what are we going to build?
- Well, in order to make our lives easier we will build a proxy web service which talks directly to CRM2011; this way our integration efforts will go more smoothly.
Step 4: How do I hook it all together?
Well, for this part we will use BizTalk. BizTalk will receive the 'Create Customer' event from CRM and subsequently apply logic so that this data is sent to the custom ERP application. Once the insert is successful, the ERP system sends back a customer account number, and we then update the corresponding entity in CRM2011 with the account number obtained from the ERP system.
So which ingredients are required to do this?
- Well if you have not yet downloaded the CRM2011 SDK; go and fetch it here 🙂
So what are we going to build?
- Well, we need to make a customization to our Account entity in CRM2011; to be more specific, we will add a custom field to the Account entity and call it Account Number.
- We will build a BizTalk solution which will hook all the bits together.
Closing Note
So this sums up the introduction. Be sure to check back soon for the follow-up post in which I'll discuss how to build our CRM trigger.