[Source: http://geekswithblogs.net/EltonStoneman]

The sample BizTalk application provided with the BizTalk Cache Adapter on CodePlex illustrates three approaches for using the cache:

  • Simple Messaging – routes all messages configured for caching to the cache adapter, and sends cache-misses on to the service provider;
  • Orchestration – explicitly routes messages to either the cache adapter or service provider in orchestration logic;
  • Complex Messaging – routes messages configured for caching either to the cache adapter or to the service provider.

All the samples use FILE locations for the originating request and ultimate response. The expected locations are created during the install, with sample request files in the receive locations. Enable the relevant receive location to run a sample, then copy the request file to the receive location again to re-run it – from the second run onwards, the responses should be the cached versions.


To use the samples, you’ll need to install and configure a cache provider, the cache adapter, cache adapter sample, sample service, and the cache viewer (if you want to see the contents of the cache).

Both the Cache Adapter and Sample BizTalk MSIs use PowerShell scripts for their post-processing steps. If you want to install from the MSIs, you’ll need to have PowerShell installed and set up to enable script execution (Set-ExecutionPolicy RemoteSigned is fine).

1. Cache Provider.

Download and install NCache Express (this is a free edition, you’ll have a product key emailed to you), and configure a cache instance for the samples by modifying the configuration file (config.ncconf – by default in C:\Program Files\NCache Express\config):


<cache-config name="Sixeyed.CacheAdapter.NCacheExpressProvider.Tests" inproc="false">
  <cleanup interval="1sec"/>
  <log trace-errors="true" trace-debug="false" enabled="true"/>
  <storage cache-size="5mb"/>
  <eviction-policy default-priority="normal" eviction-ratio="10%" eviction-enabled="true"/>
  <perf-counters enabled="false"/>
</cache-config>

This creates a single-node cache on the local machine, limited to 5Mb of system memory. Start the cache by running the startcache tool (C:\Program Files\NCache Express\bin\tools):

startcache Sixeyed.CacheAdapter.NCacheExpressProvider.Tests

– and verify it’s there with listcaches. You should see the cache listed as Running:

2. Cache Adapter.

Sixeyed.CacheAdapter.msi is the BizTalk MSI which installs the adapter and artifacts. Import the application into BizTalk and run the installer. Verify the Sixeyed.CacheAdapter application is listed, and add the adapter in the Administration Console through Platform Settings…Adapters…New.

Note: the installer adds the necessary adapter files to the GAC and sets up the registry keys. If you don’t use the MSI, the manual installation steps are: add the files from the Binaries release to a local directory and to the GAC; modify the .REG file to specify the assembly paths and import it; add the BizTalk artifact resources to your own application.

3. Cache Adapter Sample.

Sixeyed.CacheAdapterSample.msi is the BizTalk MSI which installs the sample application and artifacts, and configures the ports. Import the application into BizTalk and run the installer. Verify the Sixeyed.CacheAdapterSample application is listed, and that the message types are configured in SSO (check the contents of SSO application Sixeyed.CacheAdapter).

Note: the installer creates FILE send and receive locations, copies sample messages, and sets up SSO configuration for the cached message types. If you don’t use the MSI, manual installation should follow the steps in: Source\Sixeyed.CacheAdapterSample\Deployment\Sixeyed.CacheAdapterSample.Install.ps1.

4. Sample Service.

Sixeyed.CacheAdapterSample.Service.msi installs a WCF service for the samples to call. Verify the service has deployed correctly by navigating to http://localhost/Sixeyed.CacheAdapterSample/SampleService.svc?wsdl.

5. Cache Viewer.

Sixeyed.CacheAdapter.CacheViewer.msi installs a Winforms app for viewing cache contents. Run the application and configure it to use the running cache instance, by navigating to the NCacheExpressProvider assembly and selecting the NCacheExpressProvider:

– and specifying the ID of the cache instance:

(This is the same management UI used for configuring cache adapter ports in the BizTalk Administration Console. For the samples the Cache Id is Sixeyed.CacheAdapter.NCacheExpressProvider.Tests).

Simple Messaging

Enable the receive location ReceiveRequest_SimpleMessaging.FILE and you should see the following workflow:

  1. Message is picked up by the receive location, which is using the CacheXMLReceive pipeline. This checks the cache configuration in SSO, where the message type is configured for caching, so context property IsMessageCacheEnabled is set to true;
  2. Send port SimpleMessaging_GetCache picks up the message and sends it to the cache adapter, using the request message to build a cache key, and checking for a response existing in the cache. There is no cached response, so the message returned from the cache adapter is the original request message, with context property IsCached set to false;
  3. Send port SimpleMessaging_SendServiceRequest picks up the message and sends it to the service;
  4. Send port SimpleMessaging_PutCache picks up the service response and adds it to the cache, using the cache key from the original request;
  5. Send port SimpleMessaging_SendServiceResponse picks up the service response and copies it to the output directory.
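The flow above can be sketched in a few lines of Python – a hypothetical illustration only, with a dict standing in for the cache instance and call_service for the sample WCF service:

```python
import hashlib

cache = {}

def call_service(request: str) -> str:
    # Stand-in for the SampleService call
    return "<Response for='%s'/>" % request

def handle(request: str) -> str:
    key = hashlib.md5(request.encode()).hexdigest()  # build a cache key from the request
    cached = cache.get(key)
    if cached is not None:                           # cache hit: IsCached = true
        return cached
    response = call_service(request)                 # cache miss: pass through to the service
    cache[key] = response                            # PutCache using the original request's key
    return response

first = handle("<GetStockRequest/>")   # first run: calls the service, caches the response
second = handle("<GetStockRequest/>")  # re-run: served from the cache
```

The second call never reaches call_service, which mirrors the behaviour of re-dropping the same request file from the second run onwards.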

Check the cache viewer and you will see a byte array cached with a GUID key:

The GUID is the hash of the request message, and the byte array is the response message. Subsequent drops of the same request into the Requests folder will find the cached response at step 2, and the response will be the cached message.
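One plausible way to produce a GUID-style key is to format a 128-bit hash of the request body as a GUID. The adapter's actual hash algorithm isn't stated here, so the use of MD5 below is an assumption for illustration:

```python
import hashlib
import uuid

def cache_key(request_body: bytes) -> str:
    # A 128-bit MD5 digest maps directly onto the 16 bytes of a GUID
    digest = hashlib.md5(request_body).digest()
    return str(uuid.UUID(bytes=digest))

# The same request body always yields the same key, so a repeated
# request finds the response already cached under it
key = cache_key(b"<GetStockRequest/>")
```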

This workflow is suitable for any simple messaging solution, where static ports are used and where the original request and ultimate response can be actioned by different ports. Configuration of the cache instance is contained in the send ports.


Orchestration

The orchestration sample follows the same workflow as the Simple Messaging sample, and uses the same CacheXMLReceive pipeline to add the caching properties to the incoming message. The orchestration makes the cache access with explicit send and receive shapes, and configuration of the cache instance is still contained in the send ports.

Enable the receive location ReceiveRequest_Orchestration.FILE to run the sample. This is suitable for solutions where finer grained control is needed over the caching mechanism, and an orchestration is a viable option.

Complex Messaging

Both of the simpler implementations use the cache adapter as a passthrough when there is no matching response in the cache – the incoming message is returned by the cache adapter, and a send port picks it up and routes it on to the service. This is not suitable for more complex solutions:

  • where the service is invoked through a dynamic port, as the outbound transport and location set on the original message are not reset by the cache adapter on the passed-through message;
  • where the originating receive is from a request-response port, as the interchange will accept the passed-through message as a response, and not return the actual service request.

For these scenarios, the cache checking can be moved forward into the originating receive, so the cache adapter is only called if the response is known to exist in the cache. The added complexity here is that the receive pipeline needs the cache instance configuration, as well as the send ports. This is shown in the Complex Messaging sample – run by enabling the receive location ReceiveRequest_ComplexMessaging.FILE:

  1. Message is picked up by the receive location, which is using a custom pipeline – CheckCacheReceive. This consists of the SetMessageCacheProperties component, which checks SSO to see if the message type is configured for caching, followed by the CheckIsCached component, which checks whether the message response is actually in the cache, and sets the IsCached property:

    Note that the cache instance is configured in the pipeline. Using a custom pipeline means you can configure the instance using the Cache Adapter UI in Visual Studio, which is easier and applies validation. While you could use a generic pipeline and modify the configuration per-instance on receive locations, that means manually hacking the XML;

    1. If the message response is cached, send port ComplexMessaging_GetCache picks up the message and sends it to the cache adapter. The adapter is configured in the same way as in the other samples, and uses the same logic – but in this case the response is known to be cached, so it will be returned and the request message should never be passed through;
    2. If the message response is not cached, send port ComplexMessaging_SendServiceRequest calls the service provider passing the original request message;
    3. Send port ComplexMessaging_PutCache picks up the service response and adds it to the cache, using the cache key from the original request;
    4. Send port ComplexMessaging_SendServiceResponse picks up the service response and copies it to the output directory.
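The routing decision the pipeline drives can be sketched as follows (hypothetical Python; the dict stands in for the cache instance the pipeline is configured against):

```python
def route(cache_key: str, cache: dict) -> str:
    # CheckIsCached: is the response already in the cache?
    is_cached = cache_key in cache
    if is_cached:
        # Response known to exist: safe to route to the cache adapter
        return "ComplexMessaging_GetCache"
    # No cached response: call the service provider directly
    return "ComplexMessaging_SendServiceRequest"
```

Moving the check ahead of routing is what keeps the cache adapter out of the dynamic-port and request-response scenarios described above.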

There is a risk in this scenario that the object will exist in the cache at the point of checking, but have been removed by the time the cache is actually read. In this case the response from the GetCache send port will be a passthrough of the original request – in the sample, this means the originating request will be copied to the output directory. Suspending the message may be a better option, but depending on the solution you may be able to handle this differently.
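A defensive read can cover that check/read race – a hedged sketch, assuming the fallback action is supplied by the caller:

```python
def read_cached_response(key, cache, on_miss):
    # The entry may have been evicted since the IsCached check passed,
    # so never trust the earlier check at read time
    value = cache.get(key)
    if value is None:
        # e.g. suspend the message, or re-route to the service provider
        return on_miss(key)
    return value
```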