WCF netTcpRelayBinding Streaming Gotcha


If you are exploring the Windows Azure AppFabric Service Bus and trying the bits out in your projects, this article may shed some light on what to watch out for when using the netTcpRelayBinding in streaming mode. By following the recommendations in this article you can maximize the performance of your TCP streaming solutions.

Scenario

I’m currently leading the development efforts on a customer project where we stream large amounts of structured (XML) data from an on-premise BizTalk Server 2010 environment all the way to a cloud-based inventory database hosted on SQL Azure. Simplified, the message flow can be described as follows:

  1. Inventory files are being received from many EDI partners and transformed into a canonical inventory schema representation using BizTalk Server’s support for EDI interoperability and data mapping/transformation;
  2. The canonical inventory schema instances are being picked up by a designated WCF-Custom Send Port configured with netTcpRelayBinding that talks to the Azure Service Bus;
  3. The inventory data is relayed in streaming mode through the Service Bus to a WCF service endpoint hosted in a Windows Azure worker role;
  4. The WCF service receives the data stream and relays it further into a queue backed by a SQL Azure database, where the data becomes available for processing.

Below is a depiction of the message flow that we implemented in the initial stage of the project:

[Figure: message flow from EDI partners through BizTalk Server and the Service Bus to SQL Azure]

The Windows Azure AppFabric Service Bus makes the above scenario shine, as it makes it easy to connect an existing on-premise BizTalk infrastructure to cloud-based service endpoints. That said, we made several observations related to streaming data over TCP.

Observations

As referenced above, the cloud-hosted WCF service exposes a streaming-aware operation that takes the inbound data stream and makes sure that it safely lands in a SQL Azure database. Specifically, we read the data from the inbound stream into a memory buffer in chunks, and then flush the buffer’s content into a varchar(max) field using the .WRITE mutator supported by the UPDATE statement.
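As a sketch of that chunked-write technique, each buffer flush can be performed with an UPDATE statement like the one below (the table and column names here are illustrative, not from the actual project; passing NULL as the offset argument of .WRITE appends the chunk to the end of the column value):

```sql
-- Append @chunk to the end of the Payload varchar(max) column.
-- A NULL offset tells .WRITE to append rather than overwrite.
UPDATE dbo.InventoryQueue
SET Payload.WRITE(@chunk, NULL, 0)
WHERE QueueItemId = @id;
```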

The code snippet implementing the above technique is shown below:

#region IPersistenceServiceContract implementation
public Guid PersistDataStream(Stream data)
{
    // Some unrelated content was omitted here and the code below was intentionally simplified for sake of example.

    // For best performance, we recommend that data be inserted or updated
    // in chunk sizes that are multiples of 8040 bytes.
    int bufferSize = 8040 * 10;

    using (ReliableSqlConnection dbConnection = new ReliableSqlConnection(dbConnectionString))
    using (SqlStream sqlStream = new SqlStream(dbConnection, readDataCommand, writeDataCommand, getDataSizeCommand))
    {
        BinaryReader dataReader = new BinaryReader(data);
        byte[] buffer = new byte[bufferSize];
        int bytesRead = 0;

        do
        {
            bytesRead = dataReader.Read(buffer, 0, bufferSize);

            if (bytesRead > 0)
            {
                TraceManager.CustomComponent.TraceInfo("About to write {0} bytes into SQL Stream", bytesRead);
                sqlStream.Write(buffer, 0, bytesRead);
            }
        }
        while (bytesRead > 0);
    }

    return Guid.NewGuid();
}
#endregion

While this code does in fact transfer the data, we were surprised to find that it does not do so in 80400-byte chunks! Even though both the client and server WCF bindings were configured correctly and identically where appropriate, including such important configuration parameters as reader quotas and max buffer size, the specified buffer size was not honored by the underlying WCF stream. The chunk size returned from the Read method never even approached the anticipated buffer size of 80400 bytes. The following trace log fragment supports these observations (note the instrumentation event in the above code that we emit before writing data into the SQL Azure database):

[Trace log screenshot: “About to write N bytes into SQL Stream” events with small, irregular values of N]

There is an explanation for the behavior in question.

First of all, some fluctuation in the read chunk size surfaced by the transport layer is expected on any TCP socket connection. With TCP streaming, data is made available as soon as it comes off the wire. TCP sockets generally don’t attempt to fill the supplied buffer completely; they do their best to supply as much data as they can, as quickly as they can.

Secondly, when we set the buffer size to 80400 bytes, we unintentionally asked the TCP stack to buffer roughly 53 times a typical Ethernet Maximum Transmission Unit (MTU) of about 1500 bytes, while potentially also exceeding the maximum TCP receive window size. This is an unrealistic request.

So why are these small, seemingly random incremental chunks a concern for a developer? In our example, we write data into a SQL Azure database and want this operation to be as efficient as possible. Writing 2, 6, 255 or even 4089 bytes per call doesn’t achieve the desired degree of efficiency. Fortunately, this challenge can be addressed with the following simple approach.

Solution

Simply put, we need to make sure that data is continuously read from the inbound stream into a buffer until the buffer is full. This means we will not stop after the first invocation of the Read method; we will repeatedly ask the stream for data until we are satisfied that we have received a sufficient amount. The easiest way to implement this is with a C# extension method:

public static class BinaryReaderExtensions
{
    public static int ReadBuffered(this BinaryReader reader, byte[] buffer, int index, int count)
    {
        int offset = 0;

        do
        {
            int bytesRead = reader.Read(buffer, index + offset, count);

            if (bytesRead == 0)
            {
                break;
            }

            offset += bytesRead;
            count -= bytesRead;
        }
        while (count > 0);

        return offset;
    }
}

Now we can change the method name from Read to ReadBuffered in the consumer code, leaving the rest unchanged:

do
{
    // Note the name changed from Read to ReadBuffered as we are now using the extension method.
    bytesRead = dataReader.ReadBuffered(buffer, 0, bufferSize);

    if (bytesRead > 0)
    {
        TraceManager.CustomComponent.TraceInfo("About to write {0} bytes into SQL Stream", bytesRead);
        sqlStream.Write(buffer, 0, bytesRead);
    }
}
while (bytesRead > 0);

The end result is that we can now guarantee that each time we invoke a SQL command to write data into a varchar(max) field, we deal with completely full buffers and data chunks the size of which we can reliably control:

[Trace log screenshot: “About to write 80400 bytes into SQL Stream” events with consistent chunk sizes]

As an extra benefit, we reduced the number of database transactions, since we now stream larger chunks of data instead of invoking the SQL command for many smaller chunks as before.

Conclusion

Streaming is a powerful, high-performance technique for transmitting large volumes of data. End-to-end streaming between on-premise applications and the cloud unlocks extremely interesting scenarios that were previously impractical.

In this article, we shared some observations from a recent Azure customer engagement and provided recommendations on how to avoid a specific “gotcha” with WCF streaming over netTcpRelayBinding in the Windows Azure AppFabric Service Bus. When implemented, these recommendations may help developers increase the efficiency of application code that consumes WCF streams.

Additional Resources/References

For more information on the related topic, please visit the following resources:

How to make a WorkflowService implement a contract


You may have noticed that WorkflowServices have two ways of operating.  One way is to pass message content and the other way is to use message parameters.  I have always used message content because it seemed like the easiest thing to do.

Download the Sample Code on MSDN Code Gallery


Today I wanted to write some test code that would new up a client proxy with a ChannelFactory without having to generate a service reference.  I was just writing test code, so I figured I would create an interface that mirrored the workflow service and create the proxy from it.

Given this workflow service (the default template)

[Screenshot: the default WorkflowService template in the workflow designer]

I created this interface

using System.ServiceModel;

namespace Contract
{
    [ServiceContract]
    public interface IService1
    {
        [OperationContract]
        string GetData(int data);
    }
}

It didn’t work.  I got this error

System.ServiceModel.ActionNotSupportedException was unhandled

  Message=The message with Action ‘http://tempuri.org/IService1/GetData’ cannot be processed at the receiver, due to a ContractFilter mismatch at the EndpointDispatcher. This may be because of either a contract mismatch (mismatched Actions between sender and receiver) or a binding/security mismatch between the sender and the receiver.  Check that sender and receiver have the same contract and the same binding (including security requirements, e.g. Message, Transport, None).

What to Do

When I face a problem like this I immediately stop trying to fix it in the big project and create a very simple spike project where I can explore what is going on.  I decided to create a project with a WorkflowService and a WCF Service then build a console app that could call both of them.  This allowed me to compare the WCF Service to the Workflow Service and here is what I found

Workflow Service Request

<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Header>
    <Action s:mustUnderstand="1" xmlns="http://schemas.microsoft.com/ws/2005/05/addressing/none">http://tempuri.org/IService1/GetData</Action>
  </s:Header>
  <s:Body>
    <int xmlns="http://schemas.microsoft.com/2003/10/Serialization/">1</int>
  </s:Body>
</s:Envelope>

Workflow Service Response

<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Header />
  <s:Body>
    <string xmlns="http://schemas.microsoft.com/2003/10/Serialization/">1</string>
  </s:Body>
</s:Envelope>

WCF Service Request

<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Header>
    <Action s:mustUnderstand="1" xmlns="http://schemas.microsoft.com/ws/2005/05/addressing/none">http://tempuri.org/IService1/GetData</Action>
  </s:Header>
  <s:Body>
    <GetData xmlns="http://tempuri.org/">
      <data>1</data>
    </GetData>
  </s:Body>
</s:Envelope>

WCF Service Response

<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Header />
  <s:Body>
    <GetDataResponse xmlns="http://tempuri.org/">
      <GetDataResult>1</GetDataResult>
    </GetDataResponse>
  </s:Body>
</s:Envelope>

As you can see, the WCF service wraps the parameters inside a request/response message, whereas the Workflow Service (using message content) simply serializes the data into the message body.

The WCF client proxy (created by the channel factory from the interface) does not understand the message content as sent by the Workflow Service.  Of course, you can always add a service reference and generate the correct client proxy.
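For reference, the channel-factory-based test proxy described earlier looks something like this (the binding and endpoint address below are illustrative, not from the original project):

```csharp
using System;
using System.ServiceModel;
using Contract; // the IService1 interface shown earlier

class Program
{
    static void Main()
    {
        // Hypothetical address - use wherever your .xamlx service is hosted.
        var factory = new ChannelFactory<IService1>(
            new BasicHttpBinding(),
            new EndpointAddress("http://localhost:1234/Service1.xamlx"));

        IService1 proxy = factory.CreateChannel();

        // This call fails with ActionNotSupportedException until the
        // workflow service is switched to Message Parameters.
        Console.WriteLine(proxy.GetData(1));

        ((IClientChannel)proxy).Close();
        factory.Close();
    }
}
```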

The Solution

The fix is to change the way the messaging activities deal with the content.  If you want to create a Workflow Service whose content looks like it would for a WCF service, you need to use Message Parameters.

[Screenshot: Receive activity content definition set to Message Parameters]

And for the response you need to name the return value (MethodName)Result.

[Screenshot: SendResponse content with the return parameter named GetDataResult]

Now when I run it, everything works as it should, and the Workflow Service request/response content looks exactly like the WCF Service content.

Troubleshooting

If you are still having trouble, don’t forget to set the ServiceContractName (and namespace if necessary) on each of the receive activities.

VS 2010 Web Deployment


This is the twenty-fifth in a series of blog posts I’m doing on the VS 2010 and .NET 4 release.

Today’s blog post is the first of several posts I’ll be doing that cover some of the improvements we’ve made around web deployment.  I’ll provide a high-level overview of some of the key improvements.  Subsequent posts will then go into more details about each feature and how best to take advantage of them.

Making Web Deployment Easier

Deploying your web application to a server is something that all (successful) projects need to do.  Without good tools to help you, deployment can be a cumbersome task – especially if you need to do it manually.

VS 2010 includes a bunch of improvements that make it much easier to deploy your ASP.NET web applications – and which enable you to build automated deployment procedures that make deployment easily reproducible.  The deployment features support not just deploying your web content – but also support customizing your web.config file settings, deploying/updating your databases, and managing your other dependencies.  You can kick-off deployments manually – or via automated scripts or as part of an automated build or continuous integration process.

Below is a high-level overview of some of the key new web deployment features in VS 2010.  I’ll do subsequent posts that provide more details on how to use/customize each of them.

New “Publish Web” Dialog

Visual Studio 2010 includes a new “Publish Web” dialog that you can use to quickly deploy a web application to a remote server.

You can activate the dialog by right-clicking on an ASP.NET Web Project node within the solution explorer, and then select the “Publish” context menu item:

[Screenshot: “Publish” context menu item on the web project node in Solution Explorer]

Selecting this will bring up a “Publish Web” dialog which allows you to configure publish location settings. 

Configuring and Saving a Publish Profile

You only need to define your publish settings once – you can then save them as a named “Publish Profile” to enable you to quickly re-use them again later.

[Screenshot: “Publish Web” dialog with a saved publish profile]

Above I’ve created a “ScottGu Site” profile, and configured it to deploy via FTPS (a version of FTP that uses SSL) to a remote server.  To deploy over FTPS select the “FTP” node in the drop-down, and then prefix the server location you want to publish to with the “ftps://” prefix. 

Note that you can either re-enter your password each time you deploy – or save the password for future use in a secure location (just click the “Save Password” checkbox to do this).

Web Deploy

In addition to supporting FTP/FTPS, VS 2010 also supports a more powerful publish mechanism called “Web Deploy”.  Web Deploy (earlier known as MSDeploy) provides a much more comprehensive publishing and deployment mechanism than FTP. It not only allows you to publish files, but also allows you to publish IIS Web Server Settings, Database Schema/Data, Database Change Scripts, Security ACLs, and much more.

Web Deploy can be used to deploy applications both to a single server, as well as to multiple servers within a web farm.  Web Deploy is also now supported by many inexpensive Windows hosting providers (some as cheap as $3.50/month for an ASP.NET + SQL account).  You can find great ASP.NET hosters that support Web Deploy by visiting this page: http://asp.net/find-a-hoster.

One Click Publish Toolbar

Clicking the “Publish” button within the “Publish Web” dialog will publish a web application (and optionally associated database schema/content) to a remote web server. 

VS 2010 also supports a “one click” publish toolbar that you can add to your IDE to quickly publish/re-publish your project without having to load the  “Publish Web” dialog:

[Screenshot: one-click publish toolbar with the publish profile drop-down]

Just select your publish profile from the toolbar drop-down and then click the publish icon to the right of it to begin deploying your application. 

Web.Config Transformations

In most real-world deployment scenarios, the web.config file you use for development is different than the one you use for production deployment.  Typically you want to change environment settings like database connection-strings, making sure debug is turned off, and enabling custom errors so that end-users (and hackers) don’t see the internals of your application.

VS 2010 now makes it easy to customize/tweak/modify your web.config files as part of your publish/deployment process.  Specifically, you can now easily have build-configuration specific transformation files that can customize your web.config file prior to the application being deployed:

[Screenshot: build-configuration-specific web.config transform files in Solution Explorer]

You can maintain a separate transform file per Visual Studio build-environment.  For example, you could configure your project/solution to have a “Debug”, “Staging” and “Release” build configuration – in which case VS will maintain three separate transform files for you.  VS will automatically apply the appropriate one at deployment time depending on what your VS environment is set to.
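As a minimal sketch of what such a transform file contains (the connection-string name and server value below are illustrative), a Web.Release.config might swap in the production connection string and strip the debug attribute like this:

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the development connection string at deploy time -->
    <add name="ApplicationServices"
         connectionString="Data Source=prod-server;Initial Catalog=MyDB;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Ensure debug is turned off in the deployed web.config -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```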

I will dive deeper into how to perform web.config file transformations in a future blog post.

Database Deployment

VS 2010 allows you to optionally deploy a database, along with your web application files, when you are using the “Web Deploy” option as your deployment mechanism. Databases deployed this way can include both schema and data, and can optionally also include change scripts to update existing databases.

ASP.NET Web Projects in VS 2010 have a special page within their “project properties” settings to configure database deployments:

[Screenshot: database deployment settings page in the web project’s properties]

I will dive deeper into how to perform database deployments in future blog posts.

Web Deployment Packages

VS 2010 also supports a packaging option that enables you to package up your ASP.NET Web Application (together with its dependencies like web.config, databases, ACLs, etc) into a .zip based deployment package file that you can optionally hand-off to an IT administrator who can then easily install it either via the IIS Admin Tool or via a command-line/powershell script. 

The deployment package you create can optionally expose application configuration settings that can be overridden (like directory locations, database connection-strings, etc.).  When using the IIS7 Admin Tool, the install wizard can prompt the administrator for each setting to be customized – enabling you to provide a clean customization experience without having to write any custom code to do so.  The settings can also be passed as arguments on the command-line when using a command-line or PowerShell script to deploy the application.

To create a web package within Visual Studio 2010, just right click on your ASP.NET Web Project node in the solution explorer and select the “Build Deployment Package” menu item:

[Screenshot: “Build Deployment Package” context menu item on the web project node]

This will compile your application, perform appropriate web.config transforms on it, optionally create .sql scripts for your database schema and data files, and then package them all up into a .zip deployment package file.  Adjacent to the .zip file you’ll find a deployment script file that you can use to automate deployment of the package to a remote server.

I will dive deeper into how to create web deployment packages in future blog posts.

Continuous Integration with Team Build

Most of the VS 2010 web deployment features that I described above are built on top of MSBuild tasks & targets. The “Team Build” feature of TFS also uses MSBuild, and supports running nightly builds, rolling builds, and enabling continuous integration. This means that you can create deployment packages, or automatically publish your web applications from a Team Build environment.
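Because the features are MSBuild-based, a build server can produce the same deployment package from the command line; a sketch of such an invocation (the project name is illustrative) looks like this:

```
msbuild MyWebApp.csproj /t:Package /p:Configuration=Release
```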

I will dive deeper into how to enable this in future blog posts.

Summary

Today’s blog post covered some of the new VS 2010 web deployment features at a high-level.  All of the above features work with both VS 2010 as well as the free Visual Web Developer 2010 Express Edition.

Hopefully today’s post provided a broad outline of all the new deployment capabilities, and helped set context as to how they are useful. In future posts I’ll go deeper and walkthrough the specifics of how to really take full advantage of them.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Testing a WorkflowService .xamlx file with Visual Studio


One thing I have not tackled until today was how to test a Workflow Service in a .xamlx file.  When I started out this morning working on it I realized that there isn’t a great deal of information about how to do this available anywhere.

Update 7/30/2010: Download the WF4 Workflow Test Helper library for sample code

Here is what I came up with.  Look at this test and tell me what you think.

using System.Activities;
using System.Activities.Tracking;
using System.ServiceModel;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using WorkflowTestHelper.Tracking;

namespace WorkflowTestHelper.Tests
{
    /// <summary>
    /// This test demonstrates the WorkflowServiceTestHost class
    /// </summary>
    [TestClass]
    public class WorkflowServiceTestHostTest
    {
        /// <summary>
        /// The endpoint address to be used by the test host
        /// </summary>
        private readonly EndpointAddress _serviceAddress = new EndpointAddress("net.pipe://localhost/TestService");

        /// <summary>
        /// Verifies that the WorkflowServiceTestHost hosts a service and that the service receives and sends a reply
        /// </summary>
        /// <remarks>
        /// Be sure to enable deployment - the xamlx file must be deployed
        /// </remarks>
        [TestMethod]
        [DeploymentItem(@"WorkflowTestHelper.Tests.Activities\TestService.xamlx")]
        public void ShouldHostService()
        {
            var trackingProfile =
                new TrackingProfile
                {
                    Queries =
                    {
                        new ActivityStateQuery
                        {
                            ActivityName = "ReceiveRequest",
                            States = { "Executing" },
                        },
                        new ActivityStateQuery
                        {
                            ActivityName = "SendResponse",
                            States = { "Executing" },
                        },
                    }
                };

            using (var host = WorkflowServiceTestHost.Open("TestService.xamlx", _serviceAddress.Uri, trackingProfile))
            {
                var client = new ServiceClient(new NetNamedPipeBinding(), _serviceAddress);
                var response = client.GetData(1);
                Assert.AreEqual("1", response);

                host.Tracking.Trace();

                // Find the tracking records for the ReceiveRequest and SendResponse

                // Activity <ReceiveRequest> state is Executing
                AssertTracking.ExistsAt(host.Tracking.Records, 0, "ReceiveRequest", ActivityInstanceState.Executing);

                // Activity <SendResponse> state is Executing
                AssertTracking.ExistsAt(host.Tracking.Records, 1, "SendResponse", ActivityInstanceState.Executing);
            }
        }
    }
}

-Updated on 7/30/2010- Windows Azure AppFabric LABS August Release, Breaking Changes Announcement and Scheduled Maintenance


The Windows Azure AppFabric LABS August release is scheduled for Thursday, August 5, 2010.  Users will have NO access to the AppFabric LABS portal and services during the scheduled maintenance downtime.

When:

    START: August 5, 2010, 10am PST
    END: August 5, 2010, 6pm PST

Impact Alert:

    LABS AppFabric Service Bus, Access Control and portal will be unavailable during this period. Additional impacts are described below.

Action Required:

Existing accounts and Service Namespaces will be available after the services are deployed.

However, ACS Rules, Issuers, Scopes, and Token Policies will NOT be persisted and restored after the maintenance. Users will need to back up their data if they wish to restore it after the Windows Azure AppFabric LABS August Release.

Also note that the Service Bus Multicast with Message Buffers features, which have been available in LABS since March 2010, will be temporarily removed from the LABS environment. The team is working on alternative approaches to these features.

Thanks for working in LABS and giving us valuable feedback.  Once the update becomes available, we’ll post the details via this blog.

Stay tuned for the upcoming LABS release!

The Windows Azure AppFabric Team

Visual Studio 2010 Keyboard Shortcuts


Earlier this week the Visual Studio team released updated VS 2010 Keyboard Shortcut Posters.  These posters are print-ready documents (that now support standard paper sizes), and provide nice “cheat sheet” tables that can help you quickly lookup (and eventually memorize) common keystroke commands within Visual Studio.

[Image: VS 2010 keyboard shortcut poster]

This week’s updated posters incorporate a number of improvements:

  • Letter-sized (8.5”x11”) print ready versions are now available
  • A4-sized (210x297mm) print ready versions are now available
  • The goofy people pictures on them are gone (thank goodness)

The posters are in PDF format – enabling you to easily download and print them using whichever paper size is in your printer.

Download the Posters

You can download the VS 2010 Keybinding posters in PDF format here.

Posters are available for each language.  Simply look for the download that corresponds to your language preference (note: CSharp = C#, VB = Visual Basic, FSharp = F#, CPP = C++).

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu