by community-syndication | Jan 13, 2011 | BizTalk Community Blogs via Syndication
In a recent client project, I found a need to call an EDI pipeline from my orchestration.
Environment:
BizTalk Server 2009
Use Case:
Call an external synchronous web service that expects a string type with the contents of an EDI document. The orchestration receives the response and takes action based on the response code type. In my case, I passed a 270 and […]
by community-syndication | Jan 13, 2011 | BizTalk Community Blogs via Syndication
Sign up for classroom training before January 31st and receive 15% off your registration.
Kick off the new year with training from QuickLearn. Update your skills, improve at your job, and make getting a raise your New Year’s resolution!
All classes taught at our training center in Redmond, WA (or through remote training from your home or office) are eligible. Register for classroom training using the promotion code: NY2011
Search the course calendar to find and register for your course.
*Sorry this offer cannot be applied to prior registrations or combined with any other offers. Discounts are not valid at partner locations.
by community-syndication | Jan 13, 2011 | BizTalk Community Blogs via Syndication
I’m excited to announce the release today of several products:
- ASP.NET MVC 3
- NuGet
- IIS Express 7.5
- SQL Server Compact Edition 4
- Web Deploy and Web Farm Framework 2.0
- Orchard 1.0
- WebMatrix 1.0
The above products are all free. They build upon the .NET 4 and VS 2010 release, and add a ton of additional value to ASP.NET (both Web Forms and MVC) and the Microsoft Web Server stack.
ASP.NET MVC 3
Today we are shipping the final release of ASP.NET MVC 3. You can download and install ASP.NET MVC 3 here. The ASP.NET MVC 3 source code (released under an OSI-compliant open source license) can also optionally be downloaded here.
ASP.NET MVC 3 is a significant update that brings with it a bunch of great features. Some of the improvements include:
Razor
ASP.NET MVC 3 ships with a new view-engine option called “Razor” (in addition to continuing to support/enhance the existing .aspx view engine). Razor minimizes the number of characters and keystrokes required when writing a view template, and enables a fast, fluid coding workflow.
Unlike most template syntaxes, with Razor you do not need to interrupt your coding to explicitly denote the start and end of server blocks within your HTML. The Razor parser is smart enough to infer this from your code. This enables a compact and expressive syntax which is clean, fast and fun to type.
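For a flavor of that syntax, here is a hypothetical Razor fragment (the model and property names are invented for illustration). The parser infers where each code block ends, so no explicit closing delimiter is needed around the foreach:

```cshtml
@* A hypothetical .cshtml view fragment *@
<ul>
    @foreach (var product in Model.Products)
    {
        <li>@product.Name: @product.Price</li>
    }
</ul>
<p>Rendered at @DateTime.Now</p>
```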
You can learn more about Razor from the blog posts I’ve written about it over the last six months.
Today’s release includes full code IntelliSense support for Razor (both VB and C#) with Visual Studio 2010 and the free Visual Web Developer 2010 Express.
JavaScript Improvements
ASP.NET MVC 3 enables richer JavaScript scenarios and takes advantage of emerging HTML5 capabilities.
The AJAX and Validation helpers in ASP.NET MVC 3 now use an Unobtrusive JavaScript based approach. Unobtrusive JavaScript avoids injecting inline JavaScript into HTML, and enables cleaner separation of behavior using the new HTML 5 “data-“ attribute convention (which conveniently works on older browsers as well – including IE6). This keeps your HTML tight and clean, and makes it easier to optionally swap out or customize JS libraries.
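Concretely, the convention renders validation metadata as plain attributes rather than injected script. Roughly what the helpers emit for a required field might look like this (the field name is invented for illustration):

```html
<input type="text" name="Email" id="Email"
       data-val="true"
       data-val-required="The Email field is required." />
<span class="field-validation-valid"
      data-valmsg-for="Email" data-valmsg-replace="true"></span>
```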
ASP.NET MVC 3 now includes built-in support for posting JSON-based parameters from client-side JavaScript to action methods on the server. This makes it easier to exchange data across the client and server, and build rich JavaScript front-ends. We think this capability will be particularly useful going forward with scenarios involving client templates and data binding (including the jQuery plugins the ASP.NET team recently contributed to the jQuery project).
Previous releases of ASP.NET MVC included the core jQuery library. ASP.NET MVC 3 also now ships the jQuery Validate plugin (which our validation helpers use for client-side validation scenarios). We are also now shipping and including jQuery UI by default as well (which provides a rich set of client-side JavaScript UI widgets for you to use within projects).
Improved Validation
ASP.NET MVC 3 includes a bunch of validation enhancements that make it even easier to work with data.
Client-side validation is now enabled by default with ASP.NET MVC 3 (using an unobtrusive JavaScript implementation). Today’s release also includes built-in support for Remote Validation – which enables you to annotate a model class with a validation attribute that causes ASP.NET MVC to perform a remote validation call to a server method when validating input on the client.
The validation features introduced within .NET 4’s System.ComponentModel.DataAnnotations namespace are now supported by ASP.NET MVC 3. This includes support for the new IValidatableObject interface – which enables you to perform model-level validation, and allows you to provide validation error messages specific to the state of the overall model, or between two properties within the model.
ASP.NET MVC 3 also supports the improvements made to the ValidationAttribute class in .NET 4. ValidationAttribute now supports a new IsValid overload that provides more information about the current validation context, such as what object is being validated. This enables richer scenarios where you can validate the current value based on another property of the model. We’ve shipped a built-in [Compare] validation attribute with ASP.NET MVC 3 that uses this support and makes it easy out of the box to compare and validate two property values.
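As a small, self-contained sketch of the model-level validation described above, the snippet below drives the .NET 4 DataAnnotations APIs directly; the Booking model and its rule are invented for this example, but Validator.TryValidateObject is the same machinery MVC 3's default model validation builds on:

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;

// Hypothetical model: validates that EndDate follows StartDate at the model level.
public class Booking : IValidatableObject
{
    [Required]
    public string Name { get; set; }

    public DateTime StartDate { get; set; }
    public DateTime EndDate { get; set; }

    // Called only after all property-level validation succeeds.
    public IEnumerable<ValidationResult> Validate(ValidationContext validationContext)
    {
        if (EndDate < StartDate)
        {
            yield return new ValidationResult(
                "EndDate must not be earlier than StartDate",
                new[] { "EndDate" });
        }
    }
}

public static class BookingDemo
{
    // Runs property attributes first, then the IValidatableObject model-level rule.
    public static List<ValidationResult> ValidateBooking(Booking booking)
    {
        var results = new List<ValidationResult>();
        Validator.TryValidateObject(
            booking,
            new ValidationContext(booking, null, null),
            results,
            validateAllProperties: true);
        return results;
    }

    public static void Main()
    {
        var booking = new Booking
        {
            Name = "Demo",
            StartDate = new DateTime(2011, 1, 13),
            EndDate = new DateTime(2011, 1, 1)
        };

        foreach (var result in ValidateBooking(booking))
        {
            Console.WriteLine(result.ErrorMessage);
        }
    }
}
```

Because the invalid date range is reported through IValidatableObject, the error message is about the state of the overall model rather than any single attribute.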
You can use any data access API or technology with ASP.NET MVC. This past year, though, we’ve worked closely with the .NET data team to ensure that the new EF Code First library works really well for ASP.NET MVC applications. These two posts of mine cover the latest EF Code First preview and demonstrates how to use it with ASP.NET MVC 3 to enable easy editing of data (with end to end client+server validation support). The final release of EF Code First will ship in the next few weeks.
Today we are also publishing the first preview of a new MvcScaffolding project. It enables you to easily scaffold ASP.NET MVC 3 Controllers and Views, and works great with EF Code-First (and is pluggable to support other data providers). You can learn more about it – and install it via NuGet today – from Steve Sanderson’s MvcScaffolding blog post.
Output Caching
Previous releases of ASP.NET MVC supported output caching content at a URL or action-method level.
With ASP.NET MVC 3 we are also enabling support for partial page output caching – which allows you to easily output cache regions or fragments of a response rather than the entire thing. This ends up being super useful in a lot of scenarios, and enables you to dramatically reduce the work your application does on the server.
The new partial page output caching support in ASP.NET MVC 3 enables you to easily re-use cached sub-regions/fragments of a page across multiple URLs on a site. It supports the ability to cache the content either on the web-server, or optionally cache it within a distributed cache server like Windows Server AppFabric or memcached.
I’ll post some tutorials on my blog that show how to take advantage of ASP.NET MVC 3’s new output caching support for partial page scenarios in the future.
Better Dependency Injection
ASP.NET MVC 3 provides better support for applying Dependency Injection (DI) and integrating with Dependency Injection/IOC containers.
With ASP.NET MVC 3 you no longer need to author custom ControllerFactory classes in order to enable DI with Controllers. You can instead just register a Dependency Injection framework with ASP.NET MVC 3 and it will resolve dependencies not only for Controllers, but also for Views, Action Filters, Model Binders, Value Providers, Validation Providers, and Model Metadata Providers that you use within your application.
This makes it much easier to cleanly integrate dependency injection within your projects.
Other Goodies
ASP.NET MVC 3 includes dozens of other nice improvements that help to both reduce the amount of code you write, and make the code you do write cleaner. Here are just a few examples:
- Improved New Project dialog that makes it easy to start new ASP.NET MVC 3 projects from templates.
- Improved Add->View Scaffolding support that enables the generation of even cleaner view templates.
- New ViewBag property that uses .NET 4’s dynamic support to make it easy to pass late-bound data from Controllers to Views.
- Global Filters support that allows specifying cross-cutting filter attributes (like [HandleError]) across all Controllers within an app.
- New [AllowHtml] attribute that allows for more granular request validation when binding form posted data to models.
- Sessionless controller support that allows fine grained control over whether SessionState is enabled on a Controller.
- New ActionResult types like HttpNotFoundResult and RedirectPermanent for common HTTP scenarios.
- New Html.Raw() helper to indicate that output should not be HTML encoded.
- New Crypto helpers for salting and hashing passwords.
- And much, much more
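The ViewBag item in the list above leans on .NET 4's dynamic support. Here is a rough stand-alone sketch of the same late-bound pattern, using ExpandoObject rather than MVC's actual ViewBag type (the property names are invented):

```csharp
using System;
using System.Dynamic;

public static class ViewBagSketch
{
    public static void Main()
    {
        // No "Message" or "ItemCount" member is declared anywhere;
        // the DLR resolves them at run time, much like ViewBag.Message
        // set in a Controller and read in a View.
        dynamic bag = new ExpandoObject();
        bag.Message = "Welcome to ASP.NET MVC 3!";
        bag.ItemCount = 42;

        Console.WriteLine("{0} ({1} items)", bag.Message, bag.ItemCount);
    }
}
```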
Learn More about ASP.NET MVC 3
We will be posting lots of tutorials and samples on the http://asp.net/mvc site in the weeks ahead, and two good ASP.NET MVC 3 tutorials are available on the site today.
We’ll post additional ASP.NET MVC 3 tutorials and videos on the http://asp.net/mvc site in the future. Visit it regularly to find new tutorials as they are published.
How to Upgrade Existing Projects
ASP.NET MVC 3 is compatible with ASP.NET MVC 2 – which means it should be easy to update existing MVC projects to ASP.NET MVC 3.
The new features in ASP.NET MVC 3 build on top of the foundational work we’ve already done with the MVC 1 and MVC 2 releases – which means that the skills, knowledge, libraries, and books you’ve acquired are all directly applicable with the MVC 3 release. MVC 3 adds new features and capabilities – it doesn’t obsolete existing ones.
You can upgrade existing ASP.NET MVC 2 projects by following the manual upgrade steps in the release notes. Alternatively, you can use this automated ASP.NET MVC 3 upgrade tool to easily update your existing projects.
Localized Builds
Today’s ASP.NET MVC 3 release is available in English. We will be releasing localized versions of ASP.NET MVC 3 (in 9 languages) in a few days. I’ll blog pointers to the localized downloads once they are available.
NuGet
Today we are also shipping NuGet – a free, open source, package manager that makes it easy for you to find, install, and use open source libraries in your projects. It works with all .NET project types (including ASP.NET Web Forms, ASP.NET MVC, WPF, WinForms, Silverlight, and Class Libraries). You can download and install it here.
NuGet enables developers who maintain open source projects (for example, .NET projects like Moq, NHibernate, Ninject, StructureMap, NUnit, Windsor, Raven, Elmah, etc) to package up their libraries and register them with an online gallery/catalog that is searchable. The client-side NuGet tools – which include full Visual Studio integration – make it trivial for any .NET developer who wants to use one of these libraries to easily find and install it within the project they are working on.
NuGet handles dependency management between libraries (for example: library1 depends on library2). It also makes it easy to update (and optionally remove) libraries from your projects later. It supports updating web.config files (if a package needs configuration settings). It also allows packages to add PowerShell scripts to a project (for example: scaffold commands). Importantly, NuGet is transparent and clean – and does not install anything at the system level. Instead it is focused on making it easy to manage libraries you use with your projects.
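Under the hood, NuGet tracks what it has installed per project in a packages.config file, which is how it knows what to update or remove later. A minimal example (the package ids and versions here are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="NHibernate" version="3.0.0.4000" />
  <package id="Moq" version="4.0.10827" />
</packages>
```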
Our goal with NuGet is to make it as simple as possible to integrate open source libraries within .NET projects.
NuGet Gallery
This week we also launched a beta version of the http://nuget.org web-site – which allows anyone to easily search and browse an online gallery of open source packages available via NuGet. The site also now allows developers to optionally submit new packages that they wish to share with others. You can learn more about how to create and share a package here.
There are hundreds of open-source .NET projects already within the NuGet Gallery today. We hope to have thousands there in the future.
IIS Express 7.5
Today we are also shipping IIS Express 7.5. IIS Express is a free version of IIS 7.5 that is optimized for developer scenarios. It works for both ASP.NET Web Forms and ASP.NET MVC project types.
We think IIS Express combines the ease of use of the ASP.NET Web Server (aka Cassini) currently built into Visual Studio with the full power of IIS. Specifically:
- It’s lightweight and easy to install (less than 5Mb download and a quick install)
- It does not require an administrator account to run/debug applications from Visual Studio
- It enables a full web-server feature set – including SSL, URL Rewrite, and other IIS 7.x modules
- It supports and enables the same extensibility model and web.config file settings that IIS 7.x support
- It can be installed side-by-side with the full IIS web server as well as the ASP.NET Development Server (they do not conflict at all)
- It works on Windows XP and higher operating systems – giving you a full IIS 7.x developer feature-set on all Windows OS platforms
IIS Express (like the ASP.NET Development Server) can be quickly launched to run a site from a directory on disk. It does not require any registration/configuration steps. This makes it really easy to launch and run for development scenarios. You can also optionally redistribute IIS Express with your own applications if you want a lightweight web-server. The standard IIS Express EULA now includes redistributable rights.
Visual Studio 2010 SP1 adds support for IIS Express. Read my VS 2010 SP1 and IIS Express blog post to learn more about what it enables.
SQL Server Compact Edition 4
Today we are also shipping SQL Server Compact Edition 4 (aka SQL CE 4). SQL CE is a free, embedded, database engine that enables easy database storage.
No Database Installation Required
SQL CE does not require you to run a setup or install a database server in order to use it. You can simply copy the SQL CE binaries into the \bin directory of your ASP.NET application, and then your web application can use it as a database engine. No setup or extra security permissions are required for it to run. You do not need to have an administrator account on the machine. Just copy your web application onto any server and it will work. This is true even of medium-trust applications running in a web hosting environment.
SQL CE runs in-memory within your ASP.NET application; it starts up when you first access a SQL CE database and automatically shuts down when your application is unloaded. SQL CE databases are stored as files that live within the \App_Data folder of your ASP.NET applications.
Works with Existing Data APIs
SQL CE 4 works with existing .NET-based data APIs, and supports a SQL Server compatible query syntax. This means you can use existing data APIs like ADO.NET, as well as use higher-level ORMs like Entity Framework and NHibernate with SQL CE. This enables you to use the same data programming skills and data APIs you know today.
Supports Development, Testing and Production Scenarios
SQL CE can be used for development scenarios, testing scenarios, and light production usage scenarios. With the SQL CE 4 release we’ve done the engineering work to ensure that SQL CE won’t crash or deadlock when used in a multi-threaded server scenario (like ASP.NET). This is a big change from previous releases of SQL CE – which were designed for client-only scenarios and which explicitly blocked running in web-server environments. Starting with SQL CE 4 you can use it in a web-server as well.
There are no license restrictions with SQL CE. It is also totally free.
Tooling Support with VS 2010 SP1
Visual Studio 2010 SP1 adds support for SQL CE 4 and ASP.NET Projects. Read my VS 2010 SP1 and SQL CE 4 blog post to learn more about what it enables.
Web Deploy and Web Farm Framework 2.0
Today we are also releasing Microsoft Web Deploy V2 and Microsoft Web Farm Framework V2. These services provide a flexible and powerful way to deploy ASP.NET applications onto either a single server, or across a web farm of machines.
You can learn more about these capabilities from my previous blog posts on them.
Visit the http://iis.net website to learn more and install them. Both are free.
Orchard 1.0
Today we are also releasing Orchard v1.0.
Orchard is a free, open source, community based project. It provides Content Management System (CMS) and Blogging System support out of the box, and makes it possible to easily create and manage web-sites without having to write code (site owners can customize a site through the browser-based editing tools built-into Orchard). Read these tutorials to learn more about how you can setup and manage your own Orchard site.
Orchard itself is built as an ASP.NET MVC 3 application using Razor view templates (and by default uses SQL CE 4 for data storage). Developers wishing to extend an Orchard site with custom functionality can open and edit it as a Visual Studio project – and add new ASP.NET MVC Controllers/Views to it.
WebMatrix 1.0
WebMatrix is a new, free, web development tool from Microsoft that provides a suite of technologies that make it easier to enable website development. It enables a developer to start a new site by browsing and downloading an app template from an online gallery of web applications (which includes popular apps like Umbraco, DotNetNuke, Orchard, WordPress, Drupal and Joomla). Alternatively it also enables developers to create and code web sites from scratch.
WebMatrix is task focused and helps guide developers as they work on sites. WebMatrix includes IIS Express, SQL CE 4, and ASP.NET – providing an integrated web-server, database and programming framework combination. It also includes built-in web publishing support which makes it easy to find and deploy sites to web hosting providers.
You can learn more about WebMatrix from my Introducing WebMatrix blog post this summer. Visit http://microsoft.com/web to download and install it today.
Summary
I’m really excited about today’s releases – they provide a bunch of additional value that makes web development with ASP.NET, Visual Studio and the Microsoft Web Server a lot better.
A lot of folks worked hard to share this with you today. On behalf of my whole team – we hope you enjoy them!
Scott
P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
by community-syndication | Jan 13, 2011 | BizTalk Community Blogs via Syndication
This morning I’ve been working on how to support cancelling a workflow via a CancellationToken. The details of that are not important right now but what is really cool is how I was able to test this.
Scenario: Caller requests Cancellation via a CancellationToken and the UnhandledExceptionAction is Cancel
Given
- An activity that contains a CancellationScope
- The CancellationScope body has an activity that will create a bookmark and go idle
- The CancellationScope has a CancelHandler with a WriteLine that has a DisplayName "CancelHandlerWriteLine"
When
- The caller invokes the workflow asynchronously as a task with a CancellationToken
- and in the idle callback calls CancellationTokenSource.Cancel
Then
- A TaskCanceledException is thrown
- The WorkflowApplication is canceled
- The CancellationScope CancelHandler is invoked
Test Challenges
- How can I wait until the cancel is completed after handling the exception before verifying?
- How will I verify that the CancelHandler is invoked?
Solution
To wait until the cancel is completed after handling the exception, I simply create an AutoResetEvent (line 18) and signal it from the WorkflowApplication.Completed event callback (line 19). Then, before verifying the tracking data, I wait for this event (line 41).
To verify that the cancel handler was invoked I use the Microsoft.Activities.UnitTesting.Tracking.MemoryTrackingParticipant. This allows me to capture the tracking information into a collection that I can search using AssertTracking.Exists, verifying that the activity named by ExpectedCancelWriteLine entered the Closed state.
1: [TestMethod]
2: public void ActivityIsCanceledViaTokenShouldInvokeCancelHandler()
3: {
4: const string ExpectedCancelWriteLine = "CancelHandlerWriteLine";
5: var workflowApplication =
6: new WorkflowApplication(
7: new CancellationScope
8: {
9: Body = new TestBookmark<int> { BookmarkName = "TestBookmark" },
10: CancellationHandler = new WriteLine { DisplayName = ExpectedCancelWriteLine }
11: });
12:
13: // Capture tracking events in memory
14: var trackingParticipant = new MemoryTrackingParticipant();
15: workflowApplication.Extensions.Add(trackingParticipant);
16:
17: // Use this event to wait until the cancel is completed
18: var completedEvent = new AutoResetEvent(false);
19: workflowApplication.Completed = args => completedEvent.Set();
20:
21: try
22: {
23: var tokenSource = new CancellationTokenSource();
24:
25: // Run the activity and cancel in the idle callback
26: var task = workflowApplication.RunEpisodeAsync(
27: (args, bn) =>
28: {
29: Debug.WriteLine("Idle callback - cancel");
30: tokenSource.Cancel();
31: return false;
32: },
33: UnhandledExceptionAction.Cancel,
34: TimeSpan.FromMilliseconds(1000),
35: tokenSource.Token);
36:
37: // Exception is thrown when Wait() or Result is accessed
38: AssertHelper.Throws<TaskCanceledException>(task);
39:
40: // Wait for the workflow to complete the cancel
41: completedEvent.WaitOne(this.DefaultTimeout);
42:
43: // Verify that the cancel handler was invoked
44: AssertTracking.Exists(
45: trackingParticipant.Records, ExpectedCancelWriteLine, ActivityInstanceState.Closed);
46: }
47: finally
48: {
49: // Write the tracking records to the test output
50: trackingParticipant.Trace();
51: }
52: }
53:
When I run this test I also get the tracking info in the Test Results, along with any Debug.WriteLine output, to help me sort out what is happening. The tracking data is nicely formatted thanks to extension methods in Microsoft.Activities.UnitTesting.Tracking, which provide a Trace method for each type of tracking record and produce human-readable formatting.
WaitForWorkflow waiting for workflowBusy - check for cancel
Checking cancel token
System.Activities.WorkflowApplicationIdleEventArgs
Bookmarks count 1 (TestBookmark)
Idle callback - cancel
Checking cancel token from idle handler
Cancel requested canceling workflow
WaitForWorkflow workflowBusy is signaled - check for cancel
Checking cancel token
Cancel requested canceling workflow
WorkflowApplication.Cancel
this.CancellationToken.ThrowIfCancellationRequested()
*** Tracking data follows ***
WorkflowInstance for Activity <CancellationScope> state is <Started> at 04:13:53.7852
Activity <null> is scheduled child activity <CancellationScope> at 04:13:53.7852
Activity <CancellationScope> state is Executing at 04:13:53.7852
Activity <CancellationScope> is scheduled child activity <TestBookmark> at 04:13:53.7852
Activity <TestBookmark> state is Executing at 04:13:53.7852
{
Arguments
BookmarkName: TestBookmark
}
WorkflowInstance for Activity <CancellationScope> state is <Idle> at 04:13:53.7852
Activity <null> cancel is requested for child activity <CancellationScope> at 04:13:53.7852
Activity <CancellationScope> cancel is requested for child activity <TestBookmark> at 04:13:53.7852
Activity <TestBookmark> state is Canceled at 04:13:53.8008
{
Arguments
BookmarkName: TestBookmark
Result: 0
}
Activity <CancellationScope> is scheduled child activity <CancelHandlerWriteLine> at 04:13:53.8008
Activity <CancelHandlerWriteLine> state is Executing at 04:13:53.8008
{
Arguments
Text:
TextWriter:
}
Activity <CancelHandlerWriteLine> state is Closed at 04:13:53.8008
{
Arguments
Text:
TextWriter:
}
Activity <CancellationScope> state is Canceled at 04:13:53.8008
WorkflowInstance for Activity <CancellationScope> state is <Canceled> at 04:13:53.8008
by community-syndication | Jan 13, 2011 | BizTalk Community Blogs via Syndication
Ken Levy recently interviewed me for CodeCast on the state of Windows Workflow Foundation 4 and AppFabric for hosting your workflow services. It’s episode 99; I wonder what they are planning for episode 100. You can download the episode here or through iTunes.
Enjoy!
www.TheProblemSolver.nl
Wiki.WindowsWorkflowFoundation.eu
by community-syndication | Jan 12, 2011 | BizTalk Community Blogs via Syndication
With the work I’ve been doing on versioning, I’ve had to write unit tests that verify the behavior I expect from the helper classes in Microsoft.Activities.dll. If you want to verify that your assembly versioning strategy is working correctly, you may have to do similar testing. This sort of testing is tricky; in this post I’ll share my solutions to some tough problems.
Test Problem: Multiple Versions of the Same Assembly in a Test Run
There is only one deployment directory for the test run, and all deployment items are copied there. I can’t deploy ActivityLibrary V1 and V2 to the same test directory, but I need to deploy both for a test run. (Note: I did not tackle GAC deployment for this set of tests.)
Solution
The solution comes in two parts.
- How to create different versions of the assemblies in one build
- How to deploy the different versions of the assemblies when testing
Creating Different Versions of the Assemblies in One Build
For my testing I need a variety of assemblies in debug and release builds, with different versions, signing options, and references to other assemblies:
- ActivityLibrary – Version 1 (signed and unsigned), Version 2 (signed and unsigned)
- Workflow – Version 1 (signed), Version 2 (signed, unsigned)
To do this I created a number of projects that produce the same assembly and write the output to the bin directory for all build configurations (rather than bin\debug or bin\release).
As a result, I have several ActivityLibrary projects with different names that all produce ActivityLibrary.dll, and the same is true for WorkflowLibrary.
Deploy the Different Versions of the Assemblies When Testing
I deploy different versions of the assemblies into subdirectories of the test directory. Then when I create the AppDomain for the test I set the ApplicationBase to the subdirectory for that test. This ensures that the test directory contains the versions of the assemblies that I want.
To make the test code less error prone, I created constants to define the versions of assemblies and the combinations of directories where I will deploy them, and I pass these values to the [DeploymentItem] attribute:
1: /// <summary>
2: /// The Activity Library(V1)
3: /// </summary>
4: private const string ActivityV1 = @"Tests\Test Projects\ActivityLibrary.V1\bin\ActivityLibrary.dll";
5:
6: /// <summary>
7: /// Deploy Directory with Workflow (V1) Activity (V1)
8: /// </summary>
9: private const string WorkflowV1ActivityV1 = "WorkflowV1ActivityV1";
10:
11: /// <summary>
12: /// Given
13: /// * The following components are deployed
14: /// * Workflow (compiled) V1
15: /// * ActivityLibrary.dll V1
16: /// When
17: /// * Workflow (compiled) is constructed using reference to Activity V1
18: /// Then
19: /// * Workflow should load and return version 1.0.0.0
20: /// </summary>
21: [TestMethod]
22: [DeploymentItem(TestAssembly, WorkflowV1ActivityV1)]
23: [DeploymentItem(ActivityV1, WorkflowV1ActivityV1)]
24: [DeploymentItem(WorkflowV1, WorkflowV1ActivityV1)]
25: public void WorkflowV1RefActivityV1DeployedActivityV1ShouldLoad()
26: {
27: // ...
28: }
29:
Test Problem: Assembly.Load and Cached Assemblies in the AppDomain
The biggest issue I ran into when writing my tests is caused by the behavior of Assembly.Load. When you call Assembly.Load, it checks whether it has already loaded an assembly that will satisfy the request and, if so, uses that assembly. When you have a number of tests that need to verify that the correct assembly was loaded, you suddenly have a test order dependency: a test passes when run by itself but fails when run as part of the full test run.
I want to be sure that when I run my tests I always get the same results, no matter how many I run or in what order. To solve this problem I need to deal with the assemblies cached in the AppDomain, and there is no way to unload an assembly once it has been loaded.
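The caching behavior described above is easy to see in isolation: loading the same assembly identity twice hands back the already loaded assembly. A small sketch (not part of the versioning tests themselves):

```csharp
using System;
using System.Reflection;

public static class AssemblyCacheDemo
{
    public static void Main()
    {
        string identity = typeof(object).Assembly.GetName().FullName;

        // The second Load does not load anything new; it is satisfied from
        // the assemblies already cached in the current AppDomain.
        Assembly first = Assembly.Load(identity);
        Assembly second = Assembly.Load(identity);

        Console.WriteLine(ReferenceEquals(first, second)); // True
    }
}
```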
Solution
To solve this problem, for each test I create a new AppDomain, run the test code in it, and then unload the AppDomain when I’m finished. This ensures that I start with an empty AppDomain and can verify that only the assemblies I want are loaded.
My test class StaticXamlHelperTest has a matching worker class StaticXamlTestWorker, which does the actual testing in the new AppDomain:
1: [Serializable]
2: public class StaticXamlTestWorker : MarshalByRefObject
Then I added two helper methods to the StaticXamlHelperTest class to create the AppDomain and to create the worker class inside it:
1: private static StaticXamlTestWorker CreateTestWorker(AppDomain domain)
2: {
3: domain.Load(Assembly.GetExecutingAssembly().GetName().FullName);
4: var worker =
5: (StaticXamlTestWorker)
6: domain.CreateInstanceAndUnwrap(
7: Assembly.GetExecutingAssembly().GetName().FullName,
8: "Microsoft.Activities.Tests.StaticXamlTestWorker");
9:
10: return worker;
11: }
12:
13: private AppDomain CreateWorkerDomain(string workerPath)
14: {
15: return AppDomain.CreateDomain(
16: this.TestContext.TestName,
17: null,
18: new AppDomainSetup { ApplicationBase = Path.Combine(this.TestContext.DeploymentDirectory, workerPath) });
19: }
Then for each test I follow a simple pattern:
1: [TestMethod]
2: [DeploymentItem(TestAssembly, WorkflowV1ActivityV1)]
3: [DeploymentItem(ActivityV1, WorkflowV1ActivityV1)]
4: [DeploymentItem(WorkflowV1, WorkflowV1ActivityV1)]
5: public void WorkflowV1RefActivityV1DeployedActivityV1ShouldLoad()
6: {
7: var domain = this.CreateWorkerDomain(WorkflowV1ActivityV1);
8: try
9: {
10: CreateTestWorker(domain).WorkflowV1RefActivityV1DeployedActivityV1ShouldLoad();
11: }
12: finally
13: {
14: if (domain != null)
15: {
16: AppDomain.Unload(domain);
17: }
18: }
19: }
- Create the AppDomain (line 7)
- Create the worker inside a try block and call the test method (line 10)
- In the finally block Unload the AppDomain (line 16)
Test Problem: How to Know Which Version of the Activity Library Was Actually Loaded
Problems with assembly loading generally result in exceptions being thrown, but sometimes you might be surprised to find the workflow loading and happily running with a version of the activity other than the one you expected.
Since I am testing infrastructure I created Workflows with the sole purpose of loading an activity from an Activity Library and returning the version of the assembly that contained the activity. My activity is named GetAssemblyVersion.
1: public sealed class GetAssemblyVersion : CodeActivity<Version>
2: {
3: protected override Version Execute(CodeActivityContext context)
4: {
5: return Assembly.GetExecutingAssembly().GetName().Version;
6: }
7: }
I then create a Workflow that declares an OutArgument<Version> and uses the GetAssemblyVersion activity.
Because I’m testing Microsoft.Activities.StaticXamlHelper, I create a partial class with an overloaded constructor that calls the method I really want to test: StaticXamlHelper.InitializeComponent.
1: public Workflow(XamlAssemblyResolutionOption xamlAssemblyResolutionOption, IList<string> referencedAssemblies)
2: {
3: referencedAssemblies.Add(Assembly.GetExecutingAssembly().GetName().FullName);
4:
5: switch (xamlAssemblyResolutionOption)
6: {
7: case XamlAssemblyResolutionOption.FullName:
8: StrictXamlHelper.InitializeComponent(this, this.FindResource(), referencedAssemblies);
9: ShowAssemblies();
10: break;
11: case XamlAssemblyResolutionOption.VersionIndependent:
12: this.InitializeComponent();
13: break;
14: default:
15: throw new ArgumentOutOfRangeException("xamlAssemblyResolutionOption");
16: }
17: }
Now I’m ready for some test code. Remember that this code runs in the new AppDomain and makes use of Microsoft.Activities.UnitTesting:
1: public void WorkflowV1RefActivityV1DeployedActivityV1ShouldLoad()
2: {
3: var activity = new Workflow(XamlAssemblyResolutionOption.FullName, GetListWithActivityLibraryVersion(1));
4: var host = new WorkflowInvokerTest(activity);
5: host.TestActivity();
6: host.AssertOutArgument.AreEqual("AssemblyVersion", new Version(1, 0, 0, 0));
7: }
Here is how it works
- Line 3 – Create the activity using the overloaded constructor providing the list of assemblies you want to reference (provided by a helper method)
- Line 4 – Create a test host – Microsoft.Activities.UnitTesting.WorkflowInvokerTest
- Line 5 – Test the activity
- Line 6 – Assert the out argument “AssemblyVersion” is 1.0.0.0
Bottom Line
If this sounds complicated, that’s because it is. The complete source for all the unit tests is included in the Microsoft.Activities source, so you can check out the details of how it works.
I know you are thinking this sounds like way too much work for your project. But if you knew how many bugs I discovered in my code and fixed before you ever saw them (including one very obscure bug that appeared not on VS2010 RTM but only on the VS2010 SP1 beta), you would take the time to write some quality tests.
by community-syndication | Jan 12, 2011 | BizTalk Community Blogs via Syndication
Introduction I have been meaning for a while to add some PowerShell posts to my blog but have been busy lately. I have been using the BizTalk PowerShell provider (http://psbiztalk.codeplex.com) that my friend Randal van Splunteren helped create. I now use PowerShell all the time in my BizTalk work and find it to be very helpful […]
by community-syndication | Jan 12, 2011 | BizTalk Community Blogs via Syndication
A new year, as usual, means new plans. For the DotNed user group, one of these new plans is to produce and publish a podcast. These DotNed Podcasts are made especially for the Dutch .NET developer and will therefore be in Dutch as much as possible. The intention is for the topics we cover to be very diverse, as long as they are interesting and relevant to a .NET developer.
The first podcast in this series is now online on the website and can be found here. To make things as easy as possible for listeners, we have created an RSS feed so that new episodes arrive automatically. For iTunes users a direct link is available here, and a feed for Zune users is still in the works. As soon as the Zune link is available, it will be posted on the site.
Naturally, we are curious to hear what you think of the podcast and what topics you would like to hear in the future. For this purpose we have created the email address [email protected], so let us know if you have suggestions or feedback.
Website: http://www.dotned.nl/PodCasts.aspx
RSS: http://www.dotned.nl/podcasts/podcasts.xml
iTunes: http://itunes.apple.com/nl/podcast/dotned-podcast/id413467827
by community-syndication | Jan 12, 2011 | BizTalk Community Blogs via Syndication
Continuing our series of posts about service registry patterns that suck, we decided to address one of the most common techniques that Service Oriented (SOA) governance tools use to enforce policies. Scenario Service registries and repositories serve…(read more)
by community-syndication | Jan 12, 2011 | BizTalk Community Blogs via Syndication
This blog post reviews the current (January 2011) set of options for hosting existing .NET 4 Workflow (WF) programs in Windows Azure, and provides a roadmap of the upcoming features that will further enhance support for hosting and monitoring Workflow programs. The code snippets below are also available as an attachment, so you can download them and try them out yourself.
Workflow in Azure – Today
Workflow programs can be broadly classified as durable or non-durable (also known as non-persisted workflow instances). Durable Workflow Services are inherently long running, persist their state, and use correlation for follow-on activities. Non-durable workflows are stateless; effectively, they start and run to completion in a single burst.
Today, non-durable workflows are readily supported by Windows Azure, needing only a few trivial configuration changes. Hosting durable workflows, however, is a challenge, since we do not yet have a 'Windows Server AppFabric' equivalent for Azure that can persist, manage, and monitor the service. In brief, the big buckets of functionality required to host durable Workflow Services are:
- Monitoring store: There is no Event Collection Service available to gather the ETW events and write them to a SQL Azure-based monitoring database. There is also no schema in the .NET Framework for creating the monitoring database, and the one that ships with Windows Server AppFabric is incompatible with SQL Azure; for example, its scripts use the XML column type, which SQL Azure does not currently support.
- Instance Store: The schemas used by the SqlWorkflowInstanceStore have incompatibilities with SQL Azure. Specifically, the schema scripts require page locks, which are not supported on SQL Azure.
- Reliability: While the SqlWorkflowInstanceStore provides a lot of the functionality for managing instance lifetimes, the lack of the AppFabric Workflow Management Service means that you need to manually implement a way to start your WorkflowServiceHosts before any messages are received (such as when you bring up a new role instance or restart a role instance), so that the contained SqlWorkflowInstanceStore can poll for workflow service instances having expired timers and subsequently resume their execution.
The above limitations make it rather difficult to run a durable Workflow Service on Azure; the upcoming release of Azure AppFabric (Composite Application) is expected to make that possible. In this post, we focus on design approaches for getting your non-durable workflow instances running within Azure roles.
Today you can run your non-durable workflows on Azure. What this means is that your workflow programs cannot persist their state and wait for subsequent input to resume execution; they must run to completion following their initial launch. With Azure, you can run non-durable workflow programs in one of three ways:
- Web Role
- Worker Role
- Hybrid
The Web Role acts much like IIS does on-premise: it is an HTTP server, is easy to configure, requires little code to integrate, and is activated by an incoming request. The Worker Role acts like an on-premise Windows Service and is typically used in backend processing scenarios; it has multiple options for kicking off processing, which in turn add to the complexity. The hybrid approach, which bridges communication between Azure-hosted and on-premise resources, has multiple advantages: it lets you leverage existing deployment models, and it enables the use of durable workflows on-premise as a solution until the next release. The following sections succinctly detail these three approaches, and the 'Conclusion' section provides pointers on the appropriateness of each.
Host Workflow Services in a Web Role
The Web Role is similar to a 'Web Application', and it can also provide a service perspective to anything that uses the HTTP protocol, such as a WCF service using basicHttpBinding. The Web Role is generally driven by a user interface (the user interacts with a web page), but a call to a hosted service can also trigger processing. Below are the steps that enable you to host a Workflow Service in a Web Role.
The first step is to create a Cloud Project in Visual Studio and add a WCF Service Web Role to it. Delete the IService1.cs, Service1.svc, and Service1.svc.cs files added by the template; they are not needed and will be replaced by the Workflow Service XAMLX.
To the Web Role project, add a WCF Workflow Service. The structure of your solution is now complete (see the screenshot below for an example), but you need to add a few configuration elements to enable it to run on Azure.
Windows Azure does not include a section handler in its machine.config for system.xaml.hosting as you have in an on-premises solution. Therefore, the first configuration change (HTTP Handler for XAMLX and XAMLX Activation) is to add the following to the top of your web.config, within the configuration element:
<configSections>
<sectionGroup name="system.xaml.hosting" type="System.Xaml.Hosting.Configuration.XamlHostingSectionGroup, System.Xaml.Hosting, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
<section name="httpHandlers" type="System.Xaml.Hosting.Configuration.XamlHostingSection, System.Xaml.Hosting, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
</sectionGroup>
</configSections>
Next, you need to add XAML HTTP handlers for the WorkflowService and Activity root element types by adding the following within the configuration element of your web.config, below the configSections element that we included above:
<system.xaml.hosting>
<httpHandlers>
<add xamlRootElementType="System.ServiceModel.Activities.WorkflowService, System.ServiceModel.Activities, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" httpHandlerType="System.ServiceModel.Activities.Activation.ServiceModelActivitiesActivationHandlerAsync, System.ServiceModel.Activation, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
<add xamlRootElementType="System.Activities.Activity, System.Activities, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" httpHandlerType="System.ServiceModel.Activities.Activation.ServiceModelActivitiesActivationHandlerAsync, System.ServiceModel.Activation, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
</httpHandlers>
</system.xaml.hosting>
Finally, configure the WorkflowServiceHostFactory to handle activation for your service by adding a serviceActivation element to system.serviceModel\serviceHostingEnvironment element:
<serviceHostingEnvironment multipleSiteBindingsEnabled="true" >
<serviceActivations>
<add relativeAddress="~/Service1.xamlx" service="Service1.xamlx" factory="System.ServiceModel.Activities.Activation.WorkflowServiceHostFactory"/>
</serviceActivations>
</serviceHostingEnvironment>
The last step is to deploy your Cloud Project; with that, your Workflow Service is now hosted on Azure.
Note: Sam Vanhoutte from Codit also elaborates on hosting Workflow Services in Windows Azure in his blog, focusing on troubleshooting configuration by disabling custom errors; it is worth a review.
Host Workflows in a Worker Role
The Worker Role is similar to a Windows Service: it starts up automatically and runs all the time. While workflow programs could be initiated by a timer, other means of activation work as well, such as a simple while (true) loop with a sleep statement; on each 'tick' it performs work. This is generally the option for background or computational processing.
In this scenario you use workflows to define the Worker Role logic. Worker Roles are created by deriving from the RoleEntryPoint class and overriding a few of its methods. The method that defines the actual logic performed by a Worker Role is the Run Method. Therefore, to get your workflows executing within a Worker Role, use WorkflowApplication or WorkflowInvoker to host an instance of your non-service workflow (i.e., one that doesn't use Receive activities) within this Method. In either case, you only exit the Run Method when you want the Worker Role to stop executing workflows.
The general strategy is to start with an Azure Project and add a Worker Role to it. To this project you add a reference to an assembly containing your XAML workflow types. Within the Run Method of WorkerRole.cs, you initialize one of the host types (WorkflowApplication or WorkflowInvoker), referring to an Activity type contained in the referenced assembly. Alternatively, you can initialize one of the host types by loading an Activity instance from a XAML workflow file on the file system. You will also need to add references to the .NET Framework assemblies System.Activities and, if you wish to load XAML workflows from a file, System.Xaml.
Host Workflow (Non-Service) in a Worker Role
For non-service workflows, your Run Method needs to implement a loop that examines some input data and passes it to a workflow instance for processing. The following shows how to accomplish this when the workflow type comes from a referenced assembly:
public override void Run()
{
Trace.WriteLine("WFWorker entry point called", "Information");
while (true)
{
Thread.Sleep(1000);
/* ...
* ...Poll for data to hand to WF instance...
* ...
*/
//Create a dictionary to hold input data
Dictionary<string, object> inputData = new Dictionary<string, object>();
//Instantiate a workflow instance from a type defined in a referenced assembly
System.Activities.Activity workflow = new Workflow1();
//Execute the WF passing in parameter data and capture output results
IDictionary<string, object> outputData =
System.Activities.WorkflowInvoker.Invoke(workflow, inputData);
Trace.WriteLine("Working", "Information");
}
}
Alternatively, you could perform the above using the WorkflowApplication to host the Workflow instance. In this case, the main difference is that you need to use semaphores to control the flow of execution because the workflow instances will be run on threads separate from the one executing the Run method.
public override void Run()
{
Trace.WriteLine("WFWorker entry point called", "Information");
while (true)
{
Thread.Sleep(1000);
/* ...
* ...Poll for data to hand to WF...
* ...
*/
AutoResetEvent syncEvent = new AutoResetEvent(false);
//Create a dictionary to hold input data and declare another for output data
Dictionary<string, object> inputData = new Dictionary<string, object>();
IDictionary<string, object> outputData;
//Instantiate a workflow instance from a type defined in a referenced assembly
System.Activities.Activity workflow = new Workflow1();
//Run the workflow instance using WorkflowApplication as the host.
System.Activities.WorkflowApplication workflowHost = new System.Activities.WorkflowApplication(workflow, inputData);
workflowHost.Completed = (e) =>
{
outputData = e.Outputs;
syncEvent.Set();
};
workflowHost.Run();
syncEvent.WaitOne();
Trace.WriteLine("Working", "Information");
}
}
Finally, if instead of loading workflow types from a referenced assembly you want to load the XAML from a file (for example, one included with the Worker Role or stored on an Azure Drive), simply replace the line that instantiates the workflow in the two examples above with the following, passing the appropriate path to XamlServices.Load:
System.Activities.Activity workflow = (System.Activities.Activity)
System.Xaml.XamlServices.Load(@"X:\workflows\Workflow1.xaml");
By and large, if you are simply hosting logic described in a non-durable workflow, WorkflowInvoker is the way to go. Because it offers fewer lifecycle features than WorkflowApplication, it is also more lightweight and may help you scale better when you need to run many workflows simultaneously.
Host Workflow Service in a Worker Role
When you need to host a Workflow Service in a Worker Role, there are a few more steps to take, mainly to address the fact that Worker Role instances run behind a load balancer. At a high level, hosting a Workflow Service means creating a WorkflowServiceHost based on an Activity or WorkflowService instance defined either in a separate assembly or as a XAML file. The WorkflowServiceHost instance is created and opened in the Worker Role's OnStart Method and closed in the OnStop Method. It is important to note that you should always create the WorkflowServiceHost instance within the OnStart Method (as opposed to within Run, as was shown for non-service workflow hosts). This ensures that if a startup error occurs, the Worker Role instance will be restarted by Azure automatically, and the opening of the WorkflowServiceHost will be attempted again.
Begin by defining a member field to hold a reference to the WorkflowServiceHost, so that you can access the instance within both the OnStart and OnStop Methods.
public class WorkerRole : RoleEntryPoint
{
System.ServiceModel.Activities.WorkflowServiceHost wfServiceHostA;
}
Next, within the OnStart Method, add code to initialize and open the WorkflowServiceHost, within a try block. For example:
public override bool OnStart()
{
Trace.WriteLine("Worker Role OnStart Called.");
//
try
{
OpenWorkflowServiceHostWithAddressFilterMode();
}
catch (Exception ex)
{
Trace.TraceError(ex.Message);
throw;
}
//
return base.OnStart();
}
Let's take a look at the OpenWorkflowServiceHostWithAddressFilterMode method implementation, which does the real work. Starting from the top, notice that either an Activity or a WorkflowService instance can be passed to the WorkflowServiceHost constructor; they can even be loaded from a XAMLX file on the file system. We then acquire the internal instance endpoint and use it to define both the logical and physical address when adding an application service endpoint with a NetTcpBinding. When calling AddServiceEndpoint on a WorkflowServiceHost, you can specify either just the service name as a string or the namespace plus name as an XName (these values come from the Receive activity's ServiceContractName property).
private void OpenWorkflowServiceHostWithAddressFilterMode()
{
//workflow service hosting with AddressFilterMode approach
//Loading from a XAMLX on the file system
System.ServiceModel.Activities.WorkflowService wfs =
(System.ServiceModel.Activities.WorkflowService)System.Xaml.XamlServices.Load("WorkflowService1.xamlx");
//As an alternative you can load from an Activity type in a referenced assembly:
//System.Activities.Activity wfs = new WorkflowService1();
wfServiceHostA = new System.ServiceModel.Activities.WorkflowServiceHost(wfs);
IPEndPoint ip = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["WorkflowServiceTcp"].IPEndpoint;
wfServiceHostA.AddServiceEndpoint(System.Xml.Linq.XName.Get("IService", "http://tempuri.org/"),
new NetTcpBinding(SecurityMode.None),
String.Format("net.tcp://{0}/MyWfServiceA", ip));
//You can also refer to the implemented contract without the namespace, just passing the name as a string:
//wfServiceHostA.AddServiceEndpoint("IService",
// new NetTcpBinding(SecurityMode.None),
// String.Format("net.tcp://{0}/MyWfServiceA", ip));
wfServiceHostA.ApplyServiceMetadataBehavior(String.Format("net.tcp://{0}/MyWfServiceA/mex", ip));
wfServiceHostA.ApplyServiceBehaviorAttribute();
wfServiceHostA.Open();
Trace.WriteLine(String.Format("Opened wfServiceHostA"));
}
In order to enable our service to be callable externally, we next need to add an Input Endpoint that Azure will expose at the load balancer for remote clients to use. This is done within the Worker Role configuration, on the Endpoints tab. The figure below shows how we have defined a single TCP Input Endpoint on port 5555 named WorkflowServiceTcp. It is this Input Endpoint, or IPEndpoint as it appears in code, that we use in the call to AddServiceEndpoint in the previous code snippet. At runtime, the variable ip provides the local instance physical address and port which the service must use, to which the load balancer will forward messages. The port number assigned at runtime (e.g., 20000) is almost always different from the port you specify in the Endpoints tab (e.g., 5555), and the address (e.g., 10.26.58.148) is not the address of your application in Azure (e.g., myapp.cloudapp.net), but rather the particular Worker Role instance.
It is very important to know that, currently, Azure Worker Roles do not support HTTP or HTTPS endpoints (primarily due to permissions issues that only Worker Roles face when trying to open one). Therefore, when exposing your service or metadata to external clients, your only option is TCP.
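To make the addressing concrete, here is a hypothetical client-side sketch. The contract shape, operation name, and the address myapp.cloudapp.net are placeholders; only the contract name/namespace and endpoint path come from the hosting code in this post:

```csharp
using System.ServiceModel;

// Hypothetical contract matching the Receive activity's ServiceContractName
// ("IService" in the http://tempuri.org/ namespace); the operation is invented.
[ServiceContract(Name = "IService", Namespace = "http://tempuri.org/")]
public interface IService
{
    [OperationContract]
    string GetData(int value);
}

public static class WorkflowServiceClient
{
    public static string CallService()
    {
        // External clients always address the load balancer's public endpoint
        // (the Input Endpoint port, e.g. 5555), never the internal instance
        // address the Worker Role opened at runtime.
        var factory = new ChannelFactory<IService>(
            new NetTcpBinding(SecurityMode.None),
            new EndpointAddress("net.tcp://myapp.cloudapp.net:5555/MyWfServiceA"));
        IService channel = factory.CreateChannel();
        try
        {
            return channel.GetData(42);
        }
        finally
        {
            ((ICommunicationObject)channel).Close();
            factory.Close();
        }
    }
}
```

The load balancer forwards the call to whichever instance's internal endpoint (e.g. 10.26.58.148:20000) is available.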
Returning to the implementation, before opening the service we add a few behaviors. The key concept to understand is that any workflow service hosted by an Azure Worker Role will run behind a load balancer, and this affects how requests must be addressed. This results in two challenges which the code above solves:
- How to properly expose service metadata and produce metadata which includes the load balancer’s address (and not the internal address of the service hosted within a Worker Role instance).
- How to configure the service to accept messages it receives from the load balancer, that are addressed to the load balancer.
To reduce repetitive work, we defined a helper class that contains extension methods for ApplyServiceBehaviorAttribute and ApplyServiceMetadataBehavior that apply the appropriate configuration to the WorkflowServiceHost and alleviate the aforementioned challenges.
//Defines extension methods for ServiceHostBase (usable by ServiceHost & WorkflowServiceHost)
public static class ServiceHostingHelper
{
public static void ApplyServiceBehaviorAttribute(this ServiceHostBase host)
{
ServiceBehaviorAttribute sba = host.Description.Behaviors.Find<ServiceBehaviorAttribute>();
if (sba == null)
{
//For WorkflowServices, this behavior is not added by default (unlike for traditional WCF services).
host.Description.Behaviors.Add(new ServiceBehaviorAttribute() { AddressFilterMode = AddressFilterMode.Any });
Trace.WriteLine(String.Format("Added address filter mode ANY."));
}
else
{
sba.AddressFilterMode = System.ServiceModel.AddressFilterMode.Any;
Trace.WriteLine(String.Format("Configured address filter mode to ANY."));
}
}
public static void ApplyServiceMetadataBehavior(this ServiceHostBase host, string metadataUri)
{
//Must add this to expose metadata externally
UseRequestHeadersForMetadataAddressBehavior addressBehaviorFix = new UseRequestHeadersForMetadataAddressBehavior();
host.Description.Behaviors.Add(addressBehaviorFix);
Trace.WriteLine(String.Format("Added Address Behavior Fix"));
//Add TCP metadata endpoint. NOTE, as for application endpoints, HTTP endpoints are not supported in Worker Roles.
ServiceMetadataBehavior smb = host.Description.Behaviors.Find<ServiceMetadataBehavior>();
if (smb == null)
{
smb = new ServiceMetadataBehavior();
host.Description.Behaviors.Add(smb);
Trace.WriteLine("Added ServiceMetaDataBehavior.");
}
host.AddServiceEndpoint(
ServiceMetadataBehavior.MexContractName,
MetadataExchangeBindings.CreateMexTcpBinding(),
metadataUri
);
}
}
Looking at how we enable service metadata in the ApplyServiceMetadataBehavior method, notice there are three key steps. First, we add the UseRequestHeadersForMetadataAddressBehavior. Without this behavior, you could only get metadata by communicating directly to the Worker Role instance, which is not possible for external clients (they must always communicate through the load balancer). Moreover, the WSDL returned in the metadata request would include the internal address of the service, which is not helpful to external clients either. By adding this behavior, the WSDL includes the address of the load balancer. Next, we add the ServiceMetadataBehavior and then add a service endpoint at which the metadata can be requested. Observe that when we call ApplyServiceMetadataBehavior, we specify a URI which is the service’s internal address with mex appended. The load balancer will now correctly route metadata requests to this metadata endpoint.
The rationale behind the ApplyServiceBehaviorAttribute method is similar to ApplyServiceMetadataBehavior. When we add a service endpoint by specifying only the address parameter (as we did above), the logical and physical addresses of the service are configured to be the same. This causes a problem when operating behind a load balancer: messages coming from external clients via the load balancer are addressed to the logical address of the load balancer, and when the instance receives such a message it will not accept it, throwing an AddressFilterMismatch exception. This happens because the address in the message does not match the logical address at which the endpoint was configured. With traditional code-based WCF services, we could resolve this simply by decorating the service implementation class with [ServiceBehavior(AddressFilterMode=AddressFilterMode.Any)], which allows the incoming message to have any address and port. This is not possible with Workflow Services (there is no code to decorate with an attribute), hence we have to add it in the hosting code.
If allowing any incoming address concerns you, an alternative to using AddressFilterMode is simply to specify the logical address that is to be allowed. Instead of adding the ServiceBehaviorAttribute, you open the service endpoint specifying both the logical address (namely, the port the load balancer receives messages on) and the physical address (at which your service listens). The only complication is that your Worker Role instance does not know which port the load balancer is listening on, so you need to add this value to configuration and read it from there before adding the service endpoint. To do so, return to the Worker Role's properties, Settings tab, and add a string setting with the value of the port you specified on the Endpoints tab, as we show here for WorkflowServiceEndpointListenerPort.
With that setting in place, the rest of the implementation is fairly straightforward:
private void OpenWorkflowServiceHostWithoutAddressFilterMode()
{
//workflow service hosting without AddressFilterMode
//Loading from a XAMLX on the file system
System.ServiceModel.Activities.WorkflowService wfs = (System.ServiceModel.Activities.WorkflowService)System.Xaml.XamlServices.Load("WorkflowService1.xamlx");
wfServiceHostB = new System.ServiceModel.Activities.WorkflowServiceHost(wfs);
//Pull the expected load balancer port from configuration...
int externalPort = int.Parse(RoleEnvironment.GetConfigurationSettingValue("WorkflowServiceEndpointListenerPort"));
IPEndPoint ip = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["WorkflowServiceTcp"].IPEndpoint;
//Use the external load balancer port in the logical address...
wfServiceHostB.AddServiceEndpoint(System.Xml.Linq.XName.Get("IService", "http://tempuri.org/"),
new NetTcpBinding(SecurityMode.None),
String.Format("net.tcp://{0}:{1}/MyWfServiceB", ip.Address, externalPort),
new Uri(String.Format("net.tcp://{0}/MyWfServiceB", ip)));
wfServiceHostB.ApplyServiceMetadataBehavior(String.Format("net.tcp://{0}/MyWfServiceB/mex", ip));
wfServiceHostB.Open();
Trace.WriteLine(String.Format("Opened wfServiceHostB"));
}
With that, we can return to the RoleEntryPoint definition of our Worker Role and override the Run and OnStop Methods. For Run, because the WorkflowServiceHost takes care of all the processing, we just need to have a loop that keeps Run from exiting.
public override void Run()
{
Trace.WriteLine("Run - WFWorker entry point called", "Information");
while (true)
{
Thread.Sleep(30000);
}
}
For OnStop we simply close the WorkflowServiceHost.
public override void OnStop()
{
Trace.WriteLine(String.Format("OnStop - Called"));
if (wfServiceHostA != null)
wfServiceHostA.Close();
base.OnStop();
}
With OnStart, Run and OnStop Methods defined, our Worker Role is fully capable of hosting a Workflow Service.
Hybrid Approach – Host Workflow On-Premise and Reach From the Cloud
Unlike 'pure' cloud solutions, hybrid solutions keep a set of components on-premises: business processes, data stores, and services. These must remain on-premises, possibly due to compliance or deployment restrictions. A hybrid solution is one that has parts deployed in the cloud while some applications remain deployed on-premises.
This is a great interim approach: it exposes on-premise workflows hosted within Windows Server AppFabric (as illustrated in the diagram below) to the various components and applications hosted in Azure. It also applies when stateful/durable workflows are required to satisfy your scenarios. You can build a hybrid solution that runs the workflows on-premise and uses either the AppFabric Service Bus or Windows Azure Connect to reach into your on-premise Windows Server AppFabric instance.
Source: MSDN Blog Hybrid Cloud Solutions with Windows Azure AppFabric Middleware
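As a rough illustration of the Service Bus option, the on-premise host can open a relayed endpoint that Azure-hosted components then call. This is a hypothetical sketch based on the 2011-era Windows Azure AppFabric SDK; the namespace "myNamespace", the issuer credentials, the contract, and the service path are all placeholders:

```csharp
using System;
using System.ServiceModel;
using Microsoft.ServiceBus; // from the Windows Azure AppFabric SDK

// Placeholder contract for the on-premise service being relayed.
[ServiceContract]
public interface IService
{
    [OperationContract]
    string GetData(int value);
}

public static class ServiceBusRelayHost
{
    public static ServiceHost OpenRelay(Type serviceType)
    {
        // Public Service Bus address the Azure-hosted callers will use,
        // e.g. sb://myNamespace.servicebus.windows.net/MyWfService
        Uri relayAddress = ServiceBusEnvironment.CreateServiceUri(
            "sb", "myNamespace", "MyWfService");

        var host = new ServiceHost(serviceType);
        var endpoint = host.AddServiceEndpoint(
            typeof(IService),
            new NetTcpRelayBinding(),   // relayed TCP through the Service Bus
            relayAddress);

        // Authenticate the on-premise listener to the Service Bus using the
        // shared-secret credential model of the AppFabric SDK.
        var credentials = new TransportClientEndpointBehavior
        {
            CredentialType = TransportClientCredentialType.SharedSecret
        };
        credentials.Credentials.SharedSecret.IssuerName = "owner";
        credentials.Credentials.SharedSecret.IssuerKey = "<issuer key>";
        endpoint.Behaviors.Add(credentials);

        host.Open(); // outbound connection; no inbound firewall holes needed
        return host;
    }
}
```

Because the listener connects outbound to the Service Bus, the on-premise Windows Server AppFabric host stays reachable from the cloud without opening inbound firewall ports.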
Conclusion
How do you choose which approach to take? The decision ultimately boils down to your specific requirements, but here are some pointers that can help.
Hosting Workflow Services in a Web Role is easy and robust. If your workflow uses Receive activities as part of its definition, you should host it in a Web Role. While you can build and host a Workflow Service within a Worker Role, you take on the responsibility of rebuilding the hosting infrastructure that IIS provides in the Web Role, which is a fair amount of non-value-added work. That said, you will have to host in a Worker Role when you want to use a TCP endpoint, and in a Web Role when you want to use an HTTP or HTTPS endpoint.
Hosting non-service workflows that poll for their tasks is most easily accomplished within a Worker Role. While you could build another mechanism to poll and then call Workflow Services hosted in a Web Role, the Worker Role is designed to support and keep a polling application alive. Moreover, if your workflow design does not already define a service, you should host it in a Worker Role, as Web Role hosting would require you to modify the workflow definition to add the appropriate Receive activities.
Finally, if you have existing investments in Windows Server AppFabric, hosting services that need to be called from Azure-hosted applications, then the hybrid approach is a very viable option. One clear benefit is that you retain the ability to monitor your system's status through the IIS dashboard. Of course, this approach has to be weighed against the obvious trade-offs of added complexity and bandwidth costs.
The upcoming release of Azure AppFabric Composite Applications will enable hosting Workflow Services directly in Azure while providing feature parity to Windows Server AppFabric. Stay tuned for the exciting news and updates on this front.
Sample
The attached sample project provides a solution showing how to host non-durable workflows, in both service and non-service forms. For non-service workflows, it shows how to host using WorkflowInvoker or WorkflowApplication within a Worker Role. For services, it shows how to host traditional WCF services alongside Workflow Services, in both Web and Worker Roles.